From: <sebastian.egner@ph...> - 2003-05-09 07:28:26

Sebastian wrote:
> > By the way, if you really intend to do serious work on this...

Jens Axel wrote:
> I'm not. This is strictly for fun.

All the better! Same with me. If you have any questions relating to my implementation of Walker's method (in particular, you will need to have two SRFIs available), let me know.

Cheers, Sebastian.

--
Dr. Sebastian Egner
Senior Scientist
Philips Research Laboratories
Prof. Holstlaan 4 (postbox WY63, kamer WY 655)
5656 AA Eindhoven, The Netherlands
tel: +31 40 2742069 *** new telephone number
fax: +31 40 2742566 *** new fax number
email: sebastian.egner@...

Jens Axel Søgaard <jensaxel@...> 09-05-2003 00:01
To: Sebastian Egner/EHV/RESEARCH/PHILIPS@...
cc: <schematics-development@...>
Subject: Re: [Schematics-development] discrete random variables

sebastian.egner@... wrote:
> By the way, if you really intend to do serious work on this...

I'm not. This is strictly for fun.

> Recently, there has been a renewed interest in the topic
> of drawing unimodally at random, but with dynamically
> changing weights (think of data compression applications).
> Walker's method requires constructing a new set of aliases
> and cutoffs when any of the weights change, but this can
> be avoided. There is an article by Matias, Vitter, and Ni
>
>   http://citeseer.nj.nec.com/correct/507561
>
> who have devised a method for fast drawing at random
> while the weights (or probabilities) are changing. It would
> be interesting to know if the algorithms are practical.

But I might read the article - the other one you recommended was enjoyable.

Yours,
Jens Axel Søgaard
From: <jensaxel@so...> - 2003-05-09 02:12:03

Noel Welsh wrote:
> Jens Axel Søgaard wrote:
>> After writing the documentation, I accidentally fell over
>> "SRFI 27: Sources of Random Bits", ...
>> I think I'll port it this summer, and then use it to
>> rewrite some of the functions.
>
> This would be nice as we don't currently have a port.
> Francisco's the man for this stuff.

I did it today (I needed it for Sebastian's code). It turns out that it was very easy to port the Scheme reference implementation of SRFI 27. Sebastian has indeed done a very good job on SRFI 27.

Sebastian has also provided a C implementation of the generator, and I'm sure it would provide substantial speed improvements if somebody volunteers to port it.

The SRFI makes room for different sources of random numbers, so I thought I'd allow the user to use the built-in random generator (I couldn't find documentation on the algorithm used), but I couldn't figure out how to retrieve the current state, so I dropped the idea again. I am, however, planning to turn the Schemathics dev/random support into another random source.

--
Jens Axel Søgaard
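[Not part of the thread: a tiny Python sketch of the interface shape Jens describes - a single random source object from which integer and real generators are derived, with retrievable state. This only mirrors SRFI 27's interface, not its actual generator; all names here are mine.]

```python
import random

class RandomSource:
    """A SRFI 27-style source: one seedable state, from which
    integer and real generators are derived (illustrative sketch)."""

    def __init__(self, seed=None):
        self._rng = random.Random(seed)

    def make_integers(self):
        # returns rand such that rand(n) -> uniform integer in {0..n-1}
        return lambda n: self._rng.randrange(n)

    def make_reals(self):
        # returns rand such that rand() -> uniform real in [0, 1)
        return lambda: self._rng.random()

    def state(self):
        # unlike the built-in generator Jens mentions, the state here
        # is retrievable (and could be restored with setstate)
        return self._rng.getstate()
```

Both generators share the source's single stream of pseudo-random state, which is the property the SRFI's design gives you.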
From: <jensaxel@so...> - 2003-05-09 00:59:47

sebastian.egner@... wrote:
> By the way, if you really intend to do serious work on this...

I'm not. This is strictly for fun.

> Recently, there has been a renewed interest in the topic
> of drawing unimodally at random, but with dynamically
> changing weights (think of data compression applications).
> Walker's method requires constructing a new set of aliases
> and cutoffs when any of the weights change, but this can
> be avoided. There is an article by Matias, Vitter, and Ni
>
>   http://citeseer.nj.nec.com/correct/507561
>
> who have devised a method for fast drawing at random
> while the weights (or probabilities) are changing. It would
> be interesting to know if the algorithms are practical.

But I might read the article - the other one you recommended was enjoyable.

Yours,
Jens Axel Søgaard
From: <jensaxel@so...> - 2003-05-06 18:53:26

Ian Glover wrote:
> On Tuesday 06 May 2003 18:00, Jens Axel Søgaard wrote:
>>> Generalized functions consist of functions and measures.
>>
>> I'm not quite sure it's that simple.
>
> It never is! But I don't know any better, so I will bow before your superior
> knowledge. :>

Uh. I don't want the ball. I think there's more than one way to generalize functions.

How on earth did I end up discussing math this advanced on a Scheme mailing list? I like Schemers :) (and math, obviously).

--
Jens Axel Søgaard
From: Ian Glover <ian@ma...> - 2003-05-06 18:39:50

On Tuesday 06 May 2003 18:00, Jens Axel Søgaard wrote:
>> Generalized functions consist of functions and measures.
>
> I'm not quite sure it's that simple.

It never is! But I don't know any better, so I will bow before your superior knowledge. :>
From: Ian Glover <ian@ma...> - 2003-05-06 18:39:36

On Tuesday 06 May 2003 17:53, Jens Axel Søgaard wrote:
>> (And after all the Dirac delta is a statistical distribution).
>
> I'm not quite sure that's right - a statistical distribution is still a
> function (isn't it?).

Probably not quite expressed correctly; I think I should have said the delta measure is a statistical distribution. The idea being that continuous probabilities are better defined by measures than by functions. Take the uniform distribution f(x) = 1 for 0 < x < 1: the actual value of the function at a point isn't really statistically meaningful, only its integral over subsets of R is - just like the values of delta(x) aren't always meaningful but the integrals are.

>> But then that is maths in a nutshell, "assume this and what interesting
>> things drop out." Though for delta (I'm fairly sure that) you can assume
>> more fundamental things and its existence, integrability etc. drops out.
>
> It's not just playing with axioms. You can make intermediate (-ary?)
> calculations involving delta, and in the end all the deltas cancel out,
> and you are left with an ordinary function as a result.
> One can use this to study certain kinds of differential equations.

Agreed.
From: <jensaxel@so...> - 2003-05-06 18:11:45

Ian Glover wrote:
> On Monday 05 May 2003 22:19, MJ Ray wrote:
>> Ian Glover <ian@...> wrote:
>>> (And after all the Dirac delta is a statistical distribution).
>>
>> I thought so too, so I'm confused why these functions aren't a subset
>> of all functions.
>
> [Disclaimer - I haven't come across generalized functions before as such and
> my brain melted half way through the Mathworld page, but this is I think
> what's going on.]

I agree that the MathWorld page makes a lot more sense if you know the stuff already. There is no attempt to explain the motivation behind it.

[good explanation about the existence of several methods of integration]

> Generalized functions consist of functions and measures.

I'm not quite sure it's that simple.

--
Jens Axel Søgaard
From: <jensaxel@so...> - 2003-05-06 17:57:17

Ian Glover wrote:
> On Monday 05 May 2003 09:48, MJ Ray wrote:
>> Jens Axel Søgaard <jensaxel@...> wrote:
>>> Yes, but by distributions (also called generalized functions) I do not
>>> mean functions given by the distribution of a random variable.
>>
>> Why do physicists insist on corrupting mathematical language?
>
> I don't think you can really blame physicists; my impression is that
> mathematicians picked up on Dirac's idea, generalized, and needed a name.

That's my impression too. They saw that the method of calculation gave valid results and wanted to make the mathematics involved rigorous.

> (And after all the Dirac delta is a statistical distribution).

I'm not quite sure that's right - a statistical distribution is still a function (isn't it?).

>> If you go beyond real functions, then playing with the limits of 2 gives
>> the desired result, but is that the same as what is written above?
>>
>> You could assume that such a function exists, but then you're just
>> playing with the axioms, aren't you?
>
> But then that is maths in a nutshell, "assume this and what interesting
> things drop out." Though for delta (I'm fairly sure that) you can assume
> more fundamental things and its existence, integrability etc. drops out.

It's not just playing with axioms. You can make intermediate (-ary?) calculations involving delta, and in the end all the deltas cancel out, and you are left with an ordinary function as a result. One can use this to study certain kinds of differential equations.

--
Jens Axel Søgaard
From: <jensaxel@so...> - 2003-05-06 17:51:07

MJ Ray wrote:
> Jens Axel Søgaard <jensaxel@...> wrote:
>> Yes, but by distributions (also called generalized functions) I do not mean
>> functions given by the distribution of a random variable.
>
>> In physics Dirac needed a function with the properties
>>   1. f(x) = 0 when x <> 0
>>   2. integral f(x) from -oo to oo = 1
>
> Why is any function so defined integrable? It seems to be discontinuous.

Suppose g is defined as

    g(x) = 0, if x <> 0
    g(x) = 1, if x = 0

Then g is a discontinuous function, which is 0 almost everywhere. As Ian explains, in mathematical analysis one extends the Riemann integral to the Lebesgue integral, which is defined even for some discontinuous functions. However, the integral of g is still equal to the area between the graph of g and the x-axis, no matter what integral you use. This means that

    1. integral g(x) from -oo to oo = area under graph = 0

is satisfied. If one changes the value g(0) to something other than 1, the value of the integral will not change.

Let's try and interpret 2. in terms of the area under the graph. If the area under the graph of g should become 1, but g is only different from 0 at one point, it must mean that the value g(0) *has* to be "infinity". So much infinity that the area becomes 1.

Now the above paragraph does not make mathematical sense, but that's how Dirac was thinking. The problem is that

    g(x) = 0, if x <> 0
    g(x) = "infinity", if x = 0

is not a function from R to R. And what is this "infinity"?

> If you go beyond real functions, then playing with the limits of 2 gives
> the desired result, but is that the same as what is written above?
>
> You could assume that such a function exists, but then you're just
> playing with the axioms, aren't you?

It's worse: one can prove that no function from R to R can satisfy both 1. and 2. at the same time. Thus, if one needs "something" that satisfies both 1. and 2. at the same time, one needs to generalise the notion of functions to a larger space, wherein an element exists that satisfies the interpretation of 1. and 2. in the new space.

--
Jens Axel Søgaard
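[Not part of the thread: Jens's argument above can be restated compactly in standard notation.]

```latex
% Dirac's two requirements, written out:
\[
\delta(x) = 0 \quad (x \neq 0), \qquad
\int_{-\infty}^{\infty} \delta(x)\,dx = 1 .
\]
% For any genuine function f : \mathbb{R} \to \mathbb{R} with
% f(x) = 0 for x \neq 0, f vanishes almost everywhere, so
% (whether one integrates in the Riemann or the Lebesgue sense)
\[
\int_{-\infty}^{\infty} f(x)\,dx = 0 \neq 1 ,
\]
% hence no function satisfies both requirements at once, and \delta
% must be sought in a larger space (measures / distributions).
```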
From: Ian Glover <ian@ma...> - 2003-05-06 08:31:10

On Monday 05 May 2003 22:19, MJ Ray wrote:
> Ian Glover <ian@...> wrote:
>> (And after all the Dirac delta is a statistical distribution).
>
> I thought so too, so I'm confused why these functions aren't a subset
> of all functions.

[Disclaimer - I haven't come across generalized functions before as such and my brain melted half way through the Mathworld page, but this is I think what's going on.]

There are (at least) two types of integration. For most things (like functions) that you meet, the simple version, Riemann integrals, works and does everything you'd expect. This is what you'll find in most books on Calculus (add up lots of distinct rectangles, take the limit). However, mathematicians being perverse beasts (no offence intended to watching mathematicians!), there are more general versions.

The most common of the nasty integrations is Lebesgue integration, which is based on measure theory and is what's needed to integrate things like delta or that horrible example I gave that's 1 on irrationals and 0 on rationals.

[This is where I start getting ropey.] The basic idea is a "measure", which maps subsets of the space being integrated over to values. A measure, P, is a probability measure if it maps to the reals and the measure of the entire domain is 1.

If you define a probability measure D on the reals such that D(S) = 1 if 0 is in S, and D(S) = 0 otherwise, then this is perfectly well defined. The Dirac delta is what you get if you try to view this as a function from reals to reals rather than from sets of reals to reals.

Integration is then defined by combining functions and a measure. (I can't immediately come up with a good example.) Since integration can only happen over subsets of spaces, this all hangs together.

Generalized functions consist of functions and measures.

After all that: Why isn't delta a function? I suspect because its value at zero is not defined. However, its properties if it's viewed as a measure are defined, and since functions are integrable, delta has to be viewed as something more general.

Hope that makes some sort of sense.

>> Better: Discontinuity is no bar on integrability [ f(x) = 1 for 0 < x <=
>
> Indeed, but for this function to give the desired result, isn't it either
> continuous in an area around 0 or doesn't exist as a function?
>
>> But then that is maths in a nutshell, "assume this and what interesting
>> things drop out."
>
> Only certain classes of pure mathematicians consider axioms toys.

Hmmm, not sure about that. I'd say most mathematicians and physicists are guilty (though in Physics it's usually viewed as simplification to make the problem solvable).
From: MJ Ray <markj@cl...> - 2003-05-05 22:18:31

Ian Glover <ian@...> wrote:
> (And after all the Dirac delta is a statistical distribution).

I thought so too, so I'm confused why these functions aren't a subset of all functions.

> Better: Discontinuity is no bar on integrability [ f(x) = 1 for 0 < x <=

Indeed, but for this function to give the desired result, isn't it either continuous in an area around 0 or doesn't exist as a function?

> But then that is maths in a nutshell, "assume this and what interesting
> things drop out."

Only certain classes of pure mathematicians consider axioms toys.

MJR
From: Ian Glover <ian@ma...> - 2003-05-05 10:49:08

On Monday 05 May 2003 09:48, MJ Ray wrote:
> Jens Axel Søgaard <jensaxel@...> wrote:
>> Yes, but by distributions (also called generalized functions) I do not
>> mean functions given by the distribution of a random variable.
>
> Why do physicists insist on corrupting mathematical language?

I don't think you can really blame physicists; my impression is that mathematicians picked up on Dirac's idea, generalized, and needed a name. (And after all the Dirac delta is a statistical distribution).

>> In physics Dirac needed a function with the properties
>>   1. f(x) = 0 when x <> 0
>>   2. integral f(x) from -oo to oo = 1
>
> Why is any function so defined integrable? It seems to be discontinuous.

Bad answer: Because it's defined to be integrable!

Better: Discontinuity is no bar on integrability [ f(x) = 1 for 0 < x <= 1, f(x) = 0 otherwise, is integrable and discontinuous; rather less obviously, g(x) = 1 for x irrational between 0 and 1, g(x) = 0 otherwise, is also integrable ].

> If you go beyond real functions, then playing with the limits of 2 gives
> the desired result, but is that the same as what is written above?
>
> You could assume that such a function exists, but then you're just
> playing with the axioms, aren't you?

But then that is maths in a nutshell, "assume this and what interesting things drop out." Though for delta (I'm fairly sure that) you can assume more fundamental things and its existence, integrability etc. drops out.
From: MJ Ray <markj@cl...> - 2003-05-05 09:47:23

Jens Axel Søgaard <jensaxel@...> wrote:
> Yes, but by distributions (also called generalized functions) I do not mean
> functions given by the distribution of a random variable.

Why do physicists insist on corrupting mathematical language?

> In physics Dirac needed a function with the properties
>   1. f(x) = 0 when x <> 0
>   2. integral f(x) from -oo to oo = 1

Why is any function so defined integrable? It seems to be discontinuous.

If you go beyond real functions, then playing with the limits of 2 gives the desired result, but is that the same as what is written above?

You could assume that such a function exists, but then you're just playing with the axioms, aren't you?

--
MJR   http://mjr.towers.org.uk/   IM: slef@...
This is my home web site.   This for Jabber Messaging.
How's my writing? Let me know via any of my contact details.
From: <sebastian.egner@ph...> - 2003-05-05 09:34:57

By the way, if you really intend to do serious work on this...

Recently, there has been a renewed interest in the topic of drawing unimodally at random, but with dynamically changing weights (think of data compression applications). Walker's method requires constructing a new set of aliases and cutoffs when any of the weights change, but this can be avoided. There is an article by Matias, Vitter, and Ni

  http://citeseer.nj.nec.com/correct/507561

who have devised a method for fast drawing at random while the weights (or probabilities) are changing. It would be interesting to know if the algorithms are practical.

Cheers, Sebastian.

--
Dr. Sebastian Egner
Senior Scientist
Philips Research Laboratories
Prof. Holstlaan 4 (postbox WY63, kamer WY 655)
5656 AA Eindhoven, The Netherlands
tel: +31 40 2742069 *** new telephone number
fax: +31 40 2742566 *** new fax number
email: sebastian.egner@...

Jens Axel Søgaard <jensaxel@...> 05-05-2003 10:18
To: Sebastian Egner/EHV/RESEARCH/PHILIPS@...
cc: schematics-development@...
Subject: Re: [Schematics-development] discrete random variables

sebastian.egner@... wrote:
> The algorithm in its most simple form selects a minimum and a maximum
> repeatedly - in fact once for every alias constructed. You may want to
> consult Section 3.2.6
>
> [4] M.C. Fu: Simulation of Discrete-Event Systems.
>     Course notes for course BMGT 835 (Chapter 3), Spring 2003,
>     Robert H. Smith School of Business, University of Maryland.
>     http://www.rhsmith.umd.edu/MSS/mfu/bmgt835/notesCh3.pdf

I'll do that. Thanks for the reference.

> for a modern and well-written explanation (beware: a typo in 2.(a)).
> (The original article of A. J. Walker is terrible.)

:)

> In fact, since there may be up to n aliases, the worst case run time
> is O(n^2). With a priority queue, this could be brought back to O(n)
> with sophisticated things, and O(n log n) with small constants without
> much effort.

Ok - I have Okasaki's book on functional data structures and that contains quite a few heap algorithms (so far I have implemented 7). Amongst them are heaps where insert takes O(1) and delete-min O(log n), but I'd better read the article first to figure out which heap to choose.

> Good luck with your exams!

Thanks - but it's not exactly my exams (I'm a teacher). The trick is that in the exam period I don't (always) have to prepare for the next day, which means that I have more time for Scheming.

Jens Axel
From: <jensaxel@so...> - 2003-05-05 08:22:30

sebastian.egner@... wrote:
> The algorithm in its most simple form selects a minimum and a maximum
> repeatedly - in fact once for every alias constructed. You may want to
> consult Section 3.2.6
>
> [4] M.C. Fu: Simulation of Discrete-Event Systems.
>     Course notes for course BMGT 835 (Chapter 3), Spring 2003,
>     Robert H. Smith School of Business, University of Maryland.
>     http://www.rhsmith.umd.edu/MSS/mfu/bmgt835/notesCh3.pdf

I'll do that. Thanks for the reference.

> for a modern and well-written explanation (beware: a typo in 2.(a)).
> (The original article of A. J. Walker is terrible.)

:)

> In fact, since there may be up to n aliases, the worst case run time
> is O(n^2). With a priority queue, this could be brought back to O(n)
> with sophisticated things, and O(n log n) with small constants without
> much effort.

Ok - I have Okasaki's book on functional data structures and that contains quite a few heap algorithms (so far I have implemented 7). Amongst them are heaps where insert takes O(1) and delete-min O(log n), but I'd better read the article first to figure out which heap to choose.

> Good luck with your exams!

Thanks - but it's not exactly my exams (I'm a teacher). The trick is that in the exam period I don't (always) have to prepare for the next day, which means that I have more time for Scheming.

--
Jens Axel
From: <sebastian.egner@ph...> - 2003-05-05 07:54:42

Hi,

The algorithm in its most simple form selects a minimum and a maximum repeatedly - in fact once for every alias constructed. You may want to consult Section 3.2.6

  [4] M.C. Fu: Simulation of Discrete-Event Systems.
      Course notes for course BMGT 835 (Chapter 3), Spring 2003,
      Robert H. Smith School of Business, University of Maryland.
      http://www.rhsmith.umd.edu/MSS/mfu/bmgt835/notesCh3.pdf

for a modern and well-written explanation (beware: a typo in 2.(a)). (The original article of A. J. Walker is terrible.)

In fact, since there may be up to n aliases, the worst case run time is O(n^2). With a priority queue, this could be brought back to O(n) with sophisticated things, and O(n log n) with small constants without much effort.

Good luck with your exams!

Sebastian.

--
Dr. Sebastian Egner
Senior Scientist
Philips Research Laboratories
Prof. Holstlaan 4 (postbox WY63, kamer WY 655)
5656 AA Eindhoven, The Netherlands
tel: +31 40 2742069 *** new telephone number
fax: +31 40 2742566 *** new fax number
email: sebastian.egner@...

Jens Axel Søgaard <jensaxel@...> 04-05-2003 15:00
To: Sebastian Egner/EHV/RESEARCH/PHILIPS@...
cc: schematics-development@...
Subject: Re: [Schematics-development] discrete random variables

sebastian.egner@... wrote:
> Mike Sperber forwarded me a short discussion related to using
> pseudo-random values with a given probability distribution.
> In other words, given probabilities p[0], .., p[n-1] summing to one,
> construct a pseudo-random process for values x in {0..n-1}
> such that Pr{x = k} = p[k].
>
> I would like to bring to your attention that there is a method to
> draw the next pseudo-random value in O(1), after having
> prepared a data structure of size O(n). This method is known
> as the "method of aliases" by A. J. Walker (Electronics Letters 1974).
> It will be much faster than the code currently used in Schemathics,
> which uses time O(n) for drawing the next random value.

For those that have Knuth: I have found the algorithm in volume 2, section 3.4.1 (page 120). He writes:

    However, there is actually a faster way to select x1,...,xk with
    arbitrarily given probabilities, based on an ingenious approach
    introduced by A. J. Walker [Electronics Letters 10, 8 (1974)].

Note the "ingenious".

> For convenience, I just wrote some code for this :)
> If there would be a good implementation for priority queues
> around, the preparation phase could be improved...

Are there special requirements for the priority queue? At first glance at the exercise in Knuth, it seems that one must be able to find both the minimum and the maximum element of the priority queue. Is the priority queue used after the initial preparation of the table?

Jens Axel Søgaard
From: <jensaxel@so...> - 2003-05-04 13:04:21

sebastian.egner@... wrote:
> Mike Sperber forwarded me a short discussion related to using
> pseudo-random values with a given probability distribution.
> In other words, given probabilities p[0], .., p[n-1] summing to one,
> construct a pseudo-random process for values x in {0..n-1}
> such that Pr{x = k} = p[k].
>
> I would like to bring to your attention that there is a method to
> draw the next pseudo-random value in O(1), after having
> prepared a data structure of size O(n). This method is known
> as the "method of aliases" by A. J. Walker (Electronics Letters 1974).
> It will be much faster than the code currently used in Schemathics,
> which uses time O(n) for drawing the next random value.

For those that have Knuth: I have found the algorithm in volume 2, section 3.4.1 (page 120). He writes:

    However, there is actually a faster way to select x1,...,xk with
    arbitrarily given probabilities, based on an ingenious approach
    introduced by A. J. Walker [Electronics Letters 10, 8 (1974)].

Note the "ingenious".

> For convenience, I just wrote some code for this :)
> If there would be a good implementation for priority queues
> around, the preparation phase could be improved...

Are there special requirements for the priority queue? At first glance at the exercise in Knuth, it seems that one must be able to find both the minimum and the maximum element of the priority queue. Is the priority queue used after the initial preparation of the table?

--
Jens Axel Søgaard
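[Not part of the thread: a rough Python sketch of the alias method under discussion - O(n) table preparation, O(1) per draw. This uses the two-worklist (Vose-style) construction rather than Walker's repeated min/max selection; all names are mine.]

```python
import random

def build_alias_table(weights):
    """Prepare cutoff/alias tables in O(n) from non-negative weights."""
    n = len(weights)
    total = sum(weights)
    # Scale so the average weight becomes 1.
    scaled = [w * n / total for w in weights]
    cutoff = [0.0] * n
    alias = [0] * n
    small = [i for i, s in enumerate(scaled) if s < 1.0]
    large = [i for i, s in enumerate(scaled) if s >= 1.0]
    while small and large:
        i, j = small.pop(), large.pop()
        cutoff[i] = scaled[i]   # probability of keeping column i itself
        alias[i] = j            # otherwise redirect to its alias j
        scaled[j] -= 1.0 - scaled[i]
        (small if scaled[j] < 1.0 else large).append(j)
    for k in small + large:     # leftovers are exactly 1 up to rounding
        cutoff[k] = 1.0
    return cutoff, alias

def draw(cutoff, alias, rng=random):
    """Draw one index in O(1): pick a column, keep it or take its alias."""
    i = rng.randrange(len(cutoff))
    return i if rng.random() < cutoff[i] else alias[i]
```

A histogram over many draws (in the spirit of the `hist` example in Sebastian's Scheme code) should approximate the normalized weights.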
From: <jensaxel@so...> - 2003-05-02 22:29:39

sebastian.egner@... wrote:
> Dear people at Schematics,
>
> Mike Sperber forwarded me a short discussion related to using
> pseudo-random values with a given probability distribution.
> In other words, given probabilities p[0], .., p[n-1] summing to one,
> construct a pseudo-random process for values x in {0..n-1}
> such that Pr{x = k} = p[k].
>
> I would like to bring to your attention that there is a method to
> draw the next pseudo-random value in O(1), after having
> prepared a data structure of size O(n). This method is known
> as the "method of aliases" by A. J. Walker (Electronics Letters 1974).
> It will be much faster than the code currently used in Schemathics,
> which uses time O(n) for drawing the next random value.
>
> For convenience, I just wrote some code for this :)

That's service. I'll incorporate this as soon as possible [when the exams start for real].

--
Jens Axel Søgaard
From: <jensaxel@so...> - 2003-05-02 22:15:27

MJ Ray wrote:
> Jens Axel Søgaard <jensaxel@...> wrote:
>> at all, but a so-called distribution [the set of
>> distributions is an extension of the space of functions],
>
> Do you mean distributions?

Yes, but by distributions (also called generalized functions) I do not mean functions given by the distribution of a random variable.

The relationship between functions and distributions is somewhat similar to the relationship between the reals R and the complex numbers C. During the process of solving an equation of degree three, one can use numbers from C as temporary results, and in the end get a solution from R [which historically led to the discovery of C]. Thus some problems concerning real numbers are easier to solve if one sees R as a subset of the larger space C.

In physics Dirac needed a function with the properties

    1. f(x) = 0 when x <> 0
    2. integral f(x) from -oo to oo = 1

The equation 2. is not solvable in the space of real functions of one variable, since the integral of a function does not depend on the value of f at any specific point. Nevertheless, Dirac made calculations with a function named delta that satisfied equation 2, and got correct results (in the space of real functions of a real variable). The idea is now to embed the set of functions in a larger space, where this equation has a solution.

The definition of distributions and a lengthy discussion can be found at MathWorld:

    http://mathworld.wolfram.com/GeneralizedFunction.html

The short version is:

    Generalized functions [a.k.a. distributions] are defined as continuous
    linear functionals <http://mathworld.wolfram.com/Functional.html> over a
    space <http://mathworld.wolfram.com/Space.html> of infinitely
    differentiable functions such that all continuous functions have
    derivatives which are themselves generalized functions.

One clever thing (if I remember correctly) is that the differential operator can be extended to the entire space of distributions, s.t. you (if you calculate in the space of distributions) suddenly can differentiate more functions than you used to.

Hmm. This got pretty long, but I couldn't help myself.

--
Jens Axel Søgaard
From: <sebastian.egner@ph...> - 2003-05-02 15:37:05

Dear people at Schematics,

Mike Sperber forwarded me a short discussion related to using pseudo-random values with a given probability distribution. In other words, given probabilities p[0], .., p[n-1] summing to one, construct a pseudo-random process for values x in {0..n-1} such that Pr{x = k} = p[k].

I would like to bring to your attention that there is a method to draw the next pseudo-random value in O(1), after having prepared a data structure of size O(n). This method is known as the "method of aliases" by A. J. Walker (Electronics Letters 1974). It will be much faster than the code currently used in Schemathics, which uses time O(n) for drawing the next random value.

For convenience, I just wrote some code for this :) If there would be a good implementation for priority queues around, the preparation phase could be improved...

Cheers, Sebastian.

; Random Source for Discrete Distribution
; =======================================
;
; Sebastian.Egner@..., 2-May-2003
; in R5RS Scheme using
;   SRFI-23 (error),
;   SRFI-27 (random sources),
;   SRFI-42 (eager comprehensions)

; load this into Scheme48 0.57 with SRFI-27:
;   ,config ,load srfi-27.scm
;   ,open srfi-27 srfi-23
;   ,load ec.scm ; from srfi.schemers.org, SRFI-42

; (random-source-make-discretes s w)
;   given a source s of random bits in the sense of SRFI-27
;   and a vector w of n >= 1 non-negative real numbers,
;   this procedure constructs and returns a procedure rand
;   such that (rand) returns the next integer x in {0..n-1}
;   from a pseudo random sequence with
;
;     Pr{x = k} = w[k] / Sum(w[i] : i)
;
;   for all k in {0..n-1}.

(define (random-source-make-discretes s w)
  (define (check-weights w)
    (if (not (and (vector? w)
                  (>= (vector-length w) 1)
                  (for-all?-ec (:vector wi w)
                    (and (number? wi) (not (negative? wi))))))
        (error "bad vector of weights" w)))
  (define (make-cutoff-alias w)
    (define eps (expt 10 -6))
    (define n (vector-length w))
    (define cutoff (make-vector n 1))
    (define alias (make-vector n 0))
    (define avg-w (/ (sum-ec (:vector wi w) wi) n))
    (define b (vector-of-length-ec n (:vector wi w) (- wi avg-w)))
    (let loop ((sum-abs-b (sum-ec (:vector bi b) (abs bi))))
      (if (< sum-abs-b eps)
          (list cutoff alias)
          (let ((imin 0) (bmin (vector-ref b 0))
                (imax 0) (bmax (vector-ref b 0)))
            (do-ec (:vector bi (index i) b)
              (begin
                (if (< bi bmin) (begin (set! imin i) (set! bmin bi)))
                (if (> bi bmax) (begin (set! imax i) (set! bmax bi)))))
            (vector-set! cutoff imin (+ 1 (/ bmin avg-w)))
            (vector-set! alias imin imax)
            (vector-set! b imin 0)
            (vector-set! b imax (+ bmin bmax))
            (loop (+ sum-abs-b
                     (- (abs bmin))
                     (- (abs bmax))
                     (+ (abs (vector-ref b imax)))))))))
  (check-weights w)
  (let* ((ca (make-cutoff-alias w))
         (cutoff (car ca))
         (alias (cadr ca))
         (rand-i (random-source-make-integers s))
         (rand-u (random-source-make-reals s))
         (n (vector-length cutoff)))
    (lambda ()
      (let ((i (rand-i n)))
        (if (< (rand-u) (vector-ref cutoff i))
            i
            (vector-ref alias i))))))

; example
;(define r1
;  (random-source-make-discretes
;   default-random-source
;   (vector-ec (: i 0 29) (* 0.1 (expt 0.9 i))))) ; Geo(0.1)
;
; now (r1) -> x in {0..29} with Pr{x = i} = w[i]/Sum(w[j] : j)
; where w[j] = 0.1 * 0.9^j.
;
;(define (hist rand n iters) ; gather a histogram
;  (let ((h (make-vector n 0)))
;    (do-ec (:range i iters)
;           (:let x (rand))
;      (vector-set! h x (+ (vector-ref h x) 1)))
;    h))
;
; ,time (hist r1 30 1000000)
; -> a vector approximating the geometric distribution

--
Dr. Sebastian Egner
Senior Scientist
Philips Research Laboratories
Prof. Holstlaan 4 (postbox WY63, kamer WY 655)
5656 AA Eindhoven, The Netherlands
tel: +31 40 2742069 *** new telephone number
fax: +31 40 2742566 *** new fax number
email: sebastian.egner@...
From: MJ Ray <markj@cl...> - 2003-05-02 13:29:00

Jens Axel Søgaard <jensaxel@...> wrote:
> at all, but a so-called distribution [the set of
> distributions is an extension of the space of functions],

Do you mean distributions? I would have thought that the set of distributions was a restriction of the space of functions, because of the requirements for them to have certain properties. ICBW: it's been far too long.

--
MJR   http://mjr.towers.org.uk/   IM: slef@...
This is my home web site.   This for Jabber Messaging.
How's my writing? Let me know via any of my contact details.
From: Ian Glover <ian@ma...> - 2003-05-01 18:48:31

> I think it's a Markov Process, not a Chain. Ian can
> speak up if he thinks I'm telling lies

Well, if I remember rightly, a Markov chain is a set of observations from a Markov process, so I guess you'd say that the graph is a representation of the process and a path is a chain.

> (don't tell anyone but he's actually a trained mathematician ;)

Drat, my secret is out. So I'll be doubly embarrassed when someone rips the above to pieces. :>
From: Noel Welsh <noelwelsh@ya...> - 2003-05-01 17:30:00

--- Jens Axel Søgaard <jensaxel@...> wrote:
> Ok, the argument is simply the distribution function
> represented as an association list.

Yes.

> Do you know a reference to a definition? I don't
> think graphs of random variables were covered in the
> probability course I had [although I vaguely remember
> calculating eigenvalues for some Markov matrices].

There is always MathWorld:

  http://mathworld.wolfram.com/MarkovChain.html
  http://mathworld.wolfram.com/MarkovProcess.html

I think it's a Markov Process, not a Chain. Ian can speak up if he thinks I'm telling lies (don't tell anyone but he's actually a trained mathematician ;)

> The Dirac delta function is "infinity" at one point
> and zero elsewhere. Well, actually it's not a function
> at all, but a so-called distribution [the set of
> distributions is an extension of the space of functions],
> but that doesn't bother physicists. (I can't blame Dirac;
> the theory of distributions was initiated by his
> work, if I remember correctly.)

I'll take your word for this. ;)

> Thanks - speak up if you find spelling errors and
> the like (English is not my native tongue).

Will do.

> After writing the documentation, I accidentally fell over
> "SRFI 27: Sources of Random Bits", ...
> I think I'll port it this summer, and then use it to
> rewrite some of the functions.

This would be nice as we don't currently have a port. Francisco's the man for this stuff.

cya,
Noel

=====
Email: noelwelsh <at> yahoo <dot> com
Jabber: noelw <at> jabber <dot> org

__________________________________
Do you Yahoo!?
The New Yahoo! Search - Faster. Easier. Bingo.
http://search.yahoo.com