Eulises Ulloa - 2013-07-31

Hello all,

I am working on a project in which I need to set up my network using delay, jitter, and a Pareto distribution. I have run some tests and calculated the jitter of my network according to RFC 3550 (RTP), using the following equations:

If Si is the RTP timestamp from packet i, and Ri is the time of arrival in RTP timestamp units for packet i, then for two packets i and j, D may be expressed as

D(i,j) = (Rj - Ri) - (Sj - Si) = (Rj - Sj) - (Ri - Si)

The interarrival jitter SHOULD be calculated continuously as each data packet i is received from source SSRC_n, using this difference D for that packet and the previous packet i-1 in order of arrival (not necessarily in sequence), according to the formula

J(i) = J(i-1) + (|D(i-1,i)| - J(i-1))/16
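For clarity, this is a minimal sketch (in Python, the function and variable names are my own) of how I turn the send/arrival timestamps into the RFC 3550 jitter estimate:

def rfc3550_jitter(samples):
    """Estimate interarrival jitter as in RFC 3550, section 6.4.1.

    `samples` is a list of (send_timestamp, arrival_timestamp) pairs,
    both in the same time units, given in order of arrival.
    """
    j = 0.0
    prev_transit = None
    for s, r in samples:
        transit = r - s                      # (Ri - Si)
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # |D(i-1, i)|
            j += (d - j) / 16.0              # J(i) = J(i-1) + (|D(i-1,i)| - J(i-1))/16
        prev_transit = transit
    return j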

My proof of concept consists of setting up the delay and jitter in WANem and pinging other computers in my network. The problem I am facing is that, after analyzing my data, I always find a jitter different from the one I set up.
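For reference, this is roughly how I analyze the ping data (a minimal sketch; rtts_ms is just a placeholder for the round-trip times I extract from the ping output, and I realize that RTT variation mixes both directions, so it is only an approximation of one-way jitter):

# rtts_ms stands in for the round-trip times pulled from `ping`
rtts_ms = [10.2, 12.8, 9.7, 15.1, 11.3]

j = 0.0
prev = None
for rtt in rtts_ms:
    if prev is not None:
        # apply the same 1/16 smoothing to successive RTT differences
        j += (abs(rtt - prev) - j) / 16.0
    prev = rtt
print("smoothed RTT variation (ms):", j)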

I suspect that my definition of jitter is not the same as the one WANem implements. Do you know how WANem defines jitter, and also how it uses it for the Pareto distribution?

Thanks in advance