
Tuesday, May 11, 2010

How can I calculate an optimal UDP packet size for a data stream?

Programmer Question

Short radio link with a data source attached, needing a throughput of 1280 Kbps over IPv6 with a UDP stop-and-wait protocol; no other clients or noticeable noise sources in the area. How on earth can I calculate the best packet size to minimise overhead?



UPDATE



Thought it would be an idea to show my working so far:
IPv6 has a 40 B header, so including the ACK responses, that's 80 B of overhead per packet.
To meet the throughput requirement, 1280K/p packets need to be sent each second, where p is the packet payload size.



So by my reckoning the total overhead per second is (1280K/p) * 80, and throwing that into Wolfram gives a monotonically decreasing function with no minimum, so no 'optimal' value.
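
To make the units explicit, here's that calculation as a quick Python sketch (I'm assuming 1280 Kbps means 1,280,000 bits/s and measuring p in bytes; the constants are mine):

    # Sketch of the working above. Assumes 1280 Kbps = 1,280,000 bits/s,
    # p measured in bytes, and (as above) only the 40 B IPv6 headers on the
    # data packet and its ACK counted as overhead (UDP's 8 B header ignored).
    THROUGHPUT_BPS = 1280 * 1000
    OVERHEAD_BYTES = 80  # 40 B IPv6 header on the data packet + 40 B on the ACK

    def overhead_bytes_per_second(p):
        packets_per_second = THROUGHPUT_BPS / (8 * p)  # payload bits -> packets
        return packets_per_second * OVERHEAD_BYTES

    # Strictly decreasing in p, hence no interior minimum. 1452 B is the
    # largest UDP payload that fits a 1500 B Ethernet MTU under IPv6.
    for p in (128, 256, 512, 1024, 1452):
        print(p, round(overhead_bytes_per_second(p)))

Which just confirms the problem: the bigger the packet, the lower the overhead, right up to the MTU.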



I did a lot more maths trying to shoehorn bit error rate calculations in there, but came up against the same thing; if there's no minimum, how do I choose the optimal value?
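
For reference, this is the shape of the model I was trying to get the BER into. The error rate below is made up for illustration, and the sketch assumes independent bit errors and that ACKs are never lost:

    # Sketch only: expected bytes on the wire per payload byte delivered,
    # for stop-and-wait with independent bit errors. The BER is hypothetical.
    BER = 1e-5        # made-up bit error rate, just for illustration
    IP_HEADER = 40    # IPv6 header, bytes
    ACK_SIZE = 40     # bare-header ACK, as in the working above

    def cost_per_payload_byte(p):
        frame = p + IP_HEADER                # data packet size on the wire
        p_ok = (1 - BER) ** (8 * frame)      # probability the packet survives
        # Stop-and-wait resends until success, so a packet costs frame/p_ok
        # bytes on average, plus one ACK (assumed never lost).
        return (frame / p_ok + ACK_SIZE) / p

    best = min(range(64, 4096), key=cost_per_payload_byte)
    print("lowest expected per-byte cost at payload size", best)

(Whether that minimum lands anywhere sensible obviously depends on the real BER, which I don't have a good figure for.)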



Thanks guys.


