A comparison of client-server and P2P file distribution delays
In this problem, you'll compare the time needed to distribute a file that is initially located at a server to clients via either client-server download or peer-to-peer download. Before beginning, you might want to first review Section 2.5 and the discussion surrounding Figure 2.22 in the text.
The problem is to distribute a file of size F = 9 Gbits from the server to each of 10 peers. Suppose the server has an upload rate of uS = 93 Mbps.
The 10 peers have upload rates of: u1 = 24 Mbps, u2 = 17 Mbps, u3 = 28 Mbps, u4 = 20 Mbps, u5 = 16 Mbps, u6 = 13 Mbps, u7 = 10 Mbps, u8 = 22 Mbps, u9 = 13 Mbps, and u10 = 23 Mbps
The 10 peers have download rates of: d1 = 17 Mbps, d2 = 18 Mbps, d3 = 31 Mbps, d4 = 35 Mbps, d5 = 12 Mbps, d6 = 28 Mbps, d7 = 35 Mbps, d8 = 24 Mbps, d9 = 35 Mbps, and d10 = 32 Mbps
Question List
1. What is the minimum time needed to distribute this file from the central server to the 10 peers using the client-server model?
2. For the previous question, what is the root cause of this specific minimum time? Answer 's' (the server) or 'ci', where i is the client's number.
3. What is the minimum time needed to distribute this file using peer-to-peer download?
4. For question 3, what is the root cause of this specific minimum time: the server (s), a client (c), or the combined upload of the clients and the server (cu)?
Solution
1. The minimum time needed to distribute the file = max{ N·F/uS, F/dmin } = max{ 10 × 9000/93, 9000/12 } = max{ 967.74, 750 } = 967.74 seconds. (Here F = 9 Gbits = 9000 Mbits and dmin = d5 = 12 Mbps.)
2. The root cause of the minimum time was s: the server's upload link is the bottleneck, since N·F/uS = 967.74 s exceeds F/dmin = 750 s.
3. The minimum time needed to distribute the file = max{ F/uS, F/dmin, N·F/(uS + Σ ui) } = max{ 9000/93, 9000/12, 90000/(93 + 186) } = max{ 96.77, 750, 322.58 } = 750 seconds.
4. The root cause of the minimum time was c: the slowest client's download link (d5 = 12 Mbps) limits the distribution time.
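The two delay formulas above can be checked with a short script. This is a minimal sketch (variable names are my own, chosen to mirror the notation in Section 2.5); it plugs in the rates given in the problem statement:

```python
# Distribution delays from Section 2.5 of the text:
#   client-server: D_cs  = max( N*F/uS, F/dmin )
#   peer-to-peer:  D_p2p = max( F/uS, F/dmin, N*F/(uS + sum(ui)) )

F = 9_000          # file size in Mbits (9 Gbits)
uS = 93            # server upload rate, Mbps
u = [24, 17, 28, 20, 16, 13, 10, 22, 13, 23]   # peer upload rates, Mbps
d = [17, 18, 31, 35, 12, 28, 35, 24, 35, 32]   # peer download rates, Mbps
N = len(u)

d_cs = max(N * F / uS, F / min(d))
d_p2p = max(F / uS, F / min(d), N * F / (uS + sum(u)))

print(f"client-server: {d_cs:.2f} s")   # 967.74 s, limited by the server's upload
print(f"peer-to-peer:  {d_p2p:.2f} s")  # 750.00 s, limited by the slowest download (d5)
```

Note that in the P2P case the combined upload term N·F/(uS + Σui) is only 322.58 s, well below F/dmin = 750 s, which is why the bottleneck shifts to the slowest client's download link.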