Ross's conjecture

In queueing theory, a discipline within the mathematical theory of probability, Ross's conjecture gives a lower bound for the average waiting time experienced by a customer when arrivals to the queue do not follow the simplest model for random arrivals. It was proposed by Sheldon M. Ross in 1978 and proved in 1981 by Tomasz Rolski. Equality can be attained in the bound, and the bound does not hold for finite-buffer queues.[1]

Bound

Ross's conjecture is a bound for the mean delay in a queue where arrivals are governed by a doubly stochastic Poisson process[2] or by a non-stationary Poisson process.[3][4] The conjecture states that the average amount of time that a customer spends waiting in a queue is greater than or equal to

\frac{\lambda \operatorname{E}(S^2)}{2(1 - \lambda \operatorname{E}(S))}

where S is the service time and λ is the average arrival rate (in the limit as the length of the time period increases).
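
As a rough numerical illustration (not from the source), the Python sketch below evaluates this lower bound for given values of λ, E(S) and E(S²); the function name, the parameter values, and the choice of an exponential service-time distribution in the example are assumptions made only for the sake of the demonstration.

    def ross_lower_bound(arrival_rate, mean_service, second_moment_service):
        """Lower bound on mean waiting time: lambda*E[S^2] / (2*(1 - lambda*E[S]))."""
        rho = arrival_rate * mean_service  # offered load; must be < 1 for stability
        if rho >= 1:
            raise ValueError("unstable queue: need lambda * E[S] < 1")
        return arrival_rate * second_moment_service / (2 * (1 - rho))

    # Example (assumed values): exponential service with mean 1, so E[S^2] = 2,
    # and average arrival rate 0.8
    print(ross_lower_bound(0.8, 1.0, 2.0))  # -> 4.0

Under the conjecture, any queue with a doubly stochastic or non-stationary Poisson arrival stream of the same average rate and the same service-time distribution has a mean waiting time of at least this value.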

Notes and References

  1. .
  2. .
  3. .
  4. .