
Department of Computer Science


Regenerative simulation of the overflow probability in the finite buffer queue with Brownian input

Dr. Evsey V. Morozov, Ruslana S. Goricheva, Oleg V. Lukashenko (IAMR KarSC RAS, Petrozavodsk, Russia), Prof. Michele Pagano (University of Pisa, Pisa, Italy)

We discuss the application of regenerative simulation to estimate the overflow probability in a queueing system with a finite buffer fed by a Brownian input. Brownian input is an important particular case of the Gaussian processes, which are now a well-recognized model for the traffic dynamics of a wide class of modern telecommunication networks. Overflow probability estimation is usually handled by the so-called Large Deviation Theory (LDT), which is based on a complicated mathematical technique and uses functions that are hard to calculate or estimate. In this work we do not use LDT, since we are mainly interested in systems with a small or moderate buffer size. Indeed, interactive multimedia applications have stringent requirements in terms of end-to-end delay and delay jitter; since queueing delay is a major component of the overall delay, the infinite buffer approximation is not a realistic assumption for real networks. In this framework, loss rate prediction may represent a useful tool for defining new call admission control algorithms in QoS-supporting network architectures and for QoS routing protocols.
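To make the setting concrete, the following sketch simulates a discrete-time approximation of the finite-buffer queue with Brownian input: the net input per step is a Gaussian increment (Brownian arrivals minus a constant service rate), the workload follows a Lindley-type recursion truncated at the buffer size b, and the overflow probability is estimated as the fraction of steps in which arriving work is lost. All function names and parameter values here are illustrative, not taken from the work itself.

    import numpy as np

    # Discrete-time approximation of the finite-buffer fluid queue.
    # "drift" is the mean input rate minus the service rate; "sigma" is
    # the variance coefficient of the Brownian input (assumed values).
    def simulate_overflow_fraction(n_steps, b, drift, sigma, dt=1.0, seed=0):
        rng = np.random.default_rng(seed)
        # Net input per step: Gaussian increment of drifted Brownian
        # motion, minus the work served during the step.
        x = rng.normal(loc=drift * dt, scale=sigma * np.sqrt(dt), size=n_steps)
        w = 0.0                # current workload (buffer content)
        overflow_steps = 0
        for xi in x:
            w_next = w + xi
            if w_next > b:     # arriving work does not fit: loss occurs
                overflow_steps += 1
                w_next = b     # excess work is discarded (finite buffer)
            w = max(w_next, 0.0)
        return overflow_steps / n_steps

    # Stable regime (negative drift), moderate buffer size.
    print(simulate_overflow_fraction(n_steps=10**6, b=5.0, drift=-0.1, sigma=1.0))

With a suitably small time step dt, this recursion approximates the workload process reflected on [0, b] that the abstract considers.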

The overflow process typically has a very complicated dependence structure, which makes the evaluation of its parameters a very hard problem in the framework of classical statistics. Fortunately, the Brownian process has independent increments, and thus the stationary performance of the considered model can be accurately estimated by means of the well-developed regenerative simulation technique. In particular, we develop confidence estimation of the stationary overflow probability using classical regenerations, which occur when the system (server and buffer) becomes empty. Some numerical results will be included. As a side result, in this way it will be possible to evaluate the goodness of LDT results and their applicability to real networks.
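A hedged sketch of this regenerative confidence estimation, under the same discrete-time assumptions as above: the sample path is split into i.i.d. cycles at the classical regeneration points (the workload hits zero), the overflow probability is estimated by the ratio estimator over cycles, and a confidence interval follows from the central limit theorem for regenerative processes. Again, names and parameters are illustrative.

    import numpy as np

    def regenerative_overflow_ci(b, drift, sigma, n_cycles, dt=1.0, z=1.96, seed=1):
        rng = np.random.default_rng(seed)
        y, tau = [], []        # overflow counts and lengths of the cycles
        w, y_c, t_c = 0.0, 0, 0
        while len(tau) < n_cycles:
            xi = rng.normal(drift * dt, sigma * np.sqrt(dt))
            w_next = w + xi
            if w_next > b:
                y_c += 1
                w_next = b
            w = max(w_next, 0.0)
            t_c += 1
            if w == 0.0:       # classical regeneration: system is empty
                y.append(y_c)
                tau.append(t_c)
                y_c, t_c = 0, 0
        y = np.asarray(y, dtype=float)
        tau = np.asarray(tau, dtype=float)
        p_hat = y.sum() / tau.sum()   # ratio estimator of the overflow probability
        z_j = y - p_hat * tau         # centred per-cycle variables
        half = z * z_j.std(ddof=1) / (tau.mean() * np.sqrt(n_cycles))
        return p_hat, half

    p, half = regenerative_overflow_ci(b=5.0, drift=-0.1, sigma=1.0, n_cycles=20000)
    print(f"estimated overflow probability: {p:.3e} +/- {half:.1e} (95% CI)")

Because the cycles are independent and identically distributed, no burn-in period or batching heuristics are needed, which is the practical appeal of the regenerative approach.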