# Simulation of the fluid system with long-range dependent input

**Oleg V. Lukashenko (IAMR KarSC RAS, Petrozavodsk, Russia), Mikhail J. Nasadkin (Petrozavodsk State University, Russia)**

Gaussian processes are now well-recognized models for describing the traffic dynamics of a wide class of modern telecommunication networks. We discuss the application of simulation to estimate the loss or overflow probability in a queuing system with a finite or infinite buffer that is fed by a Gaussian input. We mainly consider fractional Brownian input (fBi), because it possesses properties such as self-similarity and long-range dependence that network traffic usually exhibits. In the infinite-buffer case, analysis of the queue-length distribution reduces to the analysis of extremes of Gaussian processes. Unfortunately, no explicit expressions are available so far for general Gaussian processes; such results exist only for specific cases. There are asymptotic results for fBi (the so-called large-buffer and many-sources asymptotics), which are based on large deviation theory, but this theory cannot be applied when the buffer is of small or moderate size. Explicit or asymptotic results for finite-buffer systems are therefore much scarcer (at least for non-Markovian systems), and simulation is often the only way to calculate the corresponding performance measures. In this framework, loss rate estimation may serve as a useful tool for providing a suitable level of Quality of Service.
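To make the simulation approach concrete, the following is a minimal sketch (not the authors' implementation) of Monte Carlo estimation of the overflow probability for a discrete-time queue fed by fractional Brownian input. Fractional Gaussian noise increments are sampled exactly via a Cholesky factorization of their covariance matrix (simple but O(n^3); faster exact methods such as Davies-Harte exist), and the workload is driven by the Lindley recursion. All parameter values (Hurst index, rates, buffer level) are illustrative assumptions.

```python
import numpy as np

def fgn_cov(n, H):
    """Covariance matrix of n steps of fractional Gaussian noise with Hurst index H."""
    k = np.arange(n)
    g = 0.5 * (np.abs(k + 1)**(2 * H) - 2 * np.abs(k)**(2 * H) + np.abs(k - 1)**(2 * H))
    # Stationary increments => Toeplitz structure indexed by lag |i - j|
    return g[np.abs(np.subtract.outer(k, k))]

def overflow_probability(H=0.8, mean_rate=1.0, service_rate=1.2,
                         buffer_level=3.0, n=200, reps=500, seed=0):
    """Monte Carlo estimate of P(workload exceeds buffer_level within n slots)."""
    rng = np.random.default_rng(seed)
    # Exact fGn sampler: Cholesky factor of the increment covariance
    L = np.linalg.cholesky(fgn_cov(n, H))
    overflows = 0
    for _ in range(reps):
        noise = L @ rng.standard_normal(n)   # one fGn sample path
        arrivals = mean_rate + noise         # traffic brought in each slot
        q, hit = 0.0, False
        for a in arrivals:
            # Lindley recursion: workload after service at constant rate
            q = max(0.0, q + a - service_rate)
            if q > buffer_level:
                hit = True
        overflows += hit
    return overflows / reps
```

A larger Hurst index H increases long-range dependence, so sample paths stay above the mean for longer stretches and the estimated overflow probability grows, which is exactly the effect that makes fBi queues hard to treat analytically.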