Description:
The purpose of this project will be to simulate the performance of a first-in-first-out (FIFO)
queue with fixed-size packets and Markov arrivals to the queue. The simulation will be
performed using only one input source to the queue.
The following parameters will be used throughout the entire project:
c = link speed = 10 Mbps
p = packet size = 4000 bits
Simulation of a FIFO Queue with a Single Source:
Write your own simulation program for the queue.
You should write a program that does the low-level management of an event list as described in
the lecture.
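One way to sketch the low-level event-list management is a min-heap keyed on event time; the class and method names below (`EventList`, `push`, `pop_next`) are illustrative, not taken from the lecture:

```python
import heapq

class EventList:
    """Minimal future-event list: a min-heap ordered by event time."""
    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker so equal-time events pop in insertion order

    def push(self, time, event):
        heapq.heappush(self._heap, (time, self._seq, event))
        self._seq += 1

    def pop_next(self):
        """Remove and return the (time, event) pair with the smallest time."""
        time, _, event = heapq.heappop(self._heap)
        return time, event

    def __len__(self):
        return len(self._heap)

# events may be pushed out of order; they always pop in time order
evl = EventList()
evl.push(0.5, 'departure')
evl.push(0.2, 'arrival')
while evl:
    print(evl.pop_next())
```

The sequence counter keeps the heap comparison away from the event payload, so simultaneous events never raise a comparison error.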
Your purpose will be to simulate the performance of the queue for various packet arrival rates.
Within one simulation run, you must simulate many packets and find their average response
time. Then you should perform multiple iterations of the simulation to compute the average
response time of packets through the M/D/1 queue across several simulation runs.
Use the following parameters:
λ = arrival rates = 1000 to 2000 packets per second, in steps of 100
10 simulation runs per arrival rate to compute the average response times
The simulation time should be long enough to simulate 1000 packet arrivals.
To generate the exponential random numbers needed to create the interarrival times for the
packets, you can use your programming environment's random number generator. Use the
function that generates a random number X uniformly between 0 and 1. Then use the following
inverse-transform formula to create an exponentially distributed random number Y, where ln(X)
is the natural logarithm function:
Y = -ln(X) / λ
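As a quick sanity check of the inverse transform Y = -ln(X)/λ: the sample mean of many such draws should approach 1/λ. A minimal sketch (the helper name `exp_sample` is illustrative):

```python
import math
import random

def exp_sample(lnda):
    """Inverse-transform sampling: X ~ U(0,1) -> Y = -ln(X)/lambda."""
    return -math.log(random.random()) / lnda

random.seed(0)  # fixed seed so the check is reproducible
lnda = 1000.0
samples = [exp_sample(lnda) for _ in range(100000)]
mean = sum(samples) / len(samples)
print(mean)  # should be close to 1/lambda = 0.001 seconds
```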
Problem 01:
A single plot that includes the response time for each simulation run at each arrival rate,
the average response time at each arrival rate, and the theoretical response time as
given by the M/D/1 formula
T = (2M - λM²) / (2(1 - λM))
where M = p/c is the deterministic service time and λ is the arrival rate.
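With the project parameters, M = p/c = 4000/10^7 = 0.4 ms, and the theoretical formula can be spot-checked numerically; the equivalent form T = M + λM²/(2(1 − λM)) is used below (the function name `md1_response` is illustrative):

```python
c = 10_000_000      # link speed, bits per second (10 Mbps)
p = 4000            # packet size, bits
M = p / c           # deterministic service time, seconds (0.4 ms)

def md1_response(lnda):
    """Mean M/D/1 response time: service time plus mean queueing delay."""
    rho = lnda * M  # utilization; must be < 1 for a stable queue
    return M + lnda * M * M / (2.0 * (1.0 - rho))

print(md1_response(1000) * 1000)  # ms at rho = 0.4
print(md1_response(2000) * 1000)  # ms at rho = 0.8 -> 1.2 ms
```

At λ = 2000 packets/s the utilization is 0.8 and the predicted response time is 1.2 ms, which sets the scale for the simulated curve.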

#!/usr/bin/env python
# Copyright (C) 2010-11-13 by Antonio081014
import random
import numpy
import matplotlib.pyplot as plt

LinkRate = 10000000                      # link speed c, bits per second (10 Mbps)
pkgSize = 4000                           # packet size p, bits
simTimes = 10                            # number of simulation runs per arrival rate
longda = numpy.arange(1000, 2001, 100)   # arrival rates, packets per second
pkgNum = 2000                            # packet arrivals per run
servTime = 1. * pkgSize / LinkRate       # deterministic service time, seconds
M = servTime

def getArrivalTime(lnda):
    """Exponentially distributed interarrival time via the inverse transform."""
    r = random.random()
    return -numpy.log(r) / lnda

def performance(rate):
    """Simulate one run; return the mean response time per packet (seconds)."""
    startTime = 0.
    deptTime = servTime
    totalTime = servTime
    for i in range(1, pkgNum):
        startTime += getArrivalTime(rate)
        if deptTime - startTime <= 0.:
            # queue empty on arrival: service starts immediately
            deptTime = startTime + servTime
            totalTime += servTime
        else:
            # queue busy: wait for all earlier packets, then get served
            totalTime += deptTime + servTime - startTime
            deptTime += servTime
    return 1. * totalTime / pkgNum

def simulate():
    ret = numpy.zeros((simTimes, longda.shape[0]), 'float')
    for count in range(simTimes):
        for r in range(longda.shape[0]):
            ret[count][r] = performance(longda[r])
    return ret * 1000.  # seconds -> milliseconds

def getTheory(lnda):
    """Theoretical M/D/1 response time: T = (2M - lambda*M^2) / (2*(1 - lambda*M))."""
    ld = numpy.zeros(lnda.shape[0], 'float')
    for i in range(lnda.shape[0]):
        d = lnda[i]
        ld[i] = (2.*M - d*(M**2)) / (2.*(1. - d*M))
    return ld * 1000.  # seconds -> milliseconds

def plotSimulate(record):
    m, n = record.shape
    temp = numpy.arange(n)*100 + 1000
    plt.plot(temp, numpy.sum(record, axis=0)/m, 'g', label='Simulated Average Response Time')
    plt.plot(temp, getTheory(longda), 'r', label='Theoretical Average Response Time')
    plt.plot(temp, record.T, 'b*')  # individual runs at each arrival rate
    plt.legend()
    plt.xlabel('Arrival Rate (packets per second)')
    plt.ylabel('Response Time (ms)')
    plt.title('Simulated Response Time for an M/D/1 Queue')

if __name__ == '__main__':
    record = simulate()
    plt.ion()
    plt.figure()
    plotSimulate(record)
    plt.savefig('SimulationT.png')
Problem 02:
Two plots of actual queue fill with respect to time for one simulation. Show the queue
fill at the time of each packet arrival before the packet is entered into the queue or
serviced. The first plot should be for an arrival rate of 2000 packets per second, and
the second plot should be for an arrival rate of 3000 packets per second.
The result figure:

#!/usr/bin/env python
# Copyright (C) 2010-11-13 by Antonio081014
# Queue fill at each packet arrival, for a single specified arrival rate
import random
import numpy
import matplotlib.pyplot as plt

LinkRate = 10000000                  # link speed c, bits per second (10 Mbps)
pkgSize = 4000                       # packet size p, bits
longda = numpy.array([2000, 3000])   # arrival rates, packets per second
pkgNum = 1000                        # packet arrivals per run
servTime = 1. * pkgSize / LinkRate   # deterministic service time, seconds

def getArrivalTime(lnda):
    """Exponentially distributed interarrival time via the inverse transform."""
    r = random.random()
    return -numpy.log(r) / lnda

def performance(rate):
    """Simulate one run; return per-packet arrival and departure times (ms)."""
    start = numpy.zeros(pkgNum, 'float')
    dept = numpy.zeros(pkgNum, 'float')
    startTime = 0.
    deptTime = servTime
    start[0] = startTime
    dept[0] = deptTime
    for i in range(1, pkgNum):
        startTime += getArrivalTime(rate)
        if deptTime - startTime <= 0.:
            deptTime = startTime + servTime   # queue empty: serve immediately
        else:
            deptTime += servTime              # queue busy: wait in FIFO order
        start[i] = startTime
        dept[i] = deptTime
    return start*1000., dept*1000.

def simCount(rate):
    """Queue fill seen by each arriving packet (earlier packets not yet departed)."""
    start, dept = performance(rate)
    count = numpy.zeros(pkgNum)
    for i in range(pkgNum):
        for j in dept[:i]:
            if j > start[i]:
                count[i] += 1
    return count

def plotSimulate(rate):
    record = simCount(rate)
    temp = numpy.arange(record.shape[0])
    plt.plot(temp, record, 'r*', label="Queue fill at each packet's arrival")
    plt.legend()
    plt.xlabel('Packet Number')
    plt.ylabel('Number of packets in the queue on arrival')
    plt.title('Queue fill on arrival, arrival rate = ' + str(rate) + ' packets/s')

if __name__ == '__main__':
    plt.ion()
    plt.figure()
    plotSimulate(longda[0])
    plt.savefig('Rate01.png')
    plt.figure()
    plotSimulate(longda[1])
    plt.savefig('Rate02.png')
The result figures:
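The nested loop in simCount is O(n²). Because the FIFO queue's departure times are non-decreasing, the same count can be obtained in one vectorized pass with a binary search; a sketch (the function name `queue_fill` is illustrative, and it assumes the `start`/`dept` arrays produced by performance):

```python
import numpy as np

def queue_fill(start, dept):
    """Packets still in the system at each arrival instant.

    Assumes dept is non-decreasing (true for a single-server FIFO queue):
    packet i finds in the queue every earlier packet whose departure time
    exceeds its own arrival time.
    """
    start = np.asarray(start)
    dept = np.asarray(dept)
    # number of departures at or before each arrival, via binary search;
    # later packets always depart after start[i], so they never miscount
    departed = np.searchsorted(dept, start, side='right')
    return np.arange(len(start)) - departed

# tiny worked example: arrivals at t=0,1,2; departures at t=4,5,6
print(queue_fill([0.0, 1.0, 2.0], [4.0, 5.0, 6.0]))  # [0 1 2]
```

Using side='right' treats a departure that coincides exactly with an arrival as already gone, matching the strict `j > start[i]` test in simCount.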