CS 2401 Assignment #8

Due Date: Monday, Nov 14 or Tuesday, Nov 15, depending on the day of your lab.

Objective: The goal of this assignment is to practice the use of queues.

High-performance computing clusters provide computational resources for demanding computing tasks. Each computational task is called a job, and jobs may take varying amounts of time to complete. The cluster has a limited number of machines available to complete jobs. As jobs come in, they are placed in a queue and run as machines become available. Your task is to simulate job allocation for a simple computing cluster and answer some questions about the cluster's performance.

Your simulation will be based on 1000 discrete time units (for example, each time step might represent 1 second of real time). Each job requires between 1 and 10 time units to complete. New jobs arrive over time, with some randomness. In your simulation, the number of new jobs that arrive on each time step will vary between 1 and 5.

To represent the available computer resources you will use an array of integers of length n, where n is the number of computers available. The number stored in each position of the array represents the number of time steps remaining until the job currently assigned to that machine is completed. For example, if a job requiring 6 time steps is assigned to machine 4, the counter for that machine starts at 6 and counts down to 0 before the machine can be assigned a new job. The jobs waiting to be processed will be stored in a queue of integers, where each integer represents the number of time steps required to process that job.
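As a concrete illustration of this representation, here is a minimal sketch in Java (the class name, the helper method `assignJob`, and the use of `java.util.ArrayDeque` in place of the course's queue ADT are all illustrative, not required by the assignment):

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class ClusterSetup {
    // Assigns the next waiting job (if any) to the given machine, but only
    // if that machine is free (its counter is 0).
    static void assignJob(int[] machines, int machine, Queue<Integer> jobs) {
        if (machines[machine] == 0 && !jobs.isEmpty()) {
            machines[machine] = jobs.remove(); // counter starts at the job's length
        }
    }

    public static void main(String[] args) {
        int n = 10;                                // n machines (illustrative value)
        int[] machines = new int[n];               // time steps left per machine; 0 = free
        Queue<Integer> jobs = new ArrayDeque<>();  // waiting jobs, by processing time

        jobs.add(6);                               // a job needing 6 time steps arrives
        assignJob(machines, 4, jobs);              // machine 4's counter is now 6
        System.out.println(machines[4]);
    }
}
```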

To execute the simulation, process each time step inside a for loop. For each time step, perform the following steps:

1. Generate the new jobs arriving during this time step (between 1 and 5 of them), each requiring between 1 and 10 time units, and enqueue them.
2. Decrement the counter of every busy machine (every machine whose counter is greater than 0).
3. For each machine that is now free (counter equal to 0), if the queue is not empty, dequeue the next job and set that machine's counter to the job's required time.
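One possible sketch of this per-step loop in Java (the class name, the `run` method, the fixed random seed, and the use of `java.util.ArrayDeque` in place of the course queue ADT are illustrative; the within-step ordering of arrivals, countdown, and assignment is one reasonable choice and you should state whichever order you use):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.Random;

public class ClusterSim {
    // Runs the 1000-step simulation for n machines and returns the jobs
    // still waiting in the queue at the end.
    static Queue<Integer> run(int n, long seed) {
        int[] machines = new int[n];              // remaining time per machine; 0 = free
        Queue<Integer> jobs = new ArrayDeque<>(); // waiting jobs, by processing time
        Random rand = new Random(seed);           // fixed seed for repeatable runs

        for (int t = 0; t < 1000; t++) {
            // Step 1: between 1 and 5 new jobs arrive, each needing 1-10 time units.
            int arrivals = 1 + rand.nextInt(5);
            for (int i = 0; i < arrivals; i++) {
                jobs.add(1 + rand.nextInt(10));
            }
            // Step 2: every busy machine works for one time step.
            for (int m = 0; m < n; m++) {
                if (machines[m] > 0) machines[m]--;
            }
            // Step 3: each machine that is now free takes the next waiting job.
            for (int m = 0; m < n; m++) {
                if (machines[m] == 0 && !jobs.isEmpty()) {
                    machines[m] = jobs.remove();
                }
            }
        }
        return jobs;
    }

    public static void main(String[] args) {
        System.out.println("Jobs still waiting after 1000 steps: " + run(10, 42).size());
    }
}
```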

Now, suppose that you are operating the cluster and want to decide how many machines you need to purchase. You run the simulation to determine the performance of the cluster with different numbers of machines: 5, 10, 15, 20, 25, and 30. You are interested in two important factors. The first is the average length of the queue of jobs; the second is the maximum length the queue reaches. Add code to track both of these values, then run your simulation for 1000 time steps for each number of machines listed above. Record your observations in a table and turn it in with your code.
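A hedged sketch of how the tracking might be added (the class name, the `simulate` method, its return convention, and the fixed seed are all illustrative assumptions, not the required design):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.Random;

public class ClusterStats {
    // Runs one 1000-step simulation for n machines and returns
    // {maximum queue length, sum of queue lengths over all steps}.
    static long[] simulate(int n, long seed) {
        int[] machines = new int[n];
        Queue<Integer> jobs = new ArrayDeque<>();
        Random rand = new Random(seed);
        long max = 0, total = 0;
        for (int t = 0; t < 1000; t++) {
            int arrivals = 1 + rand.nextInt(5);            // 1-5 new jobs
            for (int i = 0; i < arrivals; i++) {
                jobs.add(1 + rand.nextInt(10));            // each needs 1-10 units
            }
            for (int m = 0; m < n; m++) {
                if (machines[m] > 0) machines[m]--;        // busy machines work
            }
            for (int m = 0; m < n; m++) {
                if (machines[m] == 0 && !jobs.isEmpty()) { // free machines take jobs
                    machines[m] = jobs.remove();
                }
            }
            // Record queue statistics at the end of each time step.
            max = Math.max(max, jobs.size());
            total += jobs.size();
        }
        return new long[]{max, total};
    }

    public static void main(String[] args) {
        System.out.println("machines  avgQueue  maxQueue");
        for (int n : new int[]{5, 10, 15, 20, 25, 30}) {
            long[] r = simulate(n, 42);
            System.out.printf("%8d  %8.1f  %8d%n", n, r[1] / 1000.0, r[0]);
        }
    }
}
```

The average is the running total of the queue length divided by the number of time steps; sampling at the end of each step is one reasonable convention, and you should note which point in the step you sample at.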

Write the main part of the program so that it uses only the ADT operations enqueue, dequeue, size, and isEmpty. Then show that your program works with both queue implementations: as a linked list and as an array.
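One way to keep the main program independent of the implementation is to code against an interface exposing exactly those four operations. The sketch below is illustrative (the names `IntQueue` and `LinkedIntQueue` are assumptions, and only the linked-list version is shown; the array-backed version would implement the same interface):

```java
// The main program should compile against only these four operations,
// so either implementation can be swapped in without other changes.
interface IntQueue {
    void enqueue(int x);
    int dequeue();
    int size();
    boolean isEmpty();
}

// Linked-list implementation sketch: enqueue at the rear, dequeue at the front.
class LinkedIntQueue implements IntQueue {
    private static class Node { int val; Node next; Node(int v) { val = v; } }
    private Node front, rear;
    private int count;

    public void enqueue(int x) {
        Node node = new Node(x);
        if (rear == null) front = rear = node;  // queue was empty
        else { rear.next = node; rear = node; }
        count++;
    }

    public int dequeue() {
        int v = front.val;
        front = front.next;
        if (front == null) rear = null;         // queue became empty
        count--;
        return v;
    }

    public int size() { return count; }
    public boolean isEmpty() { return count == 0; }
}

public class QueueDemo {
    public static void main(String[] args) {
        IntQueue q = new LinkedIntQueue();
        q.enqueue(3);
        q.enqueue(7);
        System.out.println(q.dequeue()); // 3 (FIFO order)
        System.out.println(q.size());    // 1
    }
}
```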