Objective: The goal of this assignment is to practice the use of queues.
High-performance computing clusters provide computational resources for demanding computing tasks. Each computational task is called a job, and jobs may take varying amounts of time to complete. The cluster has a limited number of machines available to run jobs. As jobs come in, they are placed in a queue and completed when a machine becomes available. Your task is to simulate the job allocation for a simple computing cluster and answer some questions about the performance of this cluster.
Your simulation will run for 1000 discrete time steps (for example, each time step might represent 1 second of real time). Each job requires between 1 and 10 time units to complete. New jobs arrive over time, with some randomness: on each time step, between 1 and 5 new jobs arrive.
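The random arrivals described above can be sketched as follows; this is one possible approach (the function name `new_jobs` is illustrative, not required by the assignment), using Python's standard `random` module:

```python
import random

def new_jobs():
    """Return this time step's arrivals: 1 to 5 jobs,
    each requiring 1 to 10 time units to complete."""
    count = random.randint(1, 5)          # how many jobs arrive this step
    return [random.randint(1, 10) for _ in range(count)]

arrivals = new_jobs()
print(arrivals)
```

Calling `new_jobs()` once per time step inside your main loop gives the stream of incoming work for the simulation.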
To represent the available computing resources you will use an array of integers of length n, where n is the number of machines available. The number stored at each position in the array represents the number of time steps remaining until the machine's current job is completed. For example, if a job requiring 6 time steps is assigned to machine 4, the counter for that machine starts at 6 and counts down to 0 before the machine can be assigned a new job. The jobs waiting to be processed will be stored in a queue of integers, where each integer is the number of time steps required to process that job.
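The two data structures above might be set up as in the following sketch (the value of `n` and the machine index are illustrative; a `deque` stands in for the queue type your course provides):

```python
from collections import deque

n = 4                  # example: a cluster with 4 machines
machines = [0] * n     # countdown per machine; 0 means the machine is idle
jobs = deque()         # waiting jobs; each entry is a job's required time units

jobs.append(6)         # a job needing 6 time units arrives and waits in the queue

# assign the front job to an idle machine (here, machine at index 3)
if jobs and machines[3] == 0:
    machines[3] = jobs.popleft()

print(machines)        # machine 3 will now count down from 6
```

Using a queue guarantees first-come, first-served order: `popleft()` always removes the job that has been waiting longest.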
To execute the simulation, process each time step using a for loop. Within each iteration, perform the following steps: