Optimal Workload Allocation in a Network of Computers with Single Class Job

  •  Rahela Rahim    
  •  Ku Ku-Mahamud    


Queueing models with multiple queues and multiple servers are used to model workload allocation problems in a network of computers. This paper presents the problem of determining the optimal allocation of a workload of single-class jobs to a set of parallel computers using an optimization technique. The generalized exponential (GE) distribution is used to represent general interarrival and service time distributions, since different jobs have different traffic characteristics. A closed-form expression for the optimal job arrival rates is derived from a non-linear optimization problem whose objective function is based on queueing theory. Analysis of the recomputed allocation shows an improvement over the original one.
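To illustrate the kind of closed-form allocation the abstract refers to, the sketch below solves a simplified version of the problem: splitting a total arrival rate across heterogeneous parallel servers so as to minimize the total mean number of jobs in the system. It assumes plain M/M/1 queues rather than the paper's GE distributional model, and uses the standard Lagrange-multiplier solution under the assumption that every server receives positive load; it is an illustrative sketch, not the paper's actual derivation.

```python
import math

def allocate(total_arrival, service_rates):
    """Split a total arrival rate across parallel M/M/1 servers to
    minimize the total mean number of jobs in the system.

    Illustrative M/M/1 simplification (not the paper's GE model).
    Assumes the optimum assigns positive load to every server.
    """
    slack = sum(service_rates) - total_arrival  # total spare capacity
    if slack <= 0:
        raise ValueError("total arrival rate must be below total service rate")
    sqrt_sum = sum(math.sqrt(mu) for mu in service_rates)
    # Lagrange-multiplier solution of
    #   min sum_i lambda_i / (mu_i - lambda_i)  s.t.  sum_i lambda_i = total:
    #   lambda_i = mu_i - sqrt(mu_i) * slack / sum_j sqrt(mu_j)
    return [mu - math.sqrt(mu) * slack / sqrt_sum for mu in service_rates]

# Two equal servers share the load evenly; a faster server gets more.
print(allocate(3.0, [2.0, 2.0]))  # -> [1.5, 1.5]
print(allocate(3.0, [4.0, 1.0]))
```

The second call shows the square-root rule in action: the faster server absorbs most of the load, but not all of it, because spare capacity is spread in proportion to the square roots of the service rates.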

This work is licensed under a Creative Commons Attribution 4.0 License.