Optimal Round Robin CPU Scheduling Algorithm Using Manhattan Distance

International Journal of Electrical and Computer Engineering


Abstract

In Round Robin scheduling the time quantum is fixed and processes are scheduled such that no process gets more than one time quantum of CPU time in one go. The performance of the Round Robin CPU scheduling algorithm depends entirely on the time quantum selected. If the time quantum is too large, the response time of the processes becomes too long, which may not be tolerated in an interactive environment. If the time quantum is too small, it causes unnecessarily frequent context switches, leading to more overhead and lower throughput. In this paper a method using Manhattan distance is proposed to decide the quantum value. The time quantum is computed from the distance, or difference, between the highest burst time and the lowest burst time. The experimental analysis also shows that this algorithm performs better than the standard RR algorithm by reducing the number of context switches, the average waiting time, and the average turnaround time.
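The abstract does not give the exact formula, so the following is a minimal sketch, assuming the quantum is taken as the one-dimensional Manhattan distance |max burst − min burst| over the ready processes; the round_robin simulation, the fallback when all bursts are equal, the burst set, and the context-switch counting convention are illustrative assumptions, not the paper's implementation.

```python
# Sketch: Round Robin with a time quantum derived from the Manhattan distance
# between the highest and lowest burst times (all processes assumed to arrive at t = 0).
from collections import deque

def manhattan_quantum(burst_times):
    """Quantum = |max burst - min burst|; falls back to the max burst when all
    bursts are equal (hypothetical tie-break, not specified in the abstract)."""
    distance = max(burst_times) - min(burst_times)
    return distance if distance > 0 else max(burst_times)

def round_robin(burst_times):
    """Simulate Round Robin with the distance-based quantum and report the metrics
    discussed in the abstract: context switches, average waiting and turnaround time."""
    n = len(burst_times)
    quantum = manhattan_quantum(burst_times)
    remaining = list(burst_times)
    ready = deque(range(n))
    time = 0
    context_switches = 0
    completion = [0] * n

    while ready:
        i = ready.popleft()
        run = min(quantum, remaining[i])   # run for one quantum or until completion
        time += run
        remaining[i] -= run
        if remaining[i] > 0:
            ready.append(i)                # unfinished process goes to the back of the queue
        else:
            completion[i] = time
        if ready:                          # count a switch only if another process runs next
            context_switches += 1

    turnaround = completion                # arrival time is 0 for every process
    waiting = [completion[i] - burst_times[i] for i in range(n)]
    return {
        "quantum": quantum,
        "context_switches": context_switches,
        "avg_waiting": sum(waiting) / n,
        "avg_turnaround": sum(turnaround) / n,
    }

if __name__ == "__main__":
    print(round_robin([24, 3, 3]))         # classic textbook burst set, for illustration
```

With the illustrative burst set [24, 3, 3], the derived quantum is 21, so the long process yields only once, giving 3 context switches, an average waiting time of 17, and an average turnaround time of 27 under these assumptions.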
