Abstract: We consider the problem of efficiently scheduling jobs on data centers to minimize the cost of renting machines from "the cloud." In the most basic cloud service model, cloud providers offer computers on demand from large pools installed in data centers, and clients pay for use at an hourly rate. In order to minimize cost, each client needs to decide on the number of machines to rent and the duration for which each machine is rented. This suggests the following optimization problem, which we call Rent Minimization. There is a set J = {j_1, j_2, ..., j_n} of n jobs. Job j_i is released at time r_i >= 0, has a deadline d_i, and requires p_i > 0 units of contiguous processing time, where r_i, d_i, and p_i are real numbers. The jobs need to be scheduled on identical parallel machines. Machines may be rented for any length of time; however, the cost of renting a machine for l >= 0 time units is ⌈l/D⌉ (the smallest integer >= l/D) dollars, for some given large real D; in particular, one pays $2 whether the machine is rented for D+1 or 2D time units. The goal is to schedule all the jobs in a way that minimizes the incurred rental cost. In this paper, we develop offline and online algorithms for the Rent Minimization problem. The algorithms achieve a constant-factor approximation for the offline version and an O(log(p_max/p_min))-approximation for the online version, where p_max and p_min are the maximum and minimum processing times of the jobs, respectively. We also show that no deterministic online algorithm can achieve an approximation factor better than log_3(p_max/p_min) within a constant factor. Both of these algorithms use the well-studied Machine Minimization problem as a subroutine. Machine Minimization is the special case of Rent Minimization where D = max_i d_i. In the process of solving Rent Minimization, we also develop the first online algorithm for Machine Minimization.
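A minimal sketch of the rental cost model described above, illustrating that cost is charged in blocks of length D; the function name rental_cost and the value D = 60 are assumptions for illustration only:

```python
import math


def rental_cost(l: float, D: float) -> int:
    """Dollars charged for renting one machine for l >= 0 time units.

    The abstract's cost model charges the ceiling of l/D, i.e. renting is
    billed in indivisible blocks of D time units each.
    """
    return math.ceil(l / D)


# With D = 60, renting for D + 1 = 61 or for 2D = 120 time units both
# cost $2, matching the example given in the abstract.
print(rental_cost(61, 60))   # 2
print(rental_cost(120, 60))  # 2
print(rental_cost(60, 60))   # 1 (exactly one block)
```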