I am trying to find an algorithm for the following problem, which was given in my homework:
Assume that you have m jobs and n machines. Each machine i has a nondecreasing latency function l_i : N → R that depends only on the number of jobs assigned to machine i. To illustrate, if l_j(5) = 7, then machine j needs to work 7 units of time if it is assigned (any) five of the jobs. Assume that l_i(0) = 0 for all machines i.
Given a set of m jobs and n machines, where each machine is associated with a nondecreasing latency function as described above, you are asked to give an O(m · lg n) algorithm that assigns each job to a machine such that the makespan (the maximum amount of time any machine executes) is minimized. Needless to say, but just in case: you need to prove that your algorithm is correct.
I am allowed to get some help, so this is not cheating.
I am stuck on where and how to start looking for an algorithm for this problem. Could you help me, please?
O(m · lg n) is a good clue. How can assigning each of the m jobs take O(lg n) time? Apparently the machines should be organized in some data structure that supports the required operation in that time per job. Think about a priority queue based on a binary heap.
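To make the hint concrete, here is a minimal sketch of that greedy in Python, assuming the latency functions are given as a list of callables (the function name min_makespan and this interface are my own choices, not part of the assignment). The heap is keyed on "the latency machine i would have if it received one more job", so each of the m assignments is one pop and one push:

```python
import heapq

def min_makespan(m, latencies):
    """Greedy sketch: repeatedly give the next job to the machine whose
    latency would be smallest after receiving it.

    latencies[i] is assumed to be a callable with latencies[i](k) = l_i(k),
    nondecreasing in k. Returns (makespan, jobs assigned per machine).
    """
    n = len(latencies)
    # Heap entries: (latency of machine i if it gets one more job,
    #                machine index i, jobs currently assigned to i).
    heap = [(latencies[i](1), i, 0) for i in range(n)]
    heapq.heapify(heap)  # O(n)

    makespan = 0
    counts = [0] * n
    for _ in range(m):                    # m assignments ...
        cost, i, k = heapq.heappop(heap)  # ... each O(lg n)
        counts[i] = k + 1
        # l_i is nondecreasing, so the last pop per machine dominates.
        makespan = max(makespan, cost)
        heapq.heappush(heap, (latencies[i](k + 2), i, k + 1))
    return makespan, counts

# Tiny usage example (hypothetical latency functions):
# machine 0 costs 2 per job, machine 1 costs 3 per job.
print(min_makespan(5, [lambda k: 2 * k, lambda k: 3 * k]))  # -> (6, [3, 2])
```

Building the heap is O(n) and each of the m jobs costs O(lg n), so the total is O(n + m · lg n), which matches the required bound whenever m ≥ n. The sketch only shows the data-structure side of the hint; for the correctness proof, an exchange argument over the final assignment is one natural direction.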