I am trying to come up with an algorithm that always produces the optimal solution to the following problem, in the best possible running time:
There are n candidates for a job, and k rooms in which they have scheduled interviews at various times of the day. Interviews follow a fixed schedule in each room, with each interview having a specified start time (si), finish time (fi), and interview room (ri). All time units are integers. In addition, we need to schedule pictures of the people currently being interviewed throughout the day. The pictures take effectively no time, but at some point in the day each interviewee must be in a picture. If we schedule a picture at time t, all people currently being interviewed will be in that picture. Taking a picture has no effect on any interview's start or end time. So the problem is this: given an unordered list of interviews, each with variables (si, fi, ri), how do you make sure every candidate is in a picture while taking as few pictures as possible?
So ideally we would take pictures when as many people as possible are present, to minimize the number of pictures taken. My original idea was essentially brute force (a rough sketch of what I had in mind is below), but it would have a really bad big-O runtime. It is very important to minimize the runtime of this algorithm while still returning the fewest possible photographs. That said, if you can think of a fast greedy algorithm that doesn't perfectly solve the problem, I would like to hear that too.
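For concreteness, here is a minimal sketch of that brute-force idea (the function name and the (s, f, r) tuple representation are just my own choices for illustration). It tries ever-larger subsets of candidate picture times until every interviewee is covered, which is exponential in n:

```python
from itertools import combinations

def min_pictures_bruteforce(interviews):
    """interviews: list of (s, f, r) tuples. Returns a minimal list of picture times."""
    # A picture at time t can be slid back to the latest start time <= t
    # without losing anyone (treating intervals as closed [s, f]), so only
    # start times need to be considered as candidate picture times.
    candidates = sorted({s for s, f, r in interviews})
    n = len(interviews)
    for size in range(1, n + 1):
        for times in combinations(candidates, size):
            covered = {i for t in times
                         for i, (s, f, r) in enumerate(interviews)
                         if s <= t <= f}
            if len(covered) == n:
                return list(times)
    return []  # only reached if there are no interviews
```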
I'm sure my description here was far from flawless, so if you would like me to clarify anything, feel free to leave a comment and I'll get back to you.
Start with the following observations:

- If the next event after time si is an arrival j, there is no need to take a picture between si and sj, since everyone available at si is still available at sj.
- Hence it is best to let as many people as possible accumulate (there are at most k of them, one per room) and wait to take a picture until someone is about to leave.

Thus I think the following algorithm should work:
1. Sort all arrival and departure events into a single time-ordered list.
2. Create a boolean array A of size n to keep track of whether each interviewee is available (interview is in progress).
3. Create a boolean array P of size n to keep track of whether each interviewee has been photographed.
4. Loop over the sorted time list (index variable i):

   a. If the arrival of interviewee j is encountered, set A[j] to true.

   b. If the departure of interviewee j is encountered, check P[j] to see if the person leaving has been photographed already. If not, take a picture now and record its effects (for every m with A[m] = true, set P[m] = true). Finally, set A[j] to false.
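Here is a sketch of those steps in Python (the function name and the (s, f, r) tuple input are my own assumptions, matching the question's notation; arrivals are deliberately sorted before departures at the same time, so closed intervals work out):

```python
def schedule_pictures(interviews):
    """interviews: list of (s, f, r) tuples, one per interviewee.
    Returns a list of times at which to take pictures."""
    n = len(interviews)

    # Build (time, kind, interviewee) events; kind 0 = arrival, 1 = departure.
    # Tuple sorting puts an arrival before a departure at the same time, so
    # someone leaving exactly when another arrives is photographed with them.
    events = []
    for j, (s, f, r) in enumerate(interviews):
        events.append((s, 0, j))
        events.append((f, 1, j))
    events.sort()

    available = [False] * n      # A: interview currently in progress
    photographed = [False] * n   # P: already appears in some picture
    pictures = []

    for time, kind, j in events:
        if kind == 0:            # arrival: mark interviewee j as present
            available[j] = True
        else:                    # departure: last chance to photograph j
            if not photographed[j]:
                pictures.append(time)
                for m in range(n):           # O(n) per picture taken
                    if available[m]:
                        photographed[m] = True
            available[j] = False
    return pictures
```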
The sort is O(n log n), the loop has 2n iterations, and checking the arrays is O(1). But since each picture-taking event may require a loop over all of A, the overall runtime is O(n²) in the worst case (which happens when no interviews overlap in time).
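To illustrate both ends of that analysis with the sketch above: a fully overlapping instance needs only one picture, while fully disjoint interviews trigger the worst case of one picture per person:

```python
# Fully overlapping: one picture at the first departure covers everyone.
print(schedule_pictures([(1, 10, 0), (2, 9, 1), (3, 8, 2)]))  # [8]

# Fully disjoint: a picture per departure, the O(n^2) worst case.
print(schedule_pictures([(1, 2, 0), (3, 4, 0), (5, 6, 0)]))   # [2, 4, 6]
```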