We have n students. For the i-th student we know their math skill (m_i) and coding skill (c_i). We want each student to have a teacher. However, a teacher can teach a student only if the teacher's skills in both math and coding are equal to or better than the student's. Each teacher can teach any number of students.
A teacher with math skill M and coding skill C is paid a salary of M * C. The problem is to minimize the total salary paid to all teachers.
Example:
4
1 6
4 2
2 2
2 5
Output: 20.
Because we can hire teachers with skills (2, 6) and (4, 2), paying 2*6 + 4*2 = 20. The (2, 6) teacher covers students (1, 6), (2, 2), and (2, 5); the (4, 2) teacher covers student (4, 2).
The easiest approach I found to work is to brute-force all possible values of c and m (within some limits) and minimize c * m. But this problem is listed under the dynamic programming section, so can anyone give me an idea of how to solve it more efficiently?
Step 1, clean the data:
Remove every dominated student, i.e. every student for whom some other student has equal or better skills in both math and coding (any teacher who can teach the dominating student can also teach them). Then sort the remaining students by math skill in decreasing order.
Theorem 1: after this cleaning, as math skills strictly decrease along the sorted list, coding skills strictly increase; otherwise a later student would be dominated by an earlier one and would have been removed.
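The cleaning step can be sketched like this (a minimal sketch; the function name `clean` and the tie-breaking rule are my own choices):

```python
def clean(students):
    """Drop dominated students; return the rest sorted by math skill descending.

    A student is dominated when some other student has equal or better math
    AND coding skill, so any teacher covering that other student covers them too.
    """
    # Sort by math descending, ties by coding descending, so no student is
    # preceded by one it dominates.
    students = sorted(students, key=lambda mc: (-mc[0], -mc[1]))
    kept = []
    best_c = -1  # highest coding skill among students with >= math skill so far
    for m, c in students:
        if c > best_c:  # not dominated by any earlier (higher-math) student
            kept.append((m, c))
            best_c = c
    return kept
```

On the example input, `clean([(1, 6), (4, 2), (2, 2), (2, 5)])` returns `[(4, 2), (2, 5), (1, 6)]`: math skills strictly decrease while coding skills strictly increase, as Theorem 1 states.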
Step 2, calculate:
f(i) = the minimal salary for teaching student 1..i
Before that, we define:
s(i,j) = the salary of employing only one teacher to teach students i..j (i <= j)
s(i,j) = max(c_(i..j)) * max(m_(i..j))
s(i,j) = c_j * m_i (by Theorem 1: m is maximal at position i and c is maximal at position j)
So we can get:
f(i) = min(f(j) + s(j+1, i)) over 0 <= j < i, with the base case f(0) = 0
With a straightforward implementation, we can calculate f(n) (the answer) in O(n^2).
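Putting both steps together, the whole solution can be sketched as follows (a sketch under the definitions above; `min_total_salary` is a name I made up):

```python
def min_total_salary(students):
    """O(n^2) DP over the cleaned student list; students is a list of (m, c)."""
    # Step 1: clean -- sort by math descending, drop dominated students.
    students = sorted(students, key=lambda mc: (-mc[0], -mc[1]))
    pts = []
    best_c = -1
    for m, c in students:
        if c > best_c:
            pts.append((m, c))
            best_c = c
    # Along pts, m strictly decreases and c strictly increases, so one
    # teacher covering pts[j..i] costs m_j * c_i (Theorem 1).
    n = len(pts)
    INF = float('inf')
    f = [0] + [INF] * n  # f[i] = minimal salary to teach pts[0..i-1]
    for i in range(1, n + 1):
        for j in range(i):  # the last teacher covers pts[j..i-1]
            f[i] = min(f[i], f[j] + pts[j][0] * pts[i - 1][1])
    return f[n]
```

On the example, `min_total_salary([(1, 6), (4, 2), (2, 2), (2, 5)])` returns 20: the cleaned list is [(4, 2), (2, 5), (1, 6)] and the best split pays 4*2 + 2*6 = 20, matching the (4, 2) and (2, 6) teachers above.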
Feel free to ask any questions here and I will reply as soon as possible.