Before starting to implement a solution to my problem, I want to make sure I am not reinventing the wheel and that I can reuse work someone has done before. My problem is:
I have built an image matcher using the OpenCV library. The matcher receives a set of image files and tries to find similar images in a database. At the end it returns statistical results, as used in ROC curve analysis: the number of True Positive, True Negative, False Positive, and False Negative matches. These results vary with the values of the OpenCV algorithm's parameters, of which there are about 10; adjusting the parameters should lead to more True Positive matches and fewer False Positive matches. Since I have to tune roughly 10 parameters, a brute-force adjuster would be very slow. By brute force I mean:
    while (param1 < 50) {
        param2 = 0;
        while (param2 < 50) {
            param3 = 0;
            while (param3 < 50) {
                // … (nested loops for the remaining parameters)
                PerformMatching();
                param3 += 2;
            }
            param2++;
        }
        param1 += 5;
    }
What I want to do is choose parameters randomly and then analyze whether the statistical results improve. This analysis would then feed back into the method that randomly generates the parameters, so that it picks better parameter sets over time.
So my question is: is there a Java library, or any AI algorithm, that will return a better parameter set based on an evaluation of the True Positive and False Positive values?
I would appreciate some help, thanks.
You could try a stochastic optimization algorithm such as Hill Climbing, in which you start with a random solution (as @Don Reba pointed out) and examine the set of neighboring solutions for ones that are better with respect to the cost function. I will use some sample Python code to explain the idea.
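In your case, the cost function would be built from the matcher's own statistics, e.g., penalizing False Positives and rewarding True Positives. A minimal sketch (the weighting and the run_matcher helper are hypothetical stand-ins for your own evaluation code):

    # Hypothetical cost for the image matcher: run_matcher() is assumed
    # to perform the matching with the given parameters and return the
    # (true_positive, false_positive) counts.
    def matcher_cost(params):
        tp, fp = run_matcher(params)
        return fp - tp  # lower is better; weight the two terms as needed

Hill climbing then simply tries to minimize this value.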
For neighbors in your case, you can use a simple function like:
    import copy

    n_params = 5     # number of parameters
    upper_bound = 5  # upper limit of each parameter
    lower_bound = 0  # lower limit of each parameter

    def get_neighbors(solution):
        neighbors = []
        for i in range(n_params):
            x = copy.deepcopy(solution)
            if x[i] < upper_bound:
                x[i] += 1  # increment one of the components
                neighbors.append(x)
            x = copy.deepcopy(solution)
            if x[i] > lower_bound:
                x[i] -= 1  # decrement one of the components
                neighbors.append(x)
        return neighbors
If your current solution is [1, 3, 4, 2, 2], incrementing or decrementing any one of its components gives you the following 10 neighbors:
[2, 3, 4, 2, 2],
[0, 3, 4, 2, 2],
[1, 4, 4, 2, 2],
[1, 2, 4, 2, 2],
[1, 3, 5, 2, 2],
[1, 3, 3, 2, 2],
[1, 3, 4, 3, 2],
[1, 3, 4, 1, 2],
[1, 3, 4, 2, 3],
[1, 3, 4, 2, 1]
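This list is exactly what the function above produces; you can verify it with:

    for neighbor in get_neighbors([1, 3, 4, 2, 2]):
        print(neighbor)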
Here we assume every parameter is an integer. You could achieve finer granularity by shrinking the step size (e.g., to 0.05).
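One way to do that (a sketch, not part of the original code, assuming solutions are lists of floats and the same n_params, bounds, and copy import as above) is to make the step size a parameter of the neighbor function:

    # Variant of get_neighbors with a configurable, possibly fractional,
    # step size; reuses the n_params and bounds defined above.
    def get_neighbors_step(solution, step=0.05):
        neighbors = []
        for i in range(n_params):
            x = copy.deepcopy(solution)
            if x[i] + step <= upper_bound:
                x[i] += step
                neighbors.append(x)
            x = copy.deepcopy(solution)
            if x[i] - step >= lower_bound:
                x[i] -= step
                neighbors.append(x)
        return neighbors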
    import numpy as np

    def hill_climb():
        initial_solution = np.random.randint(lower_bound, upper_bound, n_params)
        current_solution = initial_solution
        print('initial solution:', initial_solution)
        current_cost = get_cost(initial_solution)
        step = 1
        while True:
            # try to replace each single component with one of its neighbors
            lowest_cost = current_cost
            lowest_solution = current_solution
            print('hill-climbing cost at step %6d: %d' % (step, lowest_cost))
            neighbors = get_neighbors(current_solution)
            for new_solution in neighbors:
                neighbor_cost = get_cost(new_solution)
                if neighbor_cost < lowest_cost:
                    lowest_cost = neighbor_cost
                    lowest_solution = new_solution
            if lowest_cost >= current_cost:
                break  # no neighbor improves on the current solution
            else:
                current_solution = lowest_solution
                current_cost = lowest_cost
                step += 1
        return current_solution
For completeness, I will use my own cost function (just for demo purposes), which is
f(x) = x1^1 - x2^2 + x3^3 - x4^4 + x5^5
That is:

    def get_cost(solution):
        cost = 0
        for i, param in enumerate(solution):
            cost += (-1.) ** i * param ** (i + 1)  # alternating signs, increasing powers
        return cost
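Putting it all together (a usage sketch; hill_climb picks its own random starting point):

    best = hill_climb()
    print('Final solution:', best)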
Here is the result. We use a random initial guess of [4, 0, 1, 3, 1]. After 14 steps (an evaluation of 14 * 10 = 140 neighbors), we find the optimal answer of [0, 5, 0, 5, 0], which minimizes the cost. With brute force you would have to evaluate 6^5 = 7776 solutions. Even more running time is saved when the solution is high-dimensional.
Note that since this is a stochastic local-search method, the final result is a local minimum (it may happen to coincide with the global minimum, but that is not guaranteed). In practice, however, it is usually good enough.
initial solution: [4 0 1 3 1]
hill-climbing cost at step 1: -75
hill-climbing cost at step 2: -250
hill-climbing cost at step 3: -619
hill-climbing cost at step 4: -620
hill-climbing cost at step 5: -621
hill-climbing cost at step 6: -622
hill-climbing cost at step 7: -623
hill-climbing cost at step 8: -624
hill-climbing cost at step 9: -627
hill-climbing cost at step 10: -632
hill-climbing cost at step 11: -639
hill-climbing cost at step 12: -648
hill-climbing cost at step 13: -649
hill-climbing cost at step 14: -650
Final solution: [0 5 0 5 0]
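If getting stuck in a poor local minimum is a concern, a common remedy (just a sketch built on hill_climb above, not part of the original code) is random-restart hill climbing: run several independent climbs from different random starting points and keep the best result:

    # Random-restart hill climbing: repeat the climb from several random
    # initial solutions and return the best one found.
    def hill_climb_with_restarts(n_restarts=10):
        best_solution, best_cost = None, float('inf')
        for _ in range(n_restarts):
            solution = hill_climb()
            cost = get_cost(solution)
            if cost < best_cost:
                best_solution, best_cost = solution, cost
        return best_solution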
A related but more complex question is here: Algorithm for selecting n vectors out of a set while minimizing cost
All of the code above can be found here.