I'm trying to count occurrences within an array that are constrained first by having the same value in the array and then by the number of different values found in an associated array.
I am starting with this model
set of int: OPTS = 1..4;
set of int: ASSIGN = 1..12;
array[ASSIGN] of var OPTS: result;
that can generate a result [ 1,2,1,3,3,4,1,3,1,1,2,4 ]
and would like to combine with these statements
enum CONDS = {A,B,C,D};
array[ASSIGN] of CONDS : const = [A,A,A,B,B,B,C,C,C,D,D,D];
array[OPTS] of var 0..max(CONDS): result2;
output ["\(result) \(result2)"]
and produce an output that counts how many different CONDS
appear for each OPTS
[ 1,2,1,3,3,4,1,3,1,1,2,4 ] [3,2,2,2]
implying that OPTS[1] is associated with at least one of A, C and D (it occurs at positions 1, 3, 7, 9 and 10, whose CONDS are A, A, C, C and D, i.e. 3 distinct values), OPTS[2] has As and Ds, OPTS[3] has Bs and Cs, and OPTS[4] has Bs and Ds. I've scrambled my brains a bit with this and earlier questions and can't determine whether to look at creating a less complex constraint, or a more complex one. I could have missed a simple trick here...or maybe it's not so obvious?
I would love to know why my attempt with the following produces unsatisfiable when X = j, [0,0,3,0] when X = 1, and [0,0,0,0] when X = i (which would make the least sense to me, but clearly I don't know how this position 'X' is supposed to work with the rest).
constraint forall(i in OPTS, j in CONDS)(
result2[i] =
count([ k == j /\ result[k] = i | k in const],X)
);
Here's a model that solves your original question. The construct card({const[i] | i in ASSIGN where result[i]=o}) counts the number of different CONDS values for each OPTS value o: the set comprehension {....} collects the distinct values, and card (the cardinality of the set) does the counting.
include "globals.mzn";
set of int: OPTS = 1..4;
set of int: ASSIGN = 1..12;
array[ASSIGN] of var OPTS: result = [1,2,1,3,3,4,1,3,1,1,2,4];
enum CONDS = {A,B,C,D};
array[ASSIGN] of CONDS : const = [A,A,A,B,B,B,C,C,C,D,D,D];
array[OPTS] of var 0..max(ASSIGN): result2; % safe upper bound; card(CONDS) would be tighter
solve satisfy;
constraint
forall(o in OPTS) (
% count the number of different CONDS for this OPTS value o
result2[o] = card({const[i] | i in ASSIGN where result[i]=o} )
)
;
output [
"result:\(result)\n",
"result2:\(result2)\n",
];
The solution is
result:[1, 2, 1, 3, 3, 4, 1, 3, 1, 1, 2, 4]
result2:[3, 2, 2, 2]
Update: Answer to the additional question
You asked why this constraint doesn't give the proper answer:
constraint forall(i in OPTS, j in CONDS)(
result2[i] =
count([ k == j /\ result[k] = i | k in const],X)
);
Here are some problems with this approach.
First I should say that this kind of loop might in general be a good approach, but the problem statement requires that we count the number of different chunks (where each chunk contains many values), not the number of all occurrences.
The k in const loops through all the values A,A,A,B,B,... (not the indices), which means that result[k] is interpreted as result[A] ... result[D], which is then translated to result[1] ... result[4] (A=1, B=2, C=3, D=4); this is not what you want. Instead, k should loop through ASSIGN, and instead of the plain k you should then use const[k].
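As a minimal sketch of just that fix (written here with sum over the boolean comprehension, which sidesteps the X argument entirely; the other problems below still apply), the expression would look something like:
% loop over the indices in ASSIGN and look up the condition via const[k];
% note: this still counts every matching position, not the distinct CONDS
sum([ const[k] = j /\ result[k] = i | k in ASSIGN ])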
Another problem is that result2[i] is evaluated for every j in CONDS, i.e. it does "double counting", which almost always means trouble. (Note also that in constraint programming values are never updated. If result2[i] gets one value in one loop iteration and another value in another, then the model is UNSAT.)
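A tiny self-contained illustration of that point (x is a made-up variable, only for this example):
var 1..10: x;
constraint x = 2; % the "first" value
constraint x = 3; % conflicts with the constraint above, so the model is UNSAT
solve satisfy;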
Instead a nested loop should be used:
constraint
forall(i in OPTS) (
forall(j in CONDS) (
% ...
)
);
count (or sum) means that you count all the occurrences of the CONDS, e.g. for the value 1 this counts two A's (from the first chunk), which is not correct. To fix this one has to count just one occurrence of each CONDS. In the model above I use the set approach. Here's another way, using exists, which counts just one of each matching CONDS value.
constraint
forall(o in OPTS) (
result2[o] = sum(c in CONDS) (
exists(k in ASSIGN) ( const[k] = c /\ result[k] = o)
)
)
;
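For the same fixed result this gives the same counts as the set/card model above, i.e. result2:[3, 2, 2, 2], since exists contributes at most 1 per CONDS value no matter how many matching positions there are.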