I was running some evaluation experiments on my system, measuring the time taken to insert rule bases of different sizes (numbers of rules) and different numbers of facts into working memory. Note: I'm not using persistence. Drools version: 6.3.0.Final.
Table: rule base size (RB) vs. time to insert facts (TTI), in milliseconds.
Based on the table presented above, I would like to know why the time to insert facts into working memory increases as the rule base size grows.
I'm not an expert in Drools.
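For context, here is a minimal sketch of how such a measurement can be set up. The Specification fact class, the generated single-pattern rules, and the sizes below are simplified stand-ins, not my actual rule bases; the session is built with KieHelper from the Drools 6 API:

import org.kie.api.KieBase;
import org.kie.api.io.ResourceType;
import org.kie.api.runtime.KieSession;
import org.kie.internal.utils.KieHelper;

public class InsertTimingSketch {

    // Stand-in fact class; the real model is more complex.
    public static class Specification {
        private final int id;
        private final int value;
        public Specification(int id, int value) { this.id = id; this.value = value; }
        public int getId() { return id; }
        public int getValue() { return value; }
    }

    // Generate N trivial rules that all reference Specification.
    private static String drl(int ruleCount) {
        StringBuilder sb = new StringBuilder();
        sb.append("package timing\n");
        sb.append("import ").append(Specification.class.getCanonicalName()).append("\n");
        for (int i = 0; i < ruleCount; i++) {
            sb.append("rule \"r").append(i).append("\" when\n");
            sb.append("    Specification( id == ").append(i).append(" )\n");
            sb.append("then\nend\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        for (int rules : new int[]{ 100, 1000, 10000 }) {
            KieBase kieBase = new KieHelper()
                    .addContent(drl(rules), ResourceType.DRL)
                    .build();
            KieSession session = kieBase.newKieSession();

            long start = System.nanoTime();
            for (int i = 0; i < 10000; i++) {
                // insert() propagates the fact through the network immediately
                session.insert(new Specification(i, i));
            }
            long ms = (System.nanoTime() - start) / 1000000;
            System.out.println(rules + " rules: 10000 facts inserted in " + ms + " ms");

            session.dispose();
        }
    }
}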
Fact insertion, retraction, and update result in immediate evaluation of conditions, as far as is possible given the current state of the network. It isn't merely the number of rules that matters; it is the number of references to the fact's type(s).
The increase you observe is less than O(N), with N being the number of rules, which fits the theory nicely.
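As an illustration (a sketch; Order is an invented second fact type): inserting a Specification propagates it only to Specification patterns, so rules that reference other types add essentially nothing to that insertion's cost:

// evaluated when a Specification fact is inserted
rule "references Specification" when
    Specification( value > 10 )
then
end

// not evaluated at that insertion: the fact is never
// propagated to an Order pattern
rule "references Order only" when
    Order( total > 100 )
then
end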
A rule base of 12,000 rules is quite extraordinary (unless you have invented rules just to test scalability). If they are real, and if you are worried about performance, you should revise the rule structure.
Edit (in response to the OP's comment):
The "number of references of a fact's type" is the number of times a certain class (= type) occurs in a pattern. In your examples
rule "x" when
    $spec1 : Specification( )
    $spec2 : Specification( $spec1.id == 2, id == 3, value > $spec1.value )
then
end

rule "x+1" when
    $spec1 : Specification( )
    $spec2 : Specification( $spec1.id == 3, id == 4, value > $spec1.value )
then
end
You have 2 rules but 4 references to Specification. This is bound to create a more complex network in the engine. Also, a constraint that depends exclusively on data from the first pattern ($spec1.id == X) but is written into the second pattern is almost certainly an anti-pattern: the engine must then evaluate it during the join of the two patterns instead of once, when the fact enters the network.
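Restructured, the constraint moves into the pattern whose data it uses; a sketch (assuming the intent of rule "x" is to pair the facts with ids 2 and 3):

rule "x, restructured" when
    $spec1 : Specification( id == 2 )
    $spec2 : Specification( id == 3, value > $spec1.value )
then
end

Here id == 2 is checked once in the alpha network when a fact is inserted, rather than re-evaluated for every candidate pairing of $spec1 and $spec2.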
With poor constructs like these, there is virtually no limit to how much you can slow down any rule-based system (RBS), not just Drools. Apparently you are indeed just testing scalability; sticking to well-written rules would give you more conclusive results.