Here is an example script that reproduces the issue:
require 'mechanize'

agent = Mechanize.new
agent.history.max_size = 0

5000.times do |i|
  agent.get('http://www.yahoo.com')
  agent.history.clear
  # Print the resident memory usage of this Ruby process in bytes
  p `ps -o rss -p #{$$}`.strip.split.last.to_i * 1024
end
I'm setting agent.history.max_size to 0 and also calling agent.history.clear, but memory usage still increases with each iteration.
Here is the output showing the increasing memory usage (it starts at about 48 MB and grows by 1-2 MB per iteration):
48603136
50274304
51470336
53260288
54984704
55836672
56799232
57884672
59150336
60358656
61349888
62193664
...
How do I get Mechanize to stop leaking memory?
That's not a memory leak; those objects just haven't been garbage-collected yet. Put:
GC.start
in the loop if you feel you need the memory reclaimed sooner; otherwise it's probably safe to ignore.
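As a sketch, here is the loop from the question with an explicit GC pass added at the end of each iteration (the URL and iteration count are simply carried over from the example above):

require 'mechanize'

agent = Mechanize.new
agent.history.max_size = 0

5000.times do |i|
  agent.get('http://www.yahoo.com')
  agent.history.clear
  # Force a garbage-collection pass so unreferenced pages are reclaimed immediately
  GC.start
  # Print the resident memory usage of this Ruby process in bytes
  p `ps -o rss -p #{$$}`.strip.split.last.to_i * 1024
end

If the growth really is just uncollected garbage, the printed RSS should level off after a few iterations rather than climb steadily.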