The issue I am having is that I want to measure both the memory consumption and the runtime of a line in an IPython cell on Google Colab, where that line also returns an object.
I am doing this with memory-profiler, so I can automatically store all benchmark results in a file.
I found out that %memit -o gives me a MemitResult object, but the assignment inside the magic leaves model undefined:
NameError: name 'model' is not defined
A short MWE:
%%time
import pmdarima as pm
trainMemory = %memit -o model= pm.auto_arima(df["y"], seasonal=True, m=12, maxiter=10, njobs = -1)
Ideally, I would assign all three variables (time, memory, and the fitted model) in a single training call, so the model is only trained once.
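One workaround that avoids the magics entirely is to capture timing and peak memory around an ordinary function call. A minimal sketch using only the standard library; note that tracemalloc only tracks Python-level allocations (not native/GPU memory), and the lambda below is just a stand-in for the real pm.auto_arima(...) call:

```python
import time
import tracemalloc

def benchmark(train_fn, *args, **kwargs):
    """Return (result, elapsed_seconds, peak_mib) for one call of train_fn."""
    tracemalloc.start()
    start = time.perf_counter()
    result = train_fn(*args, **kwargs)         # e.g. pm.auto_arima(df["y"], ...)
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()  # peak bytes since start()
    tracemalloc.stop()
    return result, elapsed, peak / 2**20

# usage with a dummy workload so the sketch is self-contained:
model, seconds, peak_mib = benchmark(lambda: list(range(100_000)))
print(f"trained in {seconds:.3f}s, peak ≈ {peak_mib:.1f} MiB")
```

memory-profiler's own memory_usage function offers something similar (it can return both the measured memory and the function's return value), but the stdlib version above has no extra dependency.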
In the end, I used Weights & Biases for tracking GPU and memory consumption at the same time.
The setup was really easy; however, at first I hadn't found how to extract information like the maximum, minimum, etc.
Setup:
import wandb
import pmdarima as pm
wandb.init()
model= pm.auto_arima(df["y"], seasonal=True, m=12, maxiter=10, njobs = -1)
Then, on the website, you can see the run's consumption over time.
It turns out you can also get the maximum usage via the API instead of reading it off the charts:
import wandb

api = wandb.Api()
# path format: entity/project/run_id
run = api.run(f"YOURNAME/{wandb.run.project}/{wandb.run.id}")
system_metrics = run.history(stream="events")
print(system_metrics["system.gpu.0.memory"].max())
system_metrics.to_csv("metrics.csv")
You can then access the properties of the system_metrics object directly, since run.history returns a pandas DataFrame.
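Because the history is a regular pandas DataFrame, min/max/mean for all system metrics can be pulled out in one go. A sketch with a stand-in DataFrame (the column names mimic wandb's system metrics; substitute the real system_metrics from run.history):

```python
import pandas as pd

# stand-in for run.history(stream="events"); wandb column names look like this
system_metrics = pd.DataFrame({
    "system.gpu.0.memory": [10.0, 55.0, 42.0],
    "system.proc.memory.rssMB": [900.0, 1500.0, 1200.0],
})

# one summary table instead of calling .max() per column
summary = system_metrics.agg(["min", "max", "mean"])
print(summary)
print(summary.loc["max", "system.gpu.0.memory"])  # peak GPU memory, here 55.0
```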