python, tensorflow, machine-learning, keras, wandb

How to prevent Weights & Biases from saving unnecessary parameters


I am using Weights & Biases (link) to manage hyperparameter optimization and log the results. I am training with Keras on a TensorFlow backend, and I am using the out-of-the-box logging functionality of Weights & Biases: I run

wandb.init(project='project_name', entity='username', config=config)

and then add a WandbCallback() to the callbacks of classifier.fit(). By default, Weights & Biases appears to save the model parameters (i.e., the model's weights and biases) and store them in the cloud. This eats up my account's storage quota, and it is unnecessary: I only care about tracking the model loss/accuracy as a function of the hyperparameters.

Is it possible for me to train a model and log the loss and accuracy using Weights & Biases, but not store the model parameters in the cloud? How can I do this?
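For reference, a stripped-down, self-contained version of what I am doing looks roughly like this (the toy data, model, and hyperparameter values are placeholders, not my real pipeline):

import numpy as np
import wandb
from tensorflow import keras
from wandb.keras import WandbCallback

# Placeholder hyperparameters tracked by the run.
config = {"learning_rate": 1e-3, "epochs": 3, "batch_size": 32}
wandb.init(project='project_name', entity='username', config=config)

# Toy binary-classification data so the snippet runs on its own.
x_train = np.random.rand(256, 20).astype("float32")
y_train = np.random.randint(0, 2, size=(256,)).astype("float32")

classifier = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(20,)),
    keras.layers.Dense(1, activation="sigmoid"),
])
classifier.compile(optimizer=keras.optimizers.Adam(config["learning_rate"]),
                   loss="binary_crossentropy",
                   metrics=["accuracy"])

# With default arguments, WandbCallback() also saves the best model to the run.
classifier.fit(x_train, y_train,
               validation_split=0.2,
               epochs=config["epochs"],
               batch_size=config["batch_size"],
               callbacks=[WandbCallback()])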


Solution

  • To avoid saving the trained model weights during hyperparameter optimization, pass save_model=False to WandbCallback:

    classifier.fit(..., callbacks=[WandbCallback(save_model=False)])
    

    This will log only the metrics (training/validation loss, accuracy, etc.) and will not upload the model weights to the cloud; a complete sketch follows below.
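
    For completeness, here is a minimal end-to-end version of the fix, reusing the placeholder names (classifier, x_train, y_train, config) from the sketch in the question; only the callback argument changes:

    import wandb
    from wandb.keras import WandbCallback

    # Same training call as in the sketch above, but with save_model=False the
    # callback no longer uploads model weights; per-epoch loss/accuracy and
    # val_loss/val_accuracy are still logged to the run.
    classifier.fit(x_train, y_train,
                   validation_split=0.2,
                   epochs=config["epochs"],
                   batch_size=config["batch_size"],
                   callbacks=[WandbCallback(save_model=False)])

    # Mark the run as finished (useful in notebooks or long-lived scripts).
    wandb.finish()

    The run page will still show the per-epoch training and validation curves; only the model upload is skipped.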