Tags: tensorflow, jupyter-notebook, pytorch, google-colaboratory

Avoiding reloading weights/datasets in ML edit-compile-run loop


In machine learning, the edit-compile-run loop is slow because your script has to reload large models and datasets on every run.

In the past, I've worked around this by loading only a tiny subset of the data and skipping pre-trained weights while setting up the training code.
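A minimal sketch of that workaround, assuming a PyTorch/torchvision setup; the dataset (MNIST), the subset size, and the model are illustrative, not from the original post:

```python
import torch
from torch.utils.data import Subset
from torchvision import datasets, transforms

full_train = datasets.MNIST(
    root="data", train=True, download=True,
    transform=transforms.ToTensor(),
)

# Debug against the first 256 examples instead of all 60k.
tiny_train = Subset(full_train, range(256))

# Randomly initialized model instead of pre-trained weights
# while iterating on the training code.
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(28 * 28, 10),
)
```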


Solution

  • Use a Jupyter notebook or Google Colab.

    You can edit and re-run one cell at a time, while the dataset and trained weights loaded in other cells stay alive in the kernel, so nothing is reloaded (see the sketch after this list).

    Somehow this didn't click until just now.
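As a sketch of how the cell split might look, again assuming PyTorch/torchvision (the dataset, model, and hyperparameters are illustrative):

```python
# --- Cell 1: run once (slow) --------------------------------------------
# Loads the large dataset and pre-trained weights; these objects stay
# alive in the kernel for as long as the notebook session runs.
import torch
from torchvision import datasets, models, transforms

train_data = datasets.CIFAR10(
    root="data", train=True, download=True,
    transform=transforms.ToTensor(),
)
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# --- Cell 2: edit and re-run freely (fast) ------------------------------
# The training code you are iterating on; re-running this cell reuses the
# train_data and model objects from Cell 1 without reloading anything.
loader = torch.utils.data.DataLoader(train_data, batch_size=64, shuffle=True)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    break  # one step is enough to check that the code runs
```

Editing and re-running only Cell 2 turns each iteration from minutes of loading into seconds, since the kernel keeps Cell 1's objects in memory.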