python · tensorflow · keras · distributed-training

Is there a way to train an ML model on multiple laptops?


I have two laptops and want to use both for DL model training. I don't have any experience with distributed systems and want to know whether it is possible to combine the processing power of two laptops to train a single model. What about tf.distribute.experimental.ParameterServerStrategy? Will it be of any use?


Solution

  • Yes, you can use multiple machines to train your model. You need to set up the cluster and worker configuration on both machines via the TF_CONFIG environment variable, like below.

    import json
    import os

    # Both laptops run the same script; only the task 'index' differs.
    # On two real machines, replace 'localhost' with each laptop's IP address.
    tf_config = {
        'cluster': {
            'worker': ['localhost:12345', 'localhost:23456']
        },
        'task': {'type': 'worker', 'index': 0}  # use 'index': 1 on the second laptop
    }
    os.environ['TF_CONFIG'] = json.dumps(tf_config)
    

    This tutorial from TensorFlow on Multi-worker training with Keras walks through all the details of configuring and training your model.
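
    For two laptops, that tutorial uses tf.distribute.MultiWorkerMirroredStrategy rather than ParameterServerStrategy. As a minimal sketch (the model and data below are placeholders, not from the tutorial), the same script runs on each laptop with its own TF_CONFIG:

    import numpy as np
    import tensorflow as tf

    # The strategy reads TF_CONFIG to discover the other worker and
    # all-reduces gradients across the two machines.
    strategy = tf.distribute.MultiWorkerMirroredStrategy()

    # Model variables must be created inside the strategy scope so they
    # are mirrored on every worker.
    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer='adam', loss='mse')

    # Placeholder data; replace with your real dataset.
    x = np.random.rand(256, 10).astype('float32')
    y = np.random.rand(256, 1).astype('float32')

    # Launch this same script on both laptops; training steps stay in sync.
    model.fit(x, y, epochs=2, batch_size=32)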

    Hope this answers your question.