Tags: tensorflow, distributed-computing

Tensorflow: is there a rule to set the port of worker/ps when creating ClusterSpec?


When creating a ClusterSpec in a distributed setting, we need to assign a hostname:port pair to each worker/ps task. Is there a rule for choosing the port, or is it free to be any number?


Solution

  • You can set it to any available TCP port on the host where you start the tf.train.Server.

    I usually use port 2222 because it's easy to type (and that number has ended up in a lot of the documentation that I originally wrote).
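    Since any free TCP port works, one common trick (a sketch using only the Python standard library, not part of the TensorFlow API) is to ask the OS for an unused port by binding to port 0, then use that port when building the cluster addresses:

    ```python
    import socket

    def pick_free_port():
        # Binding to port 0 asks the OS to assign any currently free TCP
        # port; we read the chosen port back and release the socket so a
        # tf.train.Server can bind it shortly afterwards.
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind(("", 0))
            return s.getsockname()[1]

    port = pick_free_port()
    # The address for this task in the ClusterSpec would then be e.g.
    # "localhost:%d" % port, alongside the other workers' host:port pairs.
    print(port)
    ```

    Note there is a small race window between releasing the socket and the server binding the port, so fixed, pre-agreed ports (like 2222) are the simpler choice when you control the hosts.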