I'm currently experimenting with distributed TensorFlow, and I was wondering whether it is necessary to include a parameter server.
The method I am using is tf.estimator.train_and_evaluate. My setup is one master, one worker, and one parameter server, each running on its own server.
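For reference, this is roughly the TF_CONFIG each process is started with. The hostnames and ports below are placeholders, and only the "task" entry differs between the three processes:

```python
import json
import os

# Sketch of the TF_CONFIG set before calling tf.estimator.train_and_evaluate.
# Hostnames/ports are placeholders; only "task" changes per process.
os.environ["TF_CONFIG"] = json.dumps({
    "cluster": {
        "master": ["master-host:2222"],
        "worker": ["worker-host:2222"],
        "ps": ["ps-host:2222"],
    },
    # On the worker this would be {"type": "worker", "index": 0},
    # and on the parameter server {"type": "ps", "index": 0}.
    "task": {"type": "master", "index": 0},
})
```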
It seems that the parameter server is just listening to the other two servers and not doing anything else. Based on mrry's answer to Tensorflow: Using Parameter Servers in Distributed Training, I tried running with only one master and one worker and was still able to get results.
Does tf.estimator.train_and_evaluate need a parameter server?
After running multiple tests: yes, you do need a parameter server.
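To illustrate what each task ends up doing, here is a minimal sketch of the driver script that every task (master, worker, and ps) runs with its own TF_CONFIG; the model, input pipeline, and model_dir are placeholders. When the task type is "ps", train_and_evaluate just starts a server and blocks, which matches the "just listening" behaviour described above, while the master and worker run the actual training loop against it.

```python
import tensorflow as tf

def input_fn():
    # Toy in-memory data standing in for a real input pipeline.
    features = {"x": tf.random_uniform([32, 1])}
    labels = tf.cast(tf.random_uniform([32]) > 0.5, tf.int32)
    return tf.data.Dataset.from_tensors((features, labels)).repeat()

# Placeholder model; any Estimator behaves the same way here.
estimator = tf.estimator.LinearClassifier(
    feature_columns=[tf.feature_column.numeric_column("x")],
    model_dir="/tmp/shared_model_dir",  # must be reachable by all tasks
)

train_spec = tf.estimator.TrainSpec(input_fn=input_fn, max_steps=1000)
eval_spec = tf.estimator.EvalSpec(input_fn=input_fn)

# Every task calls this. A ps task starts a tf.train.Server and blocks,
# serving variables; the master/worker tasks run training against it.
tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)
```

So even though the ps process never appears to "do" anything in its logs, the other tasks place and fetch their variables on it, and removing it from the cluster spec changes how training is coordinated.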