As discussed previously, and as shown in the following diagram, with asynchronous updates each worker task sends its parameter updates to the parameter server as soon as they are ready, and the parameter server applies the updates and sends back the latest parameters. There is no synchronization, waiting, or aggregation of updates across workers:
For asynchronous updates, the graph is created and trained with the following steps:
- The graph is defined within the with block, as in the sketch that follows this step:
with tf.device(device_func): ...
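For reference, here is a minimal sketch of what this step might look like. The cluster addresses, the task index, and the linear model are illustrative assumptions, not taken from the text; the only fixed idea is that `device_func` is typically produced by `tf.train.replica_device_setter`, which places variables on the ps tasks and operations on the current worker.

```
import tensorflow as tf

# Hypothetical cluster: two parameter servers and one worker (assumption).
cluster = tf.train.ClusterSpec({
    'ps': ['localhost:9001', 'localhost:9002'],
    'worker': ['localhost:9003'],
})
task_index = 0  # index of this worker task (assumption)

# replica_device_setter places variables on the ps tasks (round-robin)
# and all other operations on this worker's device.
device_func = tf.train.replica_device_setter(
    worker_device='/job:worker/task:{}'.format(task_index),
    cluster=cluster)

with tf.device(device_func):
    # Illustrative linear model; any graph definition goes here.
    x = tf.placeholder(tf.float32, shape=[None, 1], name='x')
    y = tf.placeholder(tf.float32, shape=[None, 1], name='y')
    w = tf.Variable(tf.zeros([1, 1]), name='w')
    b = tf.Variable(tf.zeros([1]), name='b')
    y_hat = tf.matmul(x, w) + b
    loss = tf.reduce_mean(tf.square(y_hat - y))
    global_step = tf.train.get_or_create_global_step()
    # For asynchronous updates, each worker applies its gradients
    # directly; there is no synchronous wrapper around the optimizer.
    train_op = tf.train.GradientDescentOptimizer(0.01).minimize(
        loss, global_step=global_step)
```

Because each worker runs `train_op` independently, updates reach the parameter server in whatever order the workers finish, which is exactly the asynchronous behavior described above.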