November 2019
304 pages
The idea behind the encoding threshold is that parameter updates are still exchanged across the cluster, but only for values whose magnitude exceeds a user-defined limit (the threshold). Parameter updates are the changes to gradient values computed during training. Setting the threshold too high means too few updates are communicated, which can slow convergence; setting it too low means too many updates are communicated, which increases network traffic. So, it is reasonable to come up with a range of acceptable values for the encoding threshold. The fraction of parameter updates that actually get communicated across the cluster is referred to as the sparsity ratio.
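To make the mechanism concrete, here is a minimal, illustrative sketch of threshold encoding in plain Python (not the actual library implementation): updates whose magnitude meets the threshold are quantized to ±threshold and communicated, while the remainder stays behind locally as a residual for the next round. The function name and residual-accumulation scheme are assumptions for illustration only.

```python
def threshold_encode(updates, threshold):
    """Sparsify parameter updates for communication.

    Entries with |update| >= threshold are encoded as +/-threshold;
    what is not sent stays in `residual` for the next training step.
    """
    encoded = {}    # index -> +threshold or -threshold (values to send)
    residual = []   # leftover update values kept locally
    for i, u in enumerate(updates):
        if u >= threshold:
            encoded[i] = threshold
            residual.append(u - threshold)
        elif u <= -threshold:
            encoded[i] = -threshold
            residual.append(u + threshold)
        else:
            # Below the threshold: nothing is communicated this round
            residual.append(u)
    return encoded, residual

updates = [0.5, -0.02, 0.009, -0.7]
encoded, residual = threshold_encode(updates, 0.01)
# Sparsity ratio: fraction of updates actually communicated
sparsity_ratio = len(encoded) / len(updates)
```

With a threshold of 0.01, only three of the four updates are communicated (the 0.009 entry stays local), giving a sparsity ratio of 0.75. This illustrates the trade-off described above: raising the threshold shrinks the encoded set, lowering it grows the set and the network traffic.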
In this recipe, we also discussed how to configure threshold algorithms for distributed training. The default ...