In ML, we are often interested in approximating some target function by adjusting the parameters of an ML algorithm. If we think of the algorithm itself as a mathematical function (as is the case for NNs), we would like to know how its output changes when we change some of its parameters (weights). Thankfully, differential calculus deals with exactly this: the rate of change of a function with respect to a variable that the function depends on. The following is a (very) short introduction to derivatives.
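To make the idea of "rate of change" concrete, here is a minimal sketch in Python that estimates a derivative numerically with a central finite difference. The function f and the step size h are illustrative choices, not anything defined in this text:

```python
# A minimal sketch: estimate how the output of f changes when we nudge x,
# using the central difference (f(x + h) - f(x - h)) / (2 * h).
# f and h are hypothetical examples chosen for illustration.

def f(x):
    return x ** 2  # example function: f(x) = x^2, whose true derivative is 2x

def numerical_derivative(func, x, h=1e-5):
    """Approximate df/dx at the point x via a central finite difference."""
    return (func(x + h) - func(x - h)) / (2 * h)

print(numerical_derivative(f, 3.0))  # ~6.0, matching the analytic value 2 * 3
```

Gradient-based training of NNs relies on this same quantity, computed analytically rather than numerically, for every weight of the network.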
Let's say that we have a function, f(x), of a single variable, x, which has the following graph: