With the ever-growing popularity of deep learning, much attention has been focused on tools that improve researcher efficiency. Backpropagation, the crucial step in training neural networks, is nothing but the chain rule with memoization. Autograd automates the computation of such gradients for a given function. This enables users to explore the model space quickly, saving them from either having to code up exact analytical derivatives or make do with finite differences. It works whenever the function can be expressed using common differentiable Python, NumPy and SciPy primitives (those with a registered derivative). This talk will describe the algorithm behind automatic differentiation and illustrate the key steps with examples. Brief guidance will follow on how to use Autograd within your own projects. I will finish by reviewing some of the more recent advances and the exciting use cases to which these methods are being applied!
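To make this concrete, here is a minimal usage sketch in the style of the Autograd README: an ordinary NumPy function is differentiated by wrapping it with autograd's grad, using the library's thinly wrapped autograd.numpy module (the printed value below is what I would expect, stated as an approximation rather than a verified output).

    import autograd.numpy as np   # thinly wrapped NumPy, so operations are traceable
    from autograd import grad     # builds a function that returns the gradient

    def tanh(x):
        # an ordinary Python/NumPy function of a scalar
        return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))

    dtanh = grad(tanh)            # dtanh(x) computes d tanh / dx by reverse-mode AD
    print(dtanh(1.0))             # approximately 0.42, i.e. 1 - tanh(1)**2

A quick finite-difference check, (tanh(1.0001) - tanh(0.9999)) / 0.0002, gives essentially the same number, which is exactly the comparison the exact-derivatives-versus-finite-differences remark above alludes to.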