
How to find and understand the autograd source code in PyTorch


I have a good understanding of the autograd algorithm, and I think I should learn how it is implemented in PyTorch. However, when I look at the project on GitHub, I am confused by the structure, because so many files mention autograd. Which part contains the core autograd code?


Solution

    1. Understanding the autograd variable is probably the first thing you should do. As I understand it, `autograd` is mainly the name of the module containing the classes that enhance tensors with gradient tracking and backward functions.
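A minimal sketch of what that enhancement looks like from the Python side: a tensor created with `requires_grad=True` records the operations applied to it, and each result carries a `grad_fn` node of the backward graph.

```python
import torch

# Tensors with requires_grad=True record the operations applied to
# them, building the graph used later by backward().
x = torch.ones(2, requires_grad=True)
y = (x * 3).sum()

# y.grad_fn is the graph node that will compute gradients for the sum.
print(type(y.grad_fn).__name__)

y.backward()
print(x.grad)  # gradient of y with respect to x
```

Printing `grad_fn` like this is a quick way to see the classes the answer refers to without reading any source yet.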

    2. Be aware that a lot of the algorithm, e.g. back-propagation through the graph, is hidden in compiled code (under `torch/csrc/autograd`).
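You can still see the shape of that compiled machinery from Python: the `grad_fn` nodes are thin wrappers whose actual backward logic lives in C++, but their `next_functions` attribute lets you walk the graph. A small sketch (the exact node class names printed may vary between PyTorch versions):

```python
import torch

x = torch.ones(3, requires_grad=True)
y = (x * 2 + 1).sum()

# Walk one path of the backward graph; each node's real apply()
# is implemented in compiled code, only the wrapper is visible here.
node = y.grad_fn
while node is not None:
    print(type(node).__name__)  # e.g. SumBackward0, AddBackward0, ...
    node = node.next_functions[0][0] if node.next_functions else None
```

This makes point 2 concrete: the Python objects you can inspect are graph nodes, while the gradient formulas they apply are not visible in the Python sources.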

    3. If you look into `torch/autograd/__init__.py`, you can get a glimpse of all the important entry points (`backward` and `grad`).
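Those two entry points can be tried directly; a small sketch of the difference: `torch.autograd.grad` returns the gradients as new tensors, while `torch.autograd.backward` accumulates them into the leaves' `.grad` attribute.

```python
import torch

x = torch.tensor([1.0, 2.0], requires_grad=True)
y = (x ** 2).sum()

# grad: returns the gradient of y w.r.t. x as a new tensor
(g,) = torch.autograd.grad(y, x, retain_graph=True)
print(g)

# backward: writes the same gradient into x.grad instead
torch.autograd.backward(y)
print(x.grad)
```

Tracing these two functions from `__init__.py` down into the engine is a reasonable path through the source.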