Error: checkpointing is not compatible with .grad(), please use .backward() if possible - autograd - PyTorch Forums
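The error in the topic title typically appears when a forward pass that uses `torch.utils.checkpoint.checkpoint` is differentiated with `torch.autograd.grad()` instead of `.backward()`. Below is a minimal sketch of how it can show up and how it is usually worked around, assuming PyTorch 1.11 or later where `checkpoint` accepts a `use_reentrant` flag; the module and tensor names are illustrative only, not from the thread:

```python
import torch
from torch.utils.checkpoint import checkpoint

x = torch.randn(4, 8, requires_grad=True)
linear = torch.nn.Linear(8, 8)

# Reentrant checkpointing runs its recomputation inside .backward(),
# so differentiating its output with torch.autograd.grad() raises the
# "checkpointing is not compatible with .grad()" error from the title.
y = checkpoint(linear, x, use_reentrant=True).sum()
# torch.autograd.grad(y, x)  # would raise the error; shown commented out

# Workaround 1: follow the error message and call .backward().
y.backward()

# Workaround 2 (assumes a PyTorch build with non-reentrant checkpointing):
# use_reentrant=False supports torch.autograd.grad() on the output.
y2 = checkpoint(linear, x, use_reentrant=False).sum()
(grad_x,) = torch.autograd.grad(y2, x)
```

Non-reentrant checkpointing (`use_reentrant=False`) is the more flexible option when gradients are needed via `torch.autograd.grad()`, e.g. for double backward or gradient-penalty style losses.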
Related thread: RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed. Specify retain_graph=True when calling backward the first time - #57 by Eis - PyTorch Forums
Related thread: Confused about simple PyTorch backward() code. How does A.grad know about the x derivative? : r/learnmachinelearning