News
Amended Feedback on Assignment 5
Written on 02.12.2025 14:39 by Diana Victoria Davidson
Hi everyone,
For question 3.2 (and consequently 3.5.c), there are actually two correct ways of writing the backpropagation sequence: either self.optimizer.zero_grad(), loss.backward(), self.optimizer.step(), or loss.backward(), self.optimizer.step(), self.optimizer.zero_grad(). Different parts of the PyTorch documentation give one of these two options (examples here and here), so as long as you justified the order in question 3.5.c, you receive full credit for those parts of the assignment. Grades have been adjusted accordingly.
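As a minimal sketch of the two equivalent orderings, assuming a toy linear model and SGD optimizer (both hypothetical, just stand-ins for the assignment's own model and optimizer):

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 4), torch.randn(8, 1)

def loss_fn():
    # Fresh forward pass each call so backward() has a graph to traverse.
    return torch.nn.functional.mse_loss(model(x), y)

# Variant 1: zero_grad() first.
def train_step_v1():
    optimizer.zero_grad()   # clear stale gradients from the previous step
    loss_fn().backward()    # accumulate fresh gradients
    optimizer.step()        # update the parameters

# Variant 2: zero_grad() last.
def train_step_v2():
    loss_fn().backward()
    optimizer.step()
    optimizer.zero_grad()   # clear gradients ready for the next step

train_step_v1()
# Variant 1 leaves its gradients in place; they would be cleared at the
# start of its next call, so reset explicitly before switching variants.
optimizer.zero_grad()
train_step_v2()

# Either variant, used consistently, leaves the gradients cleared
# before the next backward() call.
grads_cleared = all(p.grad is None or torch.all(p.grad == 0)
                    for p in model.parameters())
```

Used consistently within a training loop, both variants behave identically; they only differ in whether the clearing happens at the start or the end of each iteration.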
However, if you are computing multiple loss functions in a more complicated neural network, self.optimizer.zero_grad() should come first: backward() accumulates gradients, so clearing them at the start of the iteration guarantees a clean slate before accumulation (as seen in this discussion post).
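The multi-loss case can be sketched as follows, using a hypothetical L1 penalty on the weights as the second loss: because each backward() call adds to .grad, putting zero_grad() first ensures that only the current iteration's losses contribute to the update.

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(8, 4), torch.randn(8, 1)

# zero_grad() first guarantees a clean slate before the gradients of
# both losses are accumulated by the two backward() calls below.
optimizer.zero_grad()
w_sign = model.weight.detach().sign()      # d|w|/dw at backward time
loss_main = torch.nn.functional.mse_loss(model(x), y)
loss_reg = model.weight.abs().sum()        # hypothetical L1 penalty
loss_main.backward()
g_main = model.weight.grad.clone()         # gradient of the first loss alone
loss_reg.backward()                        # accumulates on top of g_main
g_total = model.weight.grad.clone()        # sum of both losses' gradients
optimizer.step()
```

If zero_grad() instead ran at the end of the previous iteration but any code path skipped it, the stale gradients would silently be added into g_total, which is why zero-first is the safer convention here.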
Best,
Diana
