"throw you off"?
parent aa76db42ee
commit 0a2e646703
@@ -2666,7 +2666,7 @@
    "source": [
     "Notice the special method `requires_grad_`? That's the magical incantation we use to tell PyTorch that we want to calculate gradients with respect to that variable at that value. It is essentially tagging the variable, so PyTorch will remember to keep track of how to compute gradients of the other, direct calculations on it which you will ask for.\n",
     "\n",
-    "> a: This API might throw you if you're coming from math or physics. In those contexts the \"gradient\" of a function is just another function (i.e., its derivative), so you might expect gradient-related APIs to give you a new function. But in deep learning, \"gradients\" usually means the _value_ of a function's derivative at a particular argument value. PyTorch API also puts the focus on that argument, not the function you're actually computing the gradients of. It may feel backwards at first but it's just a different perspective.\n",
+    "> a: This API might throw you off if you're coming from math or physics. In those contexts the \"gradient\" of a function is just another function (i.e., its derivative), so you might expect gradient-related APIs to give you a new function. But in deep learning, \"gradients\" usually means the _value_ of a function's derivative at a particular argument value. PyTorch API also puts the focus on that argument, not the function you're actually computing the gradients of. It may feel backwards at first but it's just a different perspective.\n",
     "\n",
     "Now we calculate our function with that value. Notice how PyTorch prints not just the value calculated, but also a note that it has a gradient function it'll be using to calculate our gradient when needed:"
    ]
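For reference, here is a minimal PyTorch sketch (not part of this commit) of the behaviour the note describes: `requires_grad_` tags a tensor, and after `backward()` the derivative's *value* at that argument lands in `.grad`. The variable names `xt` and `yt` are illustrative.

```python
import torch

# Tag the tensor so PyTorch tracks operations on it for gradient computation.
xt = torch.tensor(3.).requires_grad_()

# PyTorch records that yt was computed from xt (note the grad_fn in the repr).
yt = xt**2
print(yt)        # tensor(9., grad_fn=<PowBackward0>)

# Compute d(yt)/d(xt) at xt = 3; the result is a value, not a new function.
yt.backward()
print(xt.grad)   # tensor(6.) -- i.e., 2*x evaluated at x = 3
```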