mirror of
https://github.com/fastai/fastbook.git
synced 2025-04-05 02:10:48 +00:00
proofreading (missing word in "we don't to consider")
proofreading (changing "we don't to consider" to "we don't need to consider")
This commit is contained in:
parent b2f1c12d4c
commit df43ac86d6
@@ -1159,7 +1159,7 @@
     "\n",
     "The really interesting thing here is that this actually works just as well with more than two columns. To see this, consider what would happen if we added a activation column above for every digit (zero through nine), and then `targ` contained a number from zero to nine. As long as the activation columns sum to one (as they will, if we use softmax), then we'll have a loss function that shows how well we're predicting each digit.\n",
     "\n",
-    "We're only picking the loss from the column containing the correct label. We don't to consider the other columns, because by the definition of softmax, they add up to one minus the activation corresponding to the correct label. Therefore, making the activation for the correct label as high as possible, must mean we're also decreasing the activations of the remaining columns.\n",
+    "We're only picking the loss from the column containing the correct label. We don't need to consider the other columns, because by the definition of softmax, they add up to one minus the activation corresponding to the correct label. Therefore, making the activation for the correct label as high as possible, must mean we're also decreasing the activations of the remaining columns.\n",
     "\n",
     "PyTorch provides a function that does exactly the same thing as `sm_acts[range(n), targ]` (except it takes the negative, because when applying the log afterward, we will have negative numbers), called `nll_loss` (*NLL* stands for *negative log likelihood*):"
    ]
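The edited passage describes the indexing trick `sm_acts[range(n), targ]` and notes that `nll_loss` does the same lookup with the sign flipped. A minimal sketch of that equivalence, assuming PyTorch is installed; `sm_acts` and `targ` here are small made-up example tensors, not values from the book:

```python
import torch
import torch.nn.functional as F

# Hypothetical softmax activations: one row per item, one column per class.
sm_acts = torch.tensor([[0.6, 0.4],
                        [0.1, 0.9],
                        [0.7, 0.3]])
targ = torch.tensor([0, 1, 0])      # correct label for each row
n = len(targ)

# Pick the activation of the correct label for each row.
picked = sm_acts[range(n), targ]

# nll_loss performs the same per-row lookup, negated (no log is applied here).
neg = F.nll_loss(sm_acts, targ, reduction='none')

print(torch.allclose(picked, -neg))
```

With `reduction='none'` the per-item values are returned, so the sign-flipped equivalence is visible row by row; the default reduction would average them into a single loss.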