From 8d7d852ddc526005ed9d93c48706ae25b7c73d4e Mon Sep 17 00:00:00 2001
From: hamelsmu
Date: Sat, 5 Mar 2022 12:00:14 -0800
Subject: [PATCH] spelling

---
 04_mnist_basics.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/04_mnist_basics.ipynb b/04_mnist_basics.ipynb
index ff963b0..e52070b 100644
--- a/04_mnist_basics.ipynb
+++ b/04_mnist_basics.ipynb
@@ -2870,7 +2870,7 @@
     "w -= gradient(w) * lr\n",
     "```\n",
     "\n",
-    "This is known as *stepping* your parameters, using an *optimizer step*. Notice how we _subtract_ the `gradient * lr` from the parameter to update it. This allows us to adjust the parameter in the direction of the slope by increasing the parameter when the slope is negative and decreasing the parameter when the slope is positive. We want to adjsut our parameters in the direction of the slope because our goal in deep learning is to _minimize_ the loss.\n",
+    "This is known as *stepping* your parameters, using an *optimizer step*. Notice how we _subtract_ the `gradient * lr` from the parameter to update it. This allows us to adjust the parameter in the direction of the slope by increasing the parameter when the slope is negative and decreasing the parameter when the slope is positive. We want to adjust our parameters in the direction of the slope because our goal in deep learning is to _minimize_ the loss.\n",
     "\n",
     "If you pick a learning rate that's too low, it can mean having to do a lot of steps. <> illustrates that."
    ]