"many batches" -> "mini-batches" (#402)

Kerrick Staley 2022-04-25 02:17:10 -04:00 committed by GitHub
parent 2f010aab2d
commit 150e224fda


@@ -4180,7 +4180,7 @@
"\n",
"As we saw in our discussion of data augmentation in <<chapter_production>>, we get better generalization if we can vary things during training. One simple and effective thing we can vary is what data items we put in each mini-batch. Rather than simply enumerating our dataset in order for every epoch, instead what we normally do is randomly shuffle it on every epoch, before we create mini-batches. PyTorch and fastai provide a class that will do the shuffling and mini-batch collation for you, called `DataLoader`.\n",
"\n",
"A `DataLoader` can take any Python collection and turn it into an iterator over many batches, like so:"
"A `DataLoader` can take any Python collection and turn it into an iterator over mini-batches, like so:"
]
},
{
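A minimal sketch of the behaviour this cell describes, using the PyTorch `DataLoader` (which the fastai one builds on); the collection and batch size here are illustrative, not taken from the notebook:

from torch.utils.data import DataLoader  # fastai's DataLoader accepts the same arguments here

coll = range(15)                          # any Python collection
dl = DataLoader(coll, batch_size=5, shuffle=True)
for batch in dl:
    print(batch)                          # each mini-batch is a tensor of 5 shuffled items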
@@ -4239,7 +4239,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"When we pass a `Dataset` to a `DataLoader` we will get back many batches which are themselves tuples of tensors representing batches of independent and dependent variables:"
"When we pass a `Dataset` to a `DataLoader` we will get back mini-batches which are themselves tuples of tensors representing batches of independent and dependent variables:"
]
},
{
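A small sketch of that second point, again assuming the PyTorch `DataLoader` and a toy dataset of (independent, dependent) pairs; the names are illustrative only:

import torch
from torch.utils.data import DataLoader

# a dataset here is just an indexable collection of (x, y) pairs
ds = list(zip(torch.arange(10).float(), torch.arange(10).float() * 2))

dl = DataLoader(ds, batch_size=4, shuffle=True)
for xb, yb in dl:
    # each mini-batch unpacks into a batch of inputs and a batch of targets
    print(xb.shape, yb.shape)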