Mirror of https://github.com/fastai/fastbook.git, synced 2025-04-04 01:40:44 +00:00
"many batches" -> "mini-batches" (#402)
parent 2f010aab2d
commit 150e224fda
@@ -4180,7 +4180,7 @@
     "\n",
     "As we saw in our discussion of data augmentation in <<chapter_production>>, we get better generalization if we can vary things during training. One simple and effective thing we can vary is what data items we put in each mini-batch. Rather than simply enumerating our dataset in order for every epoch, instead what we normally do is randomly shuffle it on every epoch, before we create mini-batches. PyTorch and fastai provide a class that will do the shuffling and mini-batch collation for you, called `DataLoader`.\n",
     "\n",
-    "A `DataLoader` can take any Python collection and turn it into an iterator over many batches, like so:"
+    "A `DataLoader` can take any Python collection and turn it into an iterator over mini-batches, like so:"
     ]
    },
    {
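The passage above describes `DataLoader` shuffling a collection each epoch and collating it into mini-batches. A minimal sketch of that behavior, using PyTorch's `torch.utils.data.DataLoader` (the book's examples use fastai's `DataLoader`, which has the same basic interface; the collection and batch size here are illustrative, not taken from the commit):

```python
from torch.utils.data import DataLoader

# Any indexable Python collection can serve as the dataset;
# range(15) is purely illustrative.
coll = range(15)

# shuffle=True re-orders the items each epoch before they are
# collated into mini-batches of batch_size elements.
dl = DataLoader(coll, batch_size=5, shuffle=True)

for batch in dl:
    print(batch)  # e.g. tensor([ 3, 12,  8,  0,  6]); order varies per epoch
```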
@@ -4239,7 +4239,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "When we pass a `Dataset` to a `DataLoader` we will get back many batches which are themselves tuples of tensors representing batches of independent and dependent variables:"
+    "When we pass a `Dataset` to a `DataLoader` we will get back mini-batches which are themselves tuples of tensors representing batches of independent and dependent variables:"
     ]
    },
    {
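The passage above says each mini-batch drawn from a `Dataset` arrives as a tuple of tensors, one for the independent and one for the dependent variable. A sketch of that claim, again with PyTorch's `DataLoader` and an illustrative toy dataset of `(x, y)` pairs (not taken from the book):

```python
from torch.utils.data import DataLoader

# Illustrative Dataset: a plain list of (independent, dependent) pairs.
ds = [(i, i % 2) for i in range(10)]

dl = DataLoader(ds, batch_size=4, shuffle=True)

# Each mini-batch is a tuple of two tensors: one batch of inputs and
# one batch of targets, stacked by the default collate function.
xb, yb = next(iter(dl))
print(xb.shape, yb.shape)  # torch.Size([4]) torch.Size([4])
```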