| Name | Modified | Size |
|---|---|---|
| textbrewer-0.2.0.tar.gz | 2020-07-30 | 43.2 kB |
| README.md | 2020-07-30 | 419 Bytes |
| TextBrewer 0.2.0 source code.tar.gz | 2020-07-30 | 8.3 MB |
| TextBrewer 0.2.0 source code.zip | 2020-07-30 | 8.4 MB |
New Features

- Now supports distributed data-parallel training with `torch.nn.parallel.DistributedDataParallel`! You can pass `local_rank` to the `TrainingConfig` to set up distributed training. Detailed usage of `DistributedDataParallel` can be found in the PyTorch docs.
- We also added an example (a Chinese NER task) demonstrating how to use TextBrewer with distributed data-parallel training.
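The setup described above can be sketched as follows. This is a minimal, hedged illustration, not the release's bundled example: `local_rank` in `TrainingConfig` is stated in the release notes, while the surrounding `GeneralDistiller`/`DistillationConfig` usage follows TextBrewer's general API, and `teacher_model`, `student_model`, the adaptors, `optimizer`, and `train_dataloader` are placeholders you must define for your own task.

```python
# Sketch: distributed data-parallel distillation with TextBrewer (assumptions noted above).
# Launch one process per GPU, e.g.:
#   python -m torch.distributed.launch --nproc_per_node=4 train.py
import argparse

import torch
from textbrewer import TrainingConfig, DistillationConfig, GeneralDistiller

parser = argparse.ArgumentParser()
parser.add_argument("--local_rank", type=int, default=-1)  # filled in by the launcher
args = parser.parse_args()

# Standard PyTorch distributed setup: bind this process to its GPU
# and join the process group before building the distiller.
torch.cuda.set_device(args.local_rank)
torch.distributed.init_process_group(backend="nccl")

# Passing local_rank here is what enables DistributedDataParallel training.
train_config = TrainingConfig(local_rank=args.local_rank)
distill_config = DistillationConfig()

# teacher_model, student_model, adaptors, optimizer, and train_dataloader
# are placeholders: supply your own models and data pipeline.
distiller = GeneralDistiller(
    train_config=train_config,
    distill_config=distill_config,
    model_T=teacher_model,
    model_S=student_model,
    adaptor_T=teacher_adaptor,
    adaptor_S=student_adaptor,
)
distiller.train(optimizer, train_dataloader, num_epochs=3)
```

Remember to wrap your training data in a `DistributedSampler`-backed dataloader so each process sees a distinct shard; the launcher sets `--local_rank` per process automatically.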