
Speeding up training


Source: https://parl.ai/docs/tutorial_fast.html

We'll start with an example training command that trains a transformer/generator on ConvAI2 for 1 epoch, using a batch size of 64 and a roughly 20M-parameter model. We'll train with the Adam optimizer and a learning rate of 1e-3. We'll build the dictionary ahead of time to ensure it stays the same across runs. This will be our baseline.

parlai build_dict --task convai2 --dict-file dictfile
parlai train --dict-file dictfile --model transformer/generator --task convai2 \
    --num-epochs 1.0 --batchsize 64 --n-layers 8 --embedding-size 250 \
    --ffn-size 1000 --optimizer adam --learningrate 1e-3
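
To tell whether any later change actually speeds things up, it helps to record how long this baseline run takes. One simple way (not part of the original tutorial, just a shell convenience) is to wrap the same training command in the shell's time builtin:

# Measure wall-clock time of the baseline run; assumes the dictionary was already built above
time parlai train --dict-file dictfile --model transformer/generator --task convai2 \
    --num-epochs 1.0 --batchsize 64 --n-layers 8 --embedding-size 250 \
    --ffn-size 1000 --optimizer adam --learningrate 1e-3

The reported elapsed time gives a rough reference point to compare against as optimizations are applied.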


Original post: https://www.cnblogs.com/yanghh/p/14784681.html
