Make Pytorch Transformer twice as fast on sequence generation. | Scale AI
by Alexandre Matton and Adrian Lam
Published December 17, 2020 · 11 min read