Michael Siu
Publications - 2
Citations - 1377
Michael Siu is an academic researcher. The author has contributed to research on topics including IEEE floating point and the decimal64 floating-point format. The author has an h-index of 1, having co-authored 1 publication receiving 1167 citations.
Papers
Standard · DOI
IEEE Standard for Floating-Point Arithmetic
Dan Zuras, M. F. Cowlishaw, Alex Aiken, Matthew Applegate, David H. Bailey, Steve Bass, Dileep Bhandarkar, Mahesh Bhat, David Bindel, Sylvie Boldo, Stephen Canon, Steven R. Carlough, Marius Cornea, John H. Crawford, Joseph D. Darcy, Debjit Das Sarma, Marc Daumas, Bob Davis, Mark Davis, Dick Delp, James Demmel, Mark A. Erle, Hossam A. H. Fahmy, J. P. Fasano, Richard J. Fateman, Eric Feng, Warren E. Ferguson, Alex Fit-Florea, Laurent Fournier, Chip Freitag, Ivan Godard, Roger A. Golliver, David Gustafson, Michel Hack, John R. Harrison, John Hauser, Yozo Hida, Chris N. Hinds, Graydon Hoare, David G. Hough, Jerry Huck, Jim Hull, Michael Ingrassia, David V. James, Rick James, William Kahan, John Kapernick, Richard Karpinski, Jeff Kidder, Plamen Koev, Ren-Cang Li, Zhishun A. Liu, Raymond Mak, Peter Markstein, David W. Matula, Guillaume Melquiond, Nobuyoshi Mori, Ricardo Morin, Ned Nedialkov, Craig Nelson, Stuart Oberman, Jon Okada, Ian Ollmann, Michael Parks, Tom Pittman, Eric Postpischil, Jason Riedy, Eric M. Schwarz, David Scott, Don Senzig, Ilya Sharapov, Jim Shearer, Michael Siu, Ron Smith, Chuck Stevens, Peter Tang, Pamela J. Taylor, James W. Thomas, Brandon Thompson, Wendy Thrash, Neil Toda, Son Dao Trong, Leonard Tsai, Charles Tsen, Fred Tydeman, Liang Wang, Scott Westbrook, Steve Winkler, Anthony Wood, Umit Yalcinalp, Fred Zemke, Paul Zimmermann +91 more
Journal Article · DOI
FP8 Formats for Deep Learning
Paulius Micikevicius, Dusan Stosic, N. Burgess, Marius Cornea, Pradeep Dubey, Richard Roy Grisenthwaite, Sangwon Ha, Alexander Heinecke, Patrick K. Judd, John Kamalu, Naveen Mellempudi, S. Oberman, Mohammad Shoeybi, Michael Siu, Hao Wu +14 more
TL;DR: This paper proposes an 8-bit FP8 binary interchange format consisting of two encodings, E4M3 and E5M2, and demonstrates the efficacy of the FP8 format on a variety of image and language tasks, effectively matching the result quality achieved by 16-bit training sessions.
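The two encodings split their seven non-sign bits differently: E4M3 uses 4 exponent bits (bias 7) and 3 mantissa bits, while E5M2 uses 5 exponent bits (bias 15) and 2 mantissa bits. E5M2 follows IEEE 754 conventions for the all-ones exponent (infinities and NaNs), whereas E4M3, per the paper, reclaims the infinity bit patterns as normal values to extend dynamic range, keeping only the all-ones mantissa pattern as NaN. A minimal decoder sketch under those assumptions (function names are illustrative, not from the paper):

```python
def decode_fp8(byte, exp_bits, man_bits, bias, ieee_specials):
    """Decode one FP8 byte (given bit layout and bias) to a Python float."""
    sign = -1.0 if (byte >> 7) & 1 else 1.0
    exp = (byte >> man_bits) & ((1 << exp_bits) - 1)
    man = byte & ((1 << man_bits) - 1)
    max_exp = (1 << exp_bits) - 1
    if ieee_specials and exp == max_exp:
        # E5M2-style: all-ones exponent encodes Inf (mantissa 0) or NaN
        return sign * float("inf") if man == 0 else float("nan")
    if not ieee_specials and exp == max_exp and man == (1 << man_bits) - 1:
        # E4M3-style: no infinities; only S.1111.111 is NaN
        return float("nan")
    if exp == 0:
        # subnormal: implicit leading 0, effective exponent 1 - bias
        return sign * (man / (1 << man_bits)) * 2.0 ** (1 - bias)
    # normal: implicit leading 1
    return sign * (1 + man / (1 << man_bits)) * 2.0 ** (exp - bias)

def decode_e4m3(b):
    return decode_fp8(b, exp_bits=4, man_bits=3, bias=7, ieee_specials=False)

def decode_e5m2(b):
    return decode_fp8(b, exp_bits=5, man_bits=2, bias=15, ieee_specials=True)
```

Under this layout the largest finite values come out to 448 for E4M3 (bit pattern S.1111.110) and 57344 for E5M2 (S.11110.11), matching the ranges reported in the paper.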