To speed up research into artificial intelligence (AI), Facebook is building a system, called "Big Sur," designed specifically for training neural networks.
To power the new system, Facebook is using the recently announced Nvidia Tesla Accelerated Computing Platform, built around graphics processing units (GPUs) designed to help researchers train deep neural networks for the range of applications they want to power with AI. Nvidia says Facebook is the first company to adopt its Tesla M40 GPU accelerators.
Nvidia calls deep learning the new era of computing, one that will enable designers to solve problems that were never possible before. “Huge industries from web services and retail to healthcare and cars will be revolutionized,” says Ian Buck, vice president of accelerated computing at Nvidia.
Nvidia worked with Facebook to deliver enhanced performance in the Big Sur server, including the ability to train large neural networks across multiple Tesla GPUs. Facebook says Big Sur will let it train twice as many neural networks, and build networks twice as large, in order to develop new classes of advanced applications.
“The key to unlocking the knowledge necessary to develop more intelligent machines lies in the capability of our computing systems,” says Serkan Piantino, engineering director of Facebook AI Research (FAIR). “Most of the major advances in machine learning and AI in the past few years have been contingent on tapping into powerful GPUs and huge data sets to build and train advanced models.”
Nvidia says Big Sur marks the first time a computing system designed for machine learning will be released as open source.