
Facebook Likes Machine Learning, Announces Open Source GPU Server Big Sur


Facebook is one of those companies that dabbles in all sorts of tech awesomeness. While best known for its heavily used social media platform, Facebook recently announced something else you might like: an open source GPU server.


Dubbed Big Sur, Facebook’s forthcoming open source GPU server boasts a whopping eight (yes, you heard right, eight) Nvidia Tesla M40 GPUs. For reference, a single M40 totes 12GB of GDDR5 memory, 3,072 CUDA cores, and a peak single-precision performance of about 7 teraflops. Chances are that could run The Witcher 3 in 4K with ease (minus the obvious detail that the M40 has no video outputs, unlike this AMD behemoth). But while Nvidia is better known for delivering powerful gaming hardware, these GPUs serve a different purpose: machine learning, and more specifically deep neural networks.
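For a rough sense of scale, here’s a back-of-the-envelope calculation in Python. It’s only a sketch based on the published peak figures above; Facebook hasn’t quoted real-world throughput for Big Sur here, and actual workloads won’t hit theoretical peak.

# Aggregate theoretical single-precision throughput of one Big Sur server
gpus_per_server = 8
tflops_per_m40 = 7  # peak FP32 teraflops quoted for a Tesla M40
total_tflops = gpus_per_server * tflops_per_m40
print(f"{total_tflops} TFLOPS peak per server")  # prints: 56 TFLOPS peak per server

Fifty-six teraflops of peak compute in a single box goes a long way toward explaining why Facebook wants eight GPUs per server.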

Much like Facebook’s users, Big Sur will spend its time analyzing data. One example is running pictures and videos through neural networks so that they can be tagged with information; once trained, those artificial neural networks can make deductions about new data. In a way, it’s a similar concept to uploading pictures to your own Facebook page and tagging them. Looking back through those images, you might notice that a certain friend has a fondness for tequila and is always wearing a flannel shirt. Hopefully, however, Facebook’s research is a bit more insightful. Of course, a few Google researchers used artificial neural networks to identify cat pictures (and reportedly it took some 16,000 processors to do it).
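To make the picture-tagging idea a bit more concrete, here’s a minimal Python sketch of running a single photo through a pretrained image-classification network. This is purely illustrative, using the modern PyTorch/torchvision stack rather than Facebook’s own software (which at the time was built on the Torch framework), and the file name is hypothetical.

import torch
from PIL import Image
from torchvision.models import resnet50, ResNet50_Weights

# Load a network that has already been trained to recognize everyday objects
weights = ResNet50_Weights.DEFAULT
model = resnet50(weights=weights)
model.eval()

# Run a single photo through the network ("party_photo.jpg" is a hypothetical file)
image = Image.open("party_photo.jpg").convert("RGB")
batch = weights.transforms()(image).unsqueeze(0)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)

# The network's best guesses become the photo's "tags"
top = probs.topk(3)
for score, idx in zip(top.values[0], top.indices[0]):
    print(f"{weights.meta['categories'][idx.item()]}: {score.item():.1%}")

Scale that idea up to billions of photos and videos, and you get the kind of automatic tagging and inference Facebook is chasing.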

Big Sur kicks off Facebook’s open-source hardware expedition into machine learning and deep neural networks (topics the company has been working on for a while), and it’s no ordinary server. Facebook researchers Kevin Lee and Serkan Piantino wrote in a Dec. 10, 2015 blog post that Big Sur is twice as fast as Facebook’s previous generation of servers. According to a news article on the Nvidia website, Tesla GPUs can cut the training time of deep neural networks by as much as 10-20x. That’s a substantial performance boost.

Facebook’s foray into machine learning and deep neural networks is ultimately aimed at advancing artificial intelligence (AI). AI has become increasingly prevalent, as evidenced by IBM’s Watson, Microsoft’s Cortana, and Apple’s Siri. Now let’s just hope Facebook, armed with its new Big Sur, doesn’t unwittingly unleash Skynet or HAL.

