Madshrimps Forum Madness


Stefan Mileschin 4th October 2018 08:52

Xilinx announced PCIe Alveo AI accelerator
 
Claimed to beat Nvidia Volta on latency and inference in the data center

The Xilinx CEO has just introduced a new product category called Alveo, a PCIe-based hardware accelerator that will challenge machine learning compute accelerators in the data center. It has dramatically better latency than Nvidia's or AMD's GPU-based solutions, it is claimed.

Nvidia, Intel, and AMD each have their own solutions; what Xilinx brings is a PCIe machine learning card with an adaptable programmable engine. Victor Peng, the CEO of Xilinx, showed that the Amazon-owned Twitch can accelerate 1080p 120 FPS streaming to millions of customers using the existing 16nm Xilinx Zynq UltraScale+ product.

Twitch simply used the programmable part of the Zynq to run the VP9 codec and got incredible performance, something companies like Nvidia or AMD simply cannot do by changing their hardware. Nvidia's solutions are great for training, but Xilinx Alveo may well win on latency simply because of its different architecture. The Alveo card was introduced by Peng, and we had a chance to see it up and running and to talk to some early adopters who already have the cards and are running their programs on them.

Now Xilinx has the PCIe-based card and has showcased that, in partnership with the AMD EPYC platform, it can even set world records. The inference-performance record shows GoogLeNet scoring 30,000 pictures per second with eight Xilinx Alveo U250 accelerator cards.

IBM also has a Power 922 system with an Alveo card, as Big Blue claims better inferencing scores using Xilinx, while Nvidia Volta does a better job in training. So inference customers will go after Alveo-based IBM machines, as inference there will happen faster and with lower latency.

https://fudzilla.com/news/ai/47316-x...ai-accelerator

