Xilinx announced PCIe Alveo AI accelerator
4th October 2018, 08:52 — Stefan Mileschin ([M] Reviewer, Romania)

Claimed to beat Nvidia Volta on latency and inference in the data center

The Xilinx CEO has just introduced a new product category, the Alveo, a PCIe-based hardware accelerator that will challenge machine-learning compute accelerators in the data center. It is claimed to offer dramatically better latency than Nvidia's or AMD's GPU-based solutions.

Nvidia, Intel, and AMD each have their own solutions; what Xilinx brings is a PCIe machine-learning card with an adaptable programmable engine. Victor Peng, the CEO of Xilinx, showed that Amazon-owned Twitch can accelerate 1080p 120 FPS streaming to millions of customers using the existing 16nm Xilinx Zynq UltraScale+ product.

Twitch simply used the Zynq's programmable logic to run the VP9 codec and get remarkable performance, something companies like Nvidia or AMD cannot match because they cannot change their hardware after it ships. Nvidia solutions are great for training, but the Xilinx Alveo may well win on latency simply because of its different architecture. Peng introduced the Alveo card on stage, and we had a chance to see it up and running and to talk to some early adopters who already have the cards and are running their programs on them.

Xilinx now has the PCIe-based card and has showcased that, in partnership with the AMD EPYC platform, it can even beat some world records. The inference-performance record has GoogLeNet scoring 30,000 images per second with eight Xilinx Alveo U250 accelerator cards.
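As a rough sanity check on that record, here is a minimal sketch of the implied per-card rate. It assumes the 30,000 images-per-second figure is the aggregate across all eight Alveo U250 cards, which the article does not state explicitly.

```python
# Assumption: the quoted 30,000 images/s is the aggregate
# throughput of the full eight-card Alveo U250 setup.
total_images_per_sec = 30_000
num_cards = 8

# Implied per-card GoogLeNet inference rate under that assumption.
per_card = total_images_per_sec / num_cards
print(per_card)  # 3750.0 images/s per card
```

Under that reading, each U250 would be sustaining 3,750 GoogLeNet inferences per second.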

IBM also has a Power System 922 fitted with an Alveo card, and Big Blue claims it delivers better inferencing scores using Xilinx, while Nvidia Volta does a better job at training. Inference customers may therefore go for Alveo-based IBM machines, as inference will run faster and with lower latency.

https://fudzilla.com/news/ai/47316-x...ai-accelerator