Madshrimps Forum Madness


Stefan Mileschin 4th March 2024 07:12

CPUs are Leading AI Inference Workloads, Not GPUs
 
Today's AI infrastructure is largely built on GPU-accelerated servers. Yet Google, one of the world's largest hyperscalers, has noted that CPUs remain a leading compute platform for AI/ML workloads, according to internal analysis of its Google Cloud services. At a Tech Field Day event, a presentation by Brandon Royal, product manager at Google Cloud, explained where CPUs fit in today's AI landscape. The AI lifecycle splits into two phases: training and inference. Training demands massive compute capacity, along with enormous memory capacity to hold ever-expanding AI models. The latest models, such as GPT-4 and Gemini, contain billions of parameters and require thousands of GPUs or other accelerators working in parallel to train efficiently.
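To make the training-versus-inference split concrete, here is a minimal, illustrative sketch (assuming PyTorch; the toy model, sizes, and data are made up and are not from Google's analysis). It shows why inference, a forward pass with no gradients or optimizer state, is far lighter than training and can plausibly be served from CPUs:

[code]
# Toy illustration of training vs. inference, run entirely on CPU.
# Hypothetical example; real LLMs have billions of parameters, not thousands.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

num_params = sum(p.numel() for p in model.parameters())
print(f"Parameters: {num_params}")  # tiny compared to GPT-4/Gemini-class models

# Training step: forward + backward pass, with gradients and optimizer state in memory.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
x = torch.randn(32, 128)          # fake input batch
y = torch.randint(0, 10, (32,))   # fake labels
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()

# Inference step: forward pass only, no gradients, much lighter on compute and memory.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 128)).argmax(dim=-1)
print(f"Predicted class: {prediction.item()}")
[/code]

The same asymmetry scales up: training a frontier model needs accelerator clusters, while serving an already-trained (and often quantized) model can run on general-purpose CPUs.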

https://www.techpowerup.com/319880/g...loads-not-gpus

