The sharp growth of ChatGPT has already prompted many tech giants to develop similar products of their own. As they gear up, industry experts anticipate strong demand for Nvidia's AI GPUs in the near future.

Estimates run so high that Nvidia's supply may fall short of the incoming demand, since every ChatGPT-like technology needs high-performance GPUs to train on. Other OEMs make AI accelerators too, but they aren't as competitive as Nvidia's AI chips in this space.

Incoming Demand for Nvidia GPUs

Observing ChatGPT's rise in such a short span, companies like Google and Microsoft have announced AI chatbots of their own, coming soon to general users. As they plan to integrate these technologies into their respective search engines, they will need far more GPUs to do so.

This is due to the nature of such services: language, image, and video generation tools rely heavily on AI processing power, an area where Nvidia excels. As noted by FierceElectronics, ChatGPT's beta version was trained on roughly 10,000 Nvidia GPUs before it was overwhelmed by massive public demand, forcing the service to introduce a paid subscription to manage the load.

ChatGPT recently introduced a Plus subscription plan, which provides priority access to the servers and therefore faster responses. All of this rests on Nvidia's AI GPUs, which handle the heavy background computation at speed.

As Dylan Patel from SemiAnalysis noted, for Google to run its Bard bot on every search query, it would need roughly 512,820 of Nvidia's A100 HGX servers, totaling 4,102,568 A100 GPUs. That puts the bill for Google at around $100 billion just for server and networking costs.
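To see where those headline numbers come from, here is a minimal back-of-the-envelope sketch in Python. The eight-GPUs-per-server count reflects a standard A100 HGX box; the per-server cost figure is a hypothetical assumption chosen only so the totals land near the estimates quoted above, not a number from the article.

```python
# Rough reconstruction of the SemiAnalysis-style estimate quoted above.
# GPUS_PER_HGX_SERVER is the standard A100 HGX configuration; the cost figure
# is an assumed, illustrative value.

GPUS_PER_HGX_SERVER = 8
servers_needed = 512_820                      # servers quoted in the article

total_gpus = servers_needed * GPUS_PER_HGX_SERVER
print(f"Approx. A100 GPUs: {total_gpus:,}")   # ~4.1 million, in line with the article

ASSUMED_COST_PER_SERVER_USD = 195_000         # hypothetical all-in server + networking cost
total_cost_usd = servers_needed * ASSUMED_COST_PER_SERVER_USD
print(f"Approx. spend: ${total_cost_usd / 1e9:.0f} billion")  # on the order of $100B
```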

Though this is a hypothetical scenario, the scope for such mega operations is clear, and Nvidia's AI GPUs may face a sharp shortage if tech giants seriously set out to exploit AI chatbots.

Though these technologies can use GPUs from other OEMs, Nvidia's are preferred for their high performance and CUDA support, which popular deep learning frameworks like TensorFlow and PyTorch are built around and heavily optimized for when handling intensive operations.
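As a concrete illustration of that CUDA integration, here is a minimal PyTorch sketch that detects an Nvidia GPU and runs a large matrix multiply on it, falling back to the CPU on machines without CUDA. The sizes are arbitrary and chosen only for demonstration.

```python
import torch

# Use an Nvidia GPU via CUDA when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")

# A large matrix multiply, the kind of operation that dominates model training,
# is dispatched to Nvidia's optimized CUDA kernels when the device is "cuda".
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b
print(c.shape)
```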
