Meta and Qualcomm team up to run big A.I. models on phones

Cristiano Amon, president and CEO of Qualcomm, speaks during the Milken Institute Global Conference on May 2, 2022, in Beverly Hills, Calif.

Patrick T. Fallon | AFP | Getty Images

Qualcomm and Meta will enable the social networking company’s new large language model, Llama 2, to run on Qualcomm chips in phones and PCs starting in 2024, the companies announced Tuesday.

So far, LLMs have primarily run in large server farms on Nvidia graphics processors, given the technology’s vast need for computational power and data. That demand has boosted Nvidia’s stock, which is up more than 220% this year. But the AI boom has largely missed the companies that make leading-edge processors for phones and PCs, like Qualcomm: its stock is up about 10% so far in 2023, trailing the Nasdaq’s gain of 36%.

The announcement suggests that Qualcomm wants to position its processors as well-suited for AI “on the edge,” meaning on a device, rather than “in the cloud.” If large language models can run on phones instead of in large data centers, it could push down the significant cost of running AI models and could lead to better and faster voice assistants and other apps.

Qualcomm will make Meta’s open-source Llama 2 models available on Qualcomm devices, which it believes will enable applications like intelligent virtual assistants. Meta’s Llama 2 can do many of the same things as ChatGPT, but it can be packaged in a smaller program, which allows it to run on a phone.

Qualcomm’s chips include a “tensor processor unit,” or TPU, that is well-suited for the kinds of calculations that AI models require. However, the amount of processing power available on a mobile device pales in comparison to a data center stocked with cutting-edge GPUs.

Meta’s Llama is notable because Meta published its “weights,” the set of numbers that governs how a particular AI model works. Publishing them allows researchers, and eventually commercial enterprises, to use the models on their own computers without asking permission or paying. Other notable LLMs, like OpenAI’s GPT-4 and Google’s Bard, are closed-source, and their weights are closely held secrets.

Qualcomm has worked closely with Meta in the past, notably on chips for its Quest virtual reality devices. It has also demoed some AI models running slowly on its chips, such as the open-source image generator Stable Diffusion.
