Qualcomm has already shared its plans for bringing on-device AI to smartphones, and Meta is now joining the chipmaker in turning those plans into reality. The companies have announced that Meta’s Llama 2 AI model will enable on-device AI capabilities on smartphones powered by Qualcomm chips.
Qualcomm has revealed that it will bring on-device AI capabilities, powered by Meta’s Llama 2 large language model (LLM), to 2024’s flagship phones and PCs. Notably, on-device AI means that users will be able to take advantage of a variety of AI use cases without needing an internet connection. These use cases include smart virtual assistants, productivity applications, content creation tools, entertainment, and more.
The US-based chipmaker says that developers can already start optimizing applications for on-device AI using the Qualcomm AI Stack, a “dedicated set of tools that allow to process AI more efficiently on Snapdragon, making on-device AI possible even in small, thin, and light devices”.
Meta and Microsoft also strengthen their partnership
Apart from this, Meta and Microsoft are also partnering: Meta’s Llama 2 is now available in the Azure AI model catalog, enabling developers using Microsoft Azure to build with it and leverage Azure’s cloud-native tools for content filtering and safety features. Llama 2 has also been optimized to run locally on Windows.
With Meta’s Llama 2 going open source, businesses, startups, entrepreneurs, and researchers now have access to more tools, which opens up “opportunities for them to experiment, innovate in exciting ways, and ultimately benefit from economically and socially”. Meta says that user feedback will help make the AI model safer and better to use.