Qualcomm Partners with Meta to Launch On-Device AI Applications Using Llama 2

Image credit: Qualcomm
Overview
In a move that promises to reshape artificial intelligence (AI) applications, Qualcomm Technologies Inc., in collaboration with Meta, has announced plans to optimize the execution of Meta's Llama 2 large language model directly on devices, removing the need to rely solely on cloud services.
This initiative offers developers an economical alternative to cloud-based AI services while improving user privacy and application reliability and enabling personalization. The on-device AI experience, powered by Qualcomm's Snapdragon® platforms, is also expected to work offline, such as in areas with no connectivity or in airplane mode.
Durga Malladi, senior vice president and general manager of technology, planning, and edge solutions businesses at Qualcomm Technologies, highlighted that for generative AI to scale effectively into the mainstream, AI needs to run both in the cloud and on devices at the edge. He applauded Meta's approach to open and responsible AI and underlined Qualcomm's commitment to lowering barriers for developers by bringing generative AI on-device.
As an industry leader in on-device AI, Qualcomm Technologies is uniquely positioned to support the Llama ecosystem. With an unmatched footprint at the edge, spanning smartphones, vehicles, XR headsets, PCs, IoT devices, and more, Qualcomm leverages its industry-leading AI hardware and software solutions to enable generative AI to scale.
Starting in 2024, Qualcomm Technologies plans to make Llama 2-based AI implementations available on Snapdragon-powered devices. Developers can start optimizing applications for on-device AI today using the Qualcomm® AI Stack, a dedicated set of tools that make AI processing on Snapdragon more efficient, enabling on-device AI even in compact devices.
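The announcement does not include code, and the Qualcomm AI Stack tooling itself is not shown here. As a rough illustration of the kind of on-device workflow the partnership targets, the sketch below runs a locally stored, quantized Llama 2 chat model entirely offline using the open-source llama-cpp-python bindings; the model file path and parameter values are assumptions for the example, not part of Qualcomm's or Meta's tooling.

# Illustrative sketch only: on-device inference with a quantized Llama 2
# chat model via the open-source llama-cpp-python bindings. This is NOT
# the Qualcomm AI Stack; the model path and settings are hypothetical.
from llama_cpp import Llama

# Load a locally stored, quantized Llama 2 7B chat model (GGUF file
# downloaded in advance; path is a placeholder).
llm = Llama(
    model_path="./llama-2-7b-chat.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=2048,    # context window
    n_threads=4,   # CPU threads available on the device
)

# Generate a response with no cloud round trip, so it also works offline.
output = llm(
    "Q: What are the benefits of running AI on-device? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(output["choices"][0]["text"])

In this setup, quantization shrinks the model enough to fit in the memory of a phone- or laptop-class device, which is the same basic trade-off that makes on-device execution of large language models practical.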
Key Takeaways:
Qualcomm Technologies is working with Meta to optimize the execution of Meta's Llama 2 large language model directly on devices.
This move enables developers to save on cloud costs and provides users with private, more reliable, and personalized experiences.
Qualcomm plans to make Llama 2-based AI implementations available on Snapdragon-powered devices starting in 2024.
Developers can start optimizing applications for on-device AI today using the Qualcomm® AI Stack.
The on-device AI experience can work even in offline scenarios, demonstrating a significant shift in AI implementation.