Guest post by Cheryl Goodman of FindGood.Tech
Past uses of the term hybrid AI had some folks envisioning a half-robot, half-human mashup. The less dramatic, more forward-looking meaning of hybrid AI is a combination of cloud-based and on-device processing of information. According to Qualcomm, a hybrid AI architecture is crucial for generative AI to scale because it distributes massive AI workloads between the cloud and edge devices. Qualcomm is driving the hybrid AI model, combining online and offline AI capabilities and embracing the inevitable dominance of generative AI. The company is embedding generative AI capabilities into devices at the manufacturing level, reducing reliance on cloud-based processing while saving energy. By incorporating pre-trained parameters into processors and devices, Qualcomm aims to enhance agility, speed, and cost-effectiveness in AI processing. This approach extends beyond smartphones to a host of connected devices, including robotics, computers, connected cars, IoT, and XR devices.
More connected devices + more data traffic + more data center costs = more demand for hybrid AI
The generative AI ecosystem stack is expanding rapidly, and the generative AI market is estimated at $1.3 trillion according to Bloomberg. Startups and established organizations are racing to adopt SDKs to build apps for speech, text, image, video, 3D, and audio using generative AI models. With this exponential growth come costs. Generative AI applications, such as search, are more expensive to process than traditional methods. Hybrid AI can distribute AI processing in different ways depending on factors like model complexity. For less complex tasks, inference can run entirely on the device. More complex tasks can be processed across the cloud and devices. Hybrid AI even enables devices and the cloud to run models concurrently, with devices running lighter versions of a model while the cloud processes multiple tokens of the full model in parallel (see the sketch below). According to Qualcomm's white paper, the potential of hybrid AI increases as generative AI models become smaller and on-device processing capabilities improve. Models with over 1 billion parameters are already running on phones, and models with 10 billion parameters or more are expected to run on devices in the near term.
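To make the idea of distributing work by complexity a little more concrete, here is a minimal sketch of what a hybrid routing policy could look like. This is purely illustrative: the function names (run_on_device, run_hybrid, run_in_cloud, route) and the complexity labels are hypothetical placeholders, not Qualcomm APIs or products.

```python
# Minimal sketch of a hybrid AI routing policy, assuming a simple
# heuristic based on task complexity. All names are hypothetical
# placeholders for on-device and cloud inference back ends.

from dataclasses import dataclass


@dataclass
class Request:
    prompt: str
    complexity: str  # "low", "medium", or "high"


def run_on_device(req: Request) -> str:
    # Placeholder: inference with a small on-device model (e.g. ~1B parameters).
    return f"[device] {req.prompt}"


def run_hybrid(req: Request) -> str:
    # Placeholder: the device drafts a response with a lighter model while
    # the cloud checks it against the full model in parallel.
    draft = f"[device draft] {req.prompt}"
    return f"[cloud verified] {draft}"


def run_in_cloud(req: Request) -> str:
    # Placeholder: full-model inference in the data center.
    return f"[cloud] {req.prompt}"


def route(req: Request) -> str:
    # Less complex tasks stay entirely on the device; moderately complex
    # ones are split across device and cloud; the heaviest go to the cloud.
    if req.complexity == "low":
        return run_on_device(req)
    if req.complexity == "medium":
        return run_hybrid(req)
    return run_in_cloud(req)


if __name__ == "__main__":
    print(route(Request("Summarize my last three emails", "low")))
    print(route(Request("Draft a detailed project plan", "medium")))
    print(route(Request("Generate a 3D scene from this description", "high")))
```

The design choice the sketch highlights is simply that the decision of where inference runs can be made per request, which is what lets lighter workloads avoid the data center entirely.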
Offline Processing, Faster Results
The incorporation of AI tools into processors enables offline processing, reducing the burden on cloud networks and providing users with faster responses. Qualcomm flagship smartphones with Snapdragon 8 Gen 2 processors already incorporate these features, and subsequent generations are expected to support even more parameters. Qualcomm anticipates that manufacturers will adopt the hybrid AI approach, making advanced AI capabilities a significant factor in consumers’ purchasing decisions. To that end, Qualcomm hosted the 1st Annual #DXSummit for system integrators, engineering firms, and manufacturers in May at its San Diego headquarters, showcasing research from Bain Consulting, use cases, and hands-on demos including the Aware Platform.
Cristiano Amon, CEO of Qualcomm, addressed DX Summit attendees, stating, “The Smartphone will remain the center of our universe thanks to core innovation on the device itself – it represents a massive opportunity as the largest development platform in existence – given the massive opportunities at the edge.”
Vishal Shah, General Manager of XR & Metaverse at Lenovo, stated, “#DXsummit did a stellar job of bringing in Digital Transformation experts from a wide variety of industries and specialties like system integrators, engineering firms, and consulting. What was clear from the event that for AI to scale, the implementation will have to be around hybrid cloud and edge platforms, powered by processors that are increasingly on the edge, devices like smartphones and laptops. Add to this, the power of low latency 5G private and public networks, we have the perfect mix for this to be the decade of AI.”
To the layperson, it may appear that Qualcomm is making a hard pivot to support hybrid AI. In fact, the company has maintained a deep legacy of AI “firsts” since 2007. The intelligent-edge computing company has also shipped more than 2 billion AI-enabled products to date. So while the term hybrid AI may be new to the vernacular, the innovation behind the technology is not.
Hybrid AI is inevitable.
As society finds new ways to leverage generative AI, the demand for cloud infrastructure will grow exponentially. Hybrid AI processing is the next major transition in computing, just as we saw the evolution from mainframes to desktops to smartphones.
Jim Cathey, Chief Commercialization Officer, Qualcomm Technologies
Earl Martin Jr. demoing the Aware Platform
Attendees mingling, including Vishal Shah and John Iaia of Lenovo’s XR team, Tony O’Connor of Qualcomm, Anshel Sag of Moor Insights & Strategy, Cheryl Goodman of FindGood.Tech, and Neal Bloom of Fresh Brewed Tech and Interlock Capital.