Robert Triggs / Android Authority
TL;DR
- Qualcomm announced the Qualcomm AI Hub for developers.
- It allows developers to quickly implement AI models into their applications.
- The company says these optimized models should also work on non-Snapdragon devices, with some caveats.
AI has been a fixture on smartphones for years, but generative AI is a relatively new phenomenon in this space. Now, Qualcomm has announced the Qualcomm AI Hub to help developers quickly implement AI and generative AI features into their applications.
The company describes it as a central location where app developers can access on-device AI models that have been quantized (i.e., reduced in precision so they run faster and use less memory) and validated by Qualcomm. The chip designer says more than 75 AI models are supported.
To get started, you’ll need to visit aihub.qualcomm.com, select an AI model, and then select a target platform. Qualcomm adds that you can also drill down and select a specific device.
From here, the Qualcomm AI Hub will guide you to the right model. The company says developers can integrate these optimized models into their workflow with “a few lines of code.”
Qualcomm notes that these models cover image segmentation, image generation, image classification, object detection, super resolution, low-light enhancements, text generation, and natural language understanding.
Will this work on non-Snapdragon devices?
What if you want to build an AI-enabled app that runs on a device with a non-Snapdragon chip? We posed this question to Qualcomm during its press briefing, and it turns out there’s good news and bad news.
“These models are actually pre-optimized in terms of the model itself, providing acceleration on generic CPU and GPU, so it will run on other branded chips and you can still enjoy the performance boost,” the company told us during the briefing.
“But to get the best out of these models, we’ve optimized even a level deeper for Qualcomm platforms (sic), with NPU acceleration and some other improvements across the board, these models will work best on Qualcomm platforms.”
In other words, these optimized AI models will run on a phone powered by an Exynos chip, a Google Tensor chip, or a MediaTek SoC. However, on those devices they cannot take advantage of dedicated AI silicon for faster and/or more efficient performance, so developers will still need to put in extra work to get the best results on non-Snapdragon hardware.