Qualcomm unveils Cloud AI 100 edge inference chip
Yesterday, Qualcomm held its annual Artificial Intelligence Day (AI Day) conference in San Francisco. There, the company took the wraps off three chips destined for smartphones, tablets and other mobile devices. But those chips were just the appetizer; the highlight was a new product designed for edge computing: the Qualcomm Cloud AI 100.
"We designed this new signal processor specifically for AI inference," said Keith Kressin, senior vice president of product management, adding that the product will enter production in the second half of next year.
The Cloud AI 100 integrates a range of development tools, including compilers, debuggers, analyzers, monitors, chip debuggers, and quantizers. In addition, it supports runtimes including ONNX, Glow and XLA, as well as machine learning frameworks such as Google’s TensorFlow, Facebook’s PyTorch, Keras, MXNet, Baidu’s PaddlePaddle and Microsoft’s Cognitive Toolkit.
Qualcomm estimates that the Cloud AI 100's peak performance is 3 to 50 times that of the Snapdragon 855 and Snapdragon 820, and the company claims its average inference speed is about 10 times that of traditional FPGAs. Measured in TOPS (tera operations per second, a performance metric commonly used for high-performance AI chips), the Cloud AI 100 can go "far beyond" 100 TOPS. By comparison, the fastest configuration of the Snapdragon 855 manages only 7 TOPS.
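To put those claimed figures in perspective, a quick back-of-the-envelope calculation (using only the numbers Qualcomm itself cites, so the "far beyond 100 TOPS" figure is treated here as a lower bound) shows the implied gap between the Cloud AI 100 and the Snapdragon 855:

```python
# Back-of-the-envelope comparison using Qualcomm's own claimed figures.
cloud_ai_100_tops = 100   # stated lower bound ("far beyond" 100 TOPS)
snapdragon_855_tops = 7   # peak AI throughput cited for the phone SoC

# Ratio of claimed peak throughputs: the data-center part is at least
# ~14x faster than the company's flagship mobile chip.
speedup = cloud_ai_100_tops / snapdragon_855_tops
print(f"Claimed minimum speedup: {speedup:.1f}x")  # -> 14.3x
```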
Kressin explained: "An FPGA or GPU can usually handle AI inference more efficiently, because GPUs are parallel while CPUs are serial, and parallel chips are better suited to AI processing. But the GPU was originally created for processing graphics; by designing a chip specifically to accelerate AI, you get a much more significant improvement. Going from a CPU to an FPGA or GPU gives you an order-of-magnitude improvement, and a custom AI accelerator can add another order of magnitude on top of that."
Before Qualcomm entered the cloud computing field, its main competitor Huawei released the Kunpeng 920, which it billed as the industry's highest-performance Arm-architecture processor. In SPECint (a benchmark suite of 12 programs designed to test integer performance), the chip scored more than 930 points, nearly 25% above the industry benchmark, while its power consumption is 30% lower than existing vendors' products.
And Huawei is far from Qualcomm's only competitor.
In January, at the Consumer Electronics Show in Las Vegas, Intel detailed its upcoming Nervana Neural Network Processor for inference (NNP-I). According to reports, the line's AI training performance will reach 10 times that of competitors' graphics cards. Google launched its dedicated inference chip, the Edge TPU, last year, and Alibaba announced in December that it plans to launch its first self-developed AI inference chip in the second half of this year.
In cloud acceleration hardware, Amazon recently launched its own AI inference chip, AWS Inferentia, and Microsoft has shown similar FPGA-based hardware with Project Brainwave. In March of this year, Facebook released Kings Canyon, a server chip for AI inference, and this month Intel announced its Agilex family of chips, optimized for AI and big-data workloads.
But Qualcomm believes the performance advantages of the Cloud AI 100 will give it the upper hand in the deep learning chip market, which is expected to reach $66.3 billion by 2025.
"Many companies are putting networking hardware at the edge, forming content delivery networks that can perform different types of tasks, such as cloud gaming or AI processing. This trend is critical. Because of it, Qualcomm will have a part to play whether the processing happens on the end user's device or in the cloud."
Qualcomm has another potential advantage: ecosystem support.
Last November, Qualcomm pledged $100 million to a startup fund focused on edge AI and on-device AI for autonomous vehicles, robotics, computer vision and the Internet of Things. Last May, it partnered with Microsoft on a vision AI toolkit that works with the AI accelerator embedded in its system-on-chip.
"In terms of market size, inference is spawning a large chip market," Kressin said. "Over time, we expect the market to expand roughly 10-fold between 2018 and 2025. We are very confident Qualcomm will be a strong leader in product performance, AI processing and data processing."