India, Oct. 28 -- Qualcomm Technologies has announced its entry into the high-performance data center AI market with the introduction of the Qualcomm AI200 and Qualcomm AI250 accelerator cards and racks. These new solutions focus on generative AI inference, specifically targeting Large Language Models (LLMs) and Large Multimodal Models (LMMs), with an emphasis on achieving low total cost of ownership (TCO) for enterprise customers.

The Qualcomm AI200 is a purpose-built rack-level solution designed for optimized performance and cost. It offers a substantial 768 GB of LPDDR memory per card, a high capacity crucial for hosting large AI models. This memory scale gives data centers greater flexibility for inference workloads.

A major technical...