SambaNova Unveils Fastest Agentic AI Chip, Partners with Intel, and Raises $350M

SambaNova Unveils Fastest Agentic AI Chip

SambaNova Systems has introduced its SN50 chip to power the next wave of agentic AI applications. The new chip operates at five times the speed of the best competing chips and can handle models with more than 10 trillion parameters. Alongside the launch, the company raised $350 million in strategic Series E funding, led by Vista Equity Partners and Intel Capital. SambaNova and Intel have also formed a multi-year partnership for cloud-scale inference, pairing Intel Xeon infrastructure with SambaNova's platform to deliver high-performance AI services worldwide. The companies say the combination can cut customers' operating expenses to nearly one-third.

“AI is no longer a contest to build the biggest model,” said Rodrigo Liang, co‑founder and CEO of SambaNova. “With the SN50 and our deep collaboration with Intel, the real race is about who can light up entire data centers with AI agents that answer instantly, never stall, and do it at a cost that turns AI from an experiment into the most profitable engine in the cloud.”

“Customers are asking for more choice and more efficient ways to scale AI,” said Kevork Kechichian, EVP, General Manager, Data Center Group, Intel. “By combining Intel’s leadership in compute, networking, and memory with SambaNova’s full-stack AI systems and inference cloud platform, we are delivering a compelling option for organizations looking for GPU alternatives to deploy advanced AI at scale.”

Expanding Partnerships and Real-World Deployment

SoftBank Corp. has also partnered with SambaNova as the first major customer for the new SN50, deploying the chips in its next-generation data centers in Japan. The SN50 offers four times the network bandwidth of its predecessors, enabling autonomous agents to scale smoothly in production settings. In addition, the air-cooled design fits within existing data center power constraints, giving IT teams an efficient path to transition to SambaNova hardware. With the launch, the company aims to capture a share of the multi-billion-dollar inference market and cement its position as a leader among GPU alternatives.

“AI is moving from a software story to an infrastructure story,” said Landon Downs, co-founder and managing partner at Cambium Capital. “SN50 is engineered for the real-world latency and economic requirements that will determine who successfully deploys agentic AI at scale.”

“The new SambaNova SN50 RDU changes the tokenomics of AI inference at scale. By delivering both high performance and high throughput with a chip that uses existing power and is air cooled, SambaNova is changing the game,” said Peter Rutten, Research Vice President, Performance Intensive Computing, at IDC.

“With SN50, we are building an AI inference fabric for Japan that can serve our customers and partners with the speed, resiliency and sovereignty they expect from SoftBank,” said Hironobu Tamba, Vice President and Head of the Data Platform Strategy Division of the Technology Unit at SoftBank Corp. “By standardizing on SN50, we gain the ability to deliver world‑class AI services on our own terms with the performance of the best GPU clusters, but with far better economics and control.”

News Source: Businesswire.com