Groq, a startup developing chips to accelerate generative AI models, has secured $640 million in a new funding round led by BlackRock. Other investors include Neuberger Berman, Type One Ventures, Cisco, KDDI, and Samsung Catalyst Fund.
This latest round brings Groq’s total capital raised to over $1 billion and values the company at $2.8 billion. Groq originally aimed to raise $300 million at a $2.5 billion valuation, and ended up comfortably exceeding both targets. The new valuation more than doubles the roughly $1 billion figure set in April 2021, when the company raised $300 million in a round led by Tiger Global Management and D1 Capital Partners.
In a notable move, Meta’s chief AI scientist, Yann LeCun, has been appointed as a technical advisor to Groq, a striking hire given Meta’s own investments in AI chips. Additionally, Stuart Pann, former head of Intel’s foundry business and ex-CIO at HP, will join Groq as Chief Operating Officer.
Founded in 2016, Groq emerged from stealth mode with a mission to rethink AI processing. The company has developed a Language Processing Unit (LPU) inference engine, which it claims can run generative AI models comparable to OpenAI’s ChatGPT and GPT-4 at ten times the speed and one-tenth the energy consumption of conventional processors.
At the helm of Groq is CEO Jonathan Ross, known for his role in inventing Google’s tensor processing unit (TPU), an AI accelerator chip. Ross co-founded Groq with Douglas Wightman, an entrepreneur and former engineer at Alphabet’s X moonshot lab, nearly a decade ago.
Groq’s flagship product, GroqCloud, is a developer platform powered by its LPUs. The platform serves “open” models such as Meta’s Llama 3.1 family, Google’s Gemma, OpenAI’s Whisper, and Mistral’s Mixtral, and provides an API that lets customers run those models on Groq’s chips in cloud instances. Groq also launched GroqChat, an AI-powered chatbot playground, late last year. As of July, GroqCloud counted over 356,000 developers, and a significant portion of the new funding is earmarked to scale capacity and add new models and features.
“Many of these developers are at large enterprises,” notes COO Stuart Pann. “By our estimates, over 75% of the Fortune 100 are represented.”
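For developers, GroqCloud’s API follows the familiar chat-completions pattern. The following minimal sketch is not from Groq’s documentation; it assumes the `groq` Python client package is installed, that a `GROQ_API_KEY` environment variable is set, and that a Llama 3.1 model ID such as "llama-3.1-8b-instant" is available on the platform. Treat the package usage and model name as assumptions rather than a definitive reference.

```python
# Minimal sketch of calling GroqCloud's chat completions API via its Python SDK.
# Assumes: `pip install groq`, GROQ_API_KEY set in the environment, and the
# model ID "llama-3.1-8b-instant" (an assumed example identifier).
import os

from groq import Groq

client = Groq(api_key=os.environ["GROQ_API_KEY"])

response = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # assumed example model ID
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize what an LPU is in one sentence."},
    ],
    temperature=0.2,
    max_tokens=128,
)

print(response.choices[0].message.content)
```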
Despite the booming generative AI market, Groq faces fierce competition from both emerging AI chip startups and industry giant Nvidia, which controls an estimated 70% to 95% of the market for chips used to train and deploy generative AI models. To maintain its lead, Nvidia has committed to releasing a new AI chip architecture annually and is reportedly building a new business unit focused on custom chip designs for cloud computing firms and other sectors.
Groq’s competition extends beyond Nvidia, facing rivals like Amazon, Google, and Microsoft, all of which offer custom AI chips for cloud workloads. Amazon’s AWS features the Trainium, Inferentia, and Graviton processors; Google Cloud offers TPUs and the upcoming Axion chip; and Microsoft recently launched Azure instances previewing its Cobalt 100 CPU, with Maia 100 AI Accelerator instances expected soon.
Additionally, Groq competes with Arm, Intel, AMD, and a growing number of startups in a rapidly expanding AI chip market projected to reach $400 billion in annual sales within five years. Notable competitors include D-Matrix, which raised $110 million for its inference compute platform, and Etched, which emerged from stealth with $120 million for a custom processor. SoftBank’s Masayoshi Son is reportedly seeking $100 billion for a chip venture, while OpenAI is also exploring its own AI chip-making initiative.
To carve out its niche, Groq is heavily investing in enterprise and government outreach. In March, Groq acquired Definitive Intelligence, a Palo Alto-based firm offering business-oriented AI solutions, forming a new business unit called Groq Systems. This unit focuses on serving organizations, including U.S. government agencies and sovereign nations, looking to integrate Groq’s chips into existing data centers or build new ones.
Groq recently partnered with government IT contractor Carahsoft to sell its solutions to public sector clients and has a letter of intent to install tens of thousands of LPUs at Earth Wind & Power’s Norway data center. Additionally, Groq is collaborating with Saudi Arabian consulting firm Aramco Digital to install LPUs in future Middle Eastern data centers.
As Groq strengthens customer relationships, the Mountain View, California-based company is also advancing its chip technology. Last August, Groq announced a partnership with Samsung’s foundry business to manufacture 4nm LPUs, expected to deliver significant performance and efficiency gains over its first-generation 14nm chips. Groq aims to deploy over 108,000 LPUs by the end of Q1 2025.
With fresh capital, strategic partnerships, and a next-generation chip on the way, Groq is positioning itself to challenge Nvidia and other industry giants in the rapidly evolving AI chip market.