Grok 3 launch confirmed as 10 times more powerful than previous model

Elon Musk and the xAI team launching Grok 3
xAI

Elon Musk and the xAI team announced the Grok 3 AI model in an evening live stream on Monday.

The team described the new model as “an order of magnitude more capable” than Grok 2, putting Grok 3 at roughly 10 to 15 times the power of its predecessor. They also claimed that Grok 3 outperforms competing AI models such as DeepSeek and Google Gemini.

The xAI team said it has been refining the Grok 3 AI model over the past several months, noting that it will be very, very funny, and pointing out that only 17 months have passed since the launch of the Grok 1 model.

The team has tested Grok 3 against a number of academic benchmarks, including the American Invitational Mathematics Examination (AIME), on which the model performed well.

In demos, the xAI team showed Grok 3 handling tasks such as computing a spacecraft mission from Earth to Mars and back, and creating a game that mixes Tetris and Bejeweled. The team also mentioned that Grok 3 includes a feature called Big Brain, a reasoning mode that allows for deeper thinking when processing queries.

Musk noted that 17 months ago, the original Grok model could barely solve high school problems, whereas now, given how much it has advanced, “Grok is ready to go to college.”

In addition to Grok 3, the team mentioned that it is working on an AI gaming studio, which will be a consumer-facing service. It is also developing a Deep Search feature for Grok, which will be xAI’s take on AI agents.

Grok 3 will be available as of Monday for Premium+ subscribers. The team is also rolling out a new subscription product called SuperGrok, which will offer greater access to the AI model along with additional features.

The xAI team noted that Grok 3 should be treated as a beta version and that users should be mindful of errors in its responses.

In a post-launch Q&A, the team confirmed that Grok 3 can process audio into text. They also confirmed that Grok 2 will be made open source once Grok 3 is “mature and stable.”
