
One of ChatGPT’s latest features comes to the free tier

ChatGPT's Canvas screen (Image: OpenAI)

In October, OpenAI debuted its Canvas feature, a collaborative interface that visually previews the AI response to the user’s writing or coding request. However, it was only made available as a beta feature for Plus and Teams subscribers. On Tuesday, the company announced that it is bringing Canvas to all users, even at the free tier.

While one could easily mistake Canvas for a blatant knockoff of Anthropic’s Artifacts feature, OpenAI is also building a swath of new capabilities into Canvas. For one, Canvas is now integrated directly into the GPT-4o model, so it runs natively within ChatGPT, eliminating the need to select it from the model picker.


The platform can now also run Python code within the Canvas interface, allowing ChatGPT to analyze the code and offer suggestions for improving it. And it’s not just Python: users can paste prose or code directly into the Canvas interface, rather than entering it through the standard chat interface and then loading it into the Canvas window.

The newly integrated Canvas is now also compatible with OpenAI’s custom GPTs, which should make them easier to design and build without deep programming knowledge. And if you make a mistake, the Show Changes feature will highlight which bits of prose or code were most recently altered.

Canvas is available on the web and through the Windows desktop app. “We plan to continue making improvements and launching new features available in Canvas in the near term,” the company wrote in its announcement.

This news comes amid OpenAI’s inaugural “12 Days of OpenAI” marketing effort. To date, the company has (finally) debuted its Sora video generation system, which can generate 20-second clips in resolutions up to 1080p, as well as released the full version of its o1 reasoning model family and introduced a $200-per-month Pro subscription tier needed to access them.

Andrew Tarantola
ChatGPT just got a bump to its coding powers
ChatGPT collaborating with Notion

For its penultimate 12 Days of OpenAI announcement on Thursday, the company revealed a trio of updates to ChatGPT's app integrations, which should make using the AI in conjunction with other programs on your desktop less of a chore.

OpenAI unveiled ChatGPT's ability to collaborate with select developer-focused macOS apps (specifically VS Code, Xcode, TextEdit, Terminal, and iTerm2) back in November. Rather than requiring you to copy and paste code into ChatGPT, this feature lets the chatbot pull specified content from the coding app as you enter your text prompt. ChatGPT, however, cannot write code directly into the app the way Cursor or GitHub Copilot can.

Yes, it’s real: ChatGPT has its own 800 number
1-800-chatgpt

On the 10th day of its "12 Days of OpenAI" media event, the company announced that it has set up an 800 number (1-800-ChatGPT, of course) where anyone in the U.S. with a phone line can dial in and speak with the AI via Advanced Voice Mode. Because why not.

“[The goal of] OpenAI is to make artificial general intelligence beneficial to all of humanity, and part of that is making it as accessible as possible to as many people as we can,” the company's chief product officer, Kevin Weil, said during the Wednesday live stream. “Today, we’re taking the next step and bringing ChatGPT to your telephone.”

OpenAI opens up developer access to the full o1 reasoning model
The OpenAI o1 logo

On the ninth day of OpenAI's holiday press blitz, the company announced that it is releasing the full version of its o1 reasoning model to select developers through the company's API. Until Tuesday's news, devs could only access the less-capable o1-preview model.

According to the company, the full o1 model will begin rolling out to folks in OpenAI's "Tier 5" developer category: users who have held an account for more than a month and have spent at least $1,000 with the company. The new service is especially pricey (on account of the added compute resources o1 requires), costing $15 for every (roughly) 750,000 words analyzed and $60 for every (roughly) 750,000 words generated by the model. That's three to four times the cost of performing the same tasks with GPT-4o.
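
For developers with that level of access, a request to the full model would presumably look much like any other call through OpenAI's Chat Completions interface. The sketch below uses the official openai Python SDK; the "o1" model name and the example prompt are assumptions for illustration, not details spelled out in OpenAI's announcement.

# Minimal sketch: querying the full o1 reasoning model via OpenAI's Python SDK.
# Assumes the OPENAI_API_KEY environment variable is set and the account meets
# the Tier 5 usage requirements described above.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="o1",  # the full reasoning model, replacing the earlier "o1-preview"
    messages=[
        {"role": "user", "content": "Walk through why this sorting routine is O(n^2)."},
    ],
)

print(response.choices[0].message.content)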
