
ChatGPT agrees: it shouldn’t write the articles you read

AI is taking the media world by storm, and that’s not an exaggeration. This week, it was revealed that CNET and its owner, the venture capital firm Red Ventures, have been using AI to generate content across their various brands.

The story picked up steam, and journalists uncovered that these AI-generated articles were littered with inaccuracies and often incorrectly attributed to human writers. There’s a moral and philosophical question about AI generating content that mimics a human, but I wanted to go to the source and ask the AI what it thought about writing articles. Its response to the ethical dilemma shows that it’s not as confident in the technology as some companies are.


ChatGPT says no


Before getting into the responses, a quick note: all AI-generated content in this article appears in quotes, separated into its own paragraph, with a note that it was generated by ChatGPT. Everything else comes from me.

ChatGPT is the front-runner in the world of AI chatbots right now, so I started by asking it to react to CNET’s use of AI in particular:

“I am an AI model, so I do not have personal opinions. However, I can tell you that using AI to generate articles can be a useful tool for media companies, as it can save time and resources. However, the quality and accuracy of the generated articles will depend on the data and algorithms used. It is important for media companies to have a process in place for fact-checking and editing the content generated by AI to ensure accuracy and integrity.” (Generated by ChatGPT).

That seems straightforward enough, and it mirrors the process that CNET has taken toward its AI-generated articles. Even if the fact-checking policy isn’t up to snuff, CNET still insists that its articles are reviewed and fact-checked by human editors. But when I posed the more general question of whether ChatGPT should be used to write articles, I got a slightly more detailed response:

“ChatGPT and other language models like it can be a useful tool for generating written content, but it should not be used as the sole source for creating articles. The quality and accuracy of the content generated by a language model like ChatGPT will depend on the data and algorithms used, and it is important to have a process in place for fact-checking and editing the content generated by the model to ensure accuracy and integrity.

It is also important to note that while ChatGPT can generate text that is coherent and grammatically correct, it may lack the depth and nuance that human writers bring to their work. It may also not be able to fully understand and interpret complex and nuanced information.

In summary, ChatGPT can be used as a tool to assist in the writing process, but it should not be used to replace human writers. It’s important to have human oversight and editing to ensure the accuracy and integrity of the content generated.” (Generated by ChatGPT).

That sounds fairly self-aware — an admission that ChatGPT itself lacks the nuance and depth to write articles on its own. But I had a feeling there was more missing than just nuance. After all, AI-generated content also lacks context, and that’s what I wanted to test out for myself.

Lacking nuance


In the case of CNET, the AI was being used to generate content about credit cards, a topic that needs a lot of context in 2023. Interest rates are continuing to rise, and job growth is slowing. There’s an ethical dilemma around these finance topics that AI can’t account for and react to. Or, at the very least, AI can’t bring the human empathy to recognize that opening a new line of credit might not be the best idea under certain circumstances.

But I’m not a finance writer, so I might not even spot some of the mishandling of that context. I am, however, an expert in gaming PCs and graphics cards — so in theory, I should be able to spot that missing nuance and context in ChatGPT’s answers. To start off, I asked ChatGPT whether AMD or Nvidia was better, and here’s what I got:

“The choice between an Nvidia or AMD graphics card depends on your specific needs and budget. Both companies offer a range of graphics card options that vary in terms of performance, price, and features.

“Nvidia graphics cards are known for their excellent performance in gaming and professional applications such as video editing, 3D rendering, and AI development. They also have a wide range of features such as ray tracing, which is a technology that provides more realistic lighting and shadows, and DLSS (Deep Learning Super Sampling) which improves the performance in games.

“AMD graphics cards, on the other hand, are known for their excellent value for money. They often offer similar or even better performance than Nvidia cards at a lower price point. They also have a feature called FreeSync which helps to reduce screen tearing and stuttering in games.

“When making your decision, it’s important to consider the specific needs of your application. If gaming is your primary focus, Nvidia graphics cards may be the better choice due to their excellent gaming performance and features. If you’re looking for a more budget-friendly option or want to use the card for other applications such as video editing, AMD graphics cards may be a better choice.

“It’s also important to check the compatibility of the card with your computer’s power supply and motherboard, and the supported resolution and refresh rate of your monitor. It’s always recommended to check the benchmarks, reviews and technical specifications of the models you are considering before making a purchase.” (Generated by ChatGPT).


At first blush, the response doesn’t seem too bad. Nvidia cards are more powerful overall, but AMD cards offer a better value. The devil is in the details here, though. For example, ChatGPT recommends an AMD graphics card for video editing and an Nvidia graphics card solely for gaming. In reality, the opposite is true: Nvidia graphics cards vastly speed up video editing workflows compared to AMD (a fact that ChatGPT even highlighted in the second paragraph of its response).

There are some critical elements missing, too. The response would lead you to believe that Nvidia graphics cards are the only ones that support ray tracing. On the other hand, it fails to mention G-Sync or the VESA Adaptive-Sync standard, leading you to believe that FreeSync is exclusive to AMD graphics cards.

Perhaps that’s the nuance ChatGPT admits it lacks. And sure, you can see the bones of something accurate here and prod the copy into a more usable state. Still, it’s important to remember that none of this comes from a human, and the AI doesn’t care about inaccuracies (or anything else, for that matter). To the AI, being wrong is fine, and I don’t want advice from a source that’s comfortable with that.


To drive this point home, I asked ChatGPT a final question: whether the AMD RX 7900 XT or the Nvidia RTX 4080 was better:

“As of 2021, the RTX 3080 is currently considered a better overall graphics card than the AMD RX 7900 XT. The RTX 3080 offers better performance in most benchmarks and games, and also has support for newer technology such as ray tracing and DLSS. However, the RX 7900 XT is also a very powerful graphics card and may be a better choice for certain specific use cases or scenarios. Ultimately, the best choice for you will depend on your specific needs and budget.” (Generated by ChatGPT).

Bluntly, that’s an asinine response (you can read our human-written best graphics cards roundup for context). ChatGPT’s training data predates both cards, so it swapped in the RTX 3080 and confidently compared it to a GPU it knows nothing about. My point is that it’s not enough for an editor to go in and tweak some of the wording or fact-check a couple of claims. To the average person, the response might sound sensible and well-written, but to someone knowledgeable on the topic, the reasoning and logic behind its answers are entirely unfounded. But then, I’d expect nothing more from AI-generated content.

Jacob Roach
Former Lead Reporter, PC Hardware