They’re smart AF, and Nvidia’s monstrous BFGD monitors are a BFD

Nvidia BFGD (Matt Smith/Digital Trends)
What if the TV tuner was removed from your television? Technically, it would no longer be a TV. But would you notice? Even if you did notice, would you care?

Probably not. Millions of people have cut the cord from their cable service, instead relying on Netflix, Hulu, and Amazon Prime Video for entertainment. Game consoles have also stepped into the void left by old-fashioned, scheduled programming. There are more gamers than ever, playing longer than ever.

In short, the way people use televisions has changed. Maybe it’s time for the TV to change, too. Nvidia’s Big Format G-Sync Displays (BFGD), which debuted on the show floor at CES, show one possible future for the TV: a future focused on gaming, streaming video, and smooth delivery of any content thrown at it.

Just don’t call it a monitor

It would be tempting to dismiss the BFGDs as 65-inch monitors. They’re designed to connect over DisplayPort 1.4, instead of HDMI (though HDMI is present for audio), and the early marketing positions them as the ultimate accessory for a PC-gaming den.

That sells the BFGDs short. Yeah, they’re targeting the PC, but they also have an Nvidia Shield built in. The Shield, if you’re not familiar, is a cross between a Roku and a bare-bones Android game console. It can handle all the online streaming apps you’d expect from an entertainment box, as well as play games – both Android titles and games available through Nvidia’s GeForce Now subscription streaming service.

Think of it as a smart TV without a TV tuner. A very smart TV. It’s not embroiled in any stupid competition between streaming services’ corporate overlords. It can play popular games without any additional hardware. And it’ll receive all the same updates as the Shield console, which should mean a steady stream of new features over the years.

A different approach to image quality

The smart features that’ll come bundled in every BFGD are far more modern than the hodgepodge interfaces that ship with many televisions, but that’s less than half of what makes them great. The real secret sauce can be found in the BFGDs’ radically different approach to image quality.

A typical, top-tier television from LG, Samsung, or Vizio is built to deliver maximum visual punch. It seeks to maximize contrast, serve a wide color gamut, and minimize artifacts. The results are undeniably spectacular, but there’s a downside. Modern televisions have high latency and confusing image quality settings, and can suffer unusual frame pacing problems when they’re not fed ideal content.

BFGDs are different. They do offer HDR and 4K resolution and, according to Nvidia, are built with a panel that uses a technique similar to Samsung’s Quantum Dots. Yet they’re also fast and fluid. Every BFGD will offer at least a 120Hz refresh rate. Latency numbers aren’t being quoted yet, but Nvidia told us that even 16 milliseconds would be considered “really quite high.” LG and Samsung’s best displays can’t dip below 20 milliseconds, even when switched to game mode.
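To put those latency figures in context, a quick back-of-the-envelope calculation helps: a display’s refresh interval is 1000 divided by its refresh rate in Hz, so input lag can be expressed as a number of refreshes. This sketch uses only the numbers quoted above; it is illustrative arithmetic, not measured BFGD behavior.

```python
# Illustrative arithmetic only -- the latency figures come from the
# article's quotes, not from measurements.

def refresh_interval_ms(hz):
    """Duration of one refresh at a given refresh rate, in milliseconds."""
    return 1000.0 / hz

def lag_in_refreshes(lag_ms, hz):
    """How many refresh intervals a given input lag represents."""
    return lag_ms / refresh_interval_ms(hz)

print(refresh_interval_ms(120))    # ~8.33 ms per refresh at 120Hz
print(lag_in_refreshes(16, 120))   # 16 ms is nearly two full 120Hz refreshes
print(lag_in_refreshes(20, 60))    # a 20 ms game-mode TV lags 1.2 frames at 60Hz
```

Seen this way, 16 milliseconds really is “quite high” for a 120Hz panel: it amounts to almost two whole refreshes of delay.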

Then there’s Nvidia’s not-so-secret weapon: G-Sync. It synchronizes the refresh rate of a BFGD with the input frame rate of whatever G-Sync-capable device it’s connected to, including the built-in Shield. That synchronization can occur with any content, including video. It doesn’t matter if a video was shot at 24, 29.97, 60, or 120 frames per second – it will always display smoothly, without any added stutter or lag caused by the display.
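Why does matching refresh rate to frame rate matter? On a fixed-refresh display, each content frame stays on screen until the next scheduled refresh, so 24fps video on a 60Hz panel alternates between frames shown for three refreshes and frames shown for two (the classic 3:2 pulldown judder). With variable refresh, the panel refreshes when a frame arrives, so every frame is on screen for the same amount of time. This is a minimal model of that difference, not Nvidia’s implementation:

```python
# Minimal sketch of fixed-refresh pulldown vs. variable refresh.
# A simplified model for illustration, not G-Sync's actual mechanism.
from fractions import Fraction
import math

def fixed_refresh_durations(content_fps, refresh_hz, frames):
    """On-screen time of each content frame on a fixed-refresh display:
    a frame appears at the first refresh at or after its ideal time."""
    tick = Fraction(1, refresh_hz)
    durations, shown_at = [], Fraction(0)
    for i in range(1, frames + 1):
        ideal = Fraction(i, content_fps)            # when frame i should appear
        next_shown = math.ceil(ideal / tick) * tick  # next available refresh
        durations.append(next_shown - shown_at)
        shown_at = next_shown
    return durations

def vrr_durations(content_fps, frames):
    """With variable refresh, every frame is on screen for exactly 1/fps."""
    return [1.0 / content_fps] * frames

fixed = fixed_refresh_durations(24, 60, 6)
print([round(float(d) * 1000, 1) for d in fixed])        # [50.0, 33.3, 50.0, 33.3, 50.0, 33.3]
print([round(d * 1000, 1) for d in vrr_durations(24, 6)])  # [41.7, 41.7, 41.7, 41.7, 41.7, 41.7]
```

The fixed-refresh case alternates between 50ms and 33.3ms per frame, which the eye perceives as judder; the variable-refresh case holds every frame for a uniform 41.7ms.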

BFGDs could be a BFD

Acer, Asus, and HP are lined up to build the first BFGDs, all of them 65-inchers using the same panel. I doubt they’ll sell anywhere near the volume of modern televisions. At least, not at first. But if Nvidia and its partners can deliver on the BFGD’s promise, it won’t just be PC gamers who take notice.

You can expect to see the first BFGDs in the second half of 2018. Pricing hasn’t been announced.

Matthew S. Smith
Matthew S. Smith is the former Lead Editor, Reviews at Digital Trends. He previously guided the Products Team, which dives…