
Gigabyte is the latest to shoot its GTX 950 with a shrink-ray

If you hadn’t guessed from the influx of micro PCs, Intel NUCs, and ever-shrinking single-PCB systems over the past few years, small computing is in. The same goes for graphics cards, with AMD slimming down its Fury lineup of GPUs and Nvidia doing something similar with the GTX 950, ditching the power connector and making the cards more efficient.

Although a few companies have beaten it to the punch (Asus, MSI, and EVGA), Gigabyte is now jumping aboard with a GTX 950 of its own, sans 6-pin power connector. The card is also about half the length of a traditional GTX 950, and while it retains a dual-slot profile, it uses a shortened cooler with a single fan.


Gigabyte is looking to differentiate its card by featuring ridged blades on its 90mm fan, which purportedly increase airflow by 23 percent without increasing noise levels.



Inside, the card is quite typical. It features the GM206 GPU paired with 2GB of GDDR5 memory. Component quality is reportedly high, with low-resistance MOSFETs, ferrite-core chokes, and solid capacitors, all of which should help ensure a long lifespan for the card. It also ships with a few clock-speed presets, so you can give its core and memory a little boost if you like.

The modified GTX 950 also supports Gigabyte’s OC Guru II overclocking software, so you can tune clocks manually and likely push performance a bit beyond the factory presets. Doing so may mean higher power draw and a faster-spinning (read: louder) fan, but those are the usual trade-offs for extra performance.

Connectivity-wise, buyers can expect a pair of DVI ports, a single HDMI port, and a single DisplayPort connector (as per TechReport). No pricing information has been released yet.

Although far from a standout, Gigabyte’s low-power GTX 950 is a nice-looking package. Would you consider it for an entry-level gaming system?

Jon Martindale