
What is GPU mining?

At the onset of cryptocurrency mining, CPUs were the primary way to conduct mining operations. Today, unless you’re using application-specific integrated circuits (ASICs) to mine a currency, the next best thing is GPU mining, and for many cryptocurrencies, it’s the dominant form of mining.

Here’s a breakdown of what GPU mining is and how it has started to cause real problems for the hardware industries it draws from.


What is mining, and how do GPUs accomplish it?


Mining is essentially running software to solve complex mathematical problems in order to verify transactions on a cryptographic blockchain. Once a miner completes the math problem, the reward is a portion of the cryptocurrency associated with the mining activity. These verified transactions are the backbone of how a decentralized cryptocurrency is able to function as a legitimate currency.


Cryptocurrency miners use their equipment to find a specific number called a “nonce,” or “number only used once.” The nonce is combined with the block’s data and run through a hash function (such as SHA-256), and the resulting hash has to fall below a target value set by the network. Since the miner can’t know the winning nonce in advance, huge numbers of candidate nonces have to be tried in parallel to find the right one quickly. This is where GPUs come into play.
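
Before getting to GPUs, it helps to see what a single nonce search looks like in code. Here’s a minimal, purely illustrative sketch in Python: the find_nonce function, the "example-block-header" payload, and the difficulty (a count of leading zero hex digits) are all made up for this example. Real Bitcoin mining double-hashes an 80-byte binary block header and compares the result against a 256-bit numeric target.

```python
import hashlib

def find_nonce(block_data: str, difficulty: int) -> tuple[int, str]:
    """Try nonces one by one until the SHA-256 digest starts with
    `difficulty` zero hex characters (a stand-in for the real target)."""
    target_prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
        nonce += 1

# Made-up block payload and an easy difficulty, so this finishes quickly.
nonce, digest = find_nonce("example-block-header", 4)
print(f"nonce={nonce} hash={digest}")
```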

GPUs are specifically designed to render 3D graphics and shapes. This requires complex mathematical calculations that need to be done in parallel. For example, if you’re playing Call of Duty: Warzone, your computer or console’s GPU has to render not only the entire game world (including shading, lighting, and shadows), but also the character models, guns, bullets, and game physics, all at the same time. This requires the ability to compute massive amounts of calculations at once, and it’s something graphics cards are very good at.

The same applies to GPU mining, where the many thousands of parallel processing cores in modern GPUs make them great at brute forcing the complex math problems needed to calculate mining hashes and ultimately earn their miners some cryptocurrency.
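
A GPU attacks the nonce search by letting thousands of cores each test a different candidate at the same time. As a rough CPU-bound analogy (not how real mining software works), the sketch below splits the nonce space into disjoint ranges and searches them in parallel with Python’s ProcessPoolExecutor; the worker function and range sizes are invented for illustration.

```python
import hashlib
from concurrent.futures import ProcessPoolExecutor

def search_range(args):
    """Scan one contiguous slice of the nonce space; return the first hit, if any."""
    block_data, start, stop, difficulty = args
    prefix = "0" * difficulty
    for nonce in range(start, stop):
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
    return None

if __name__ == "__main__":
    chunk = 100_000  # each worker gets its own disjoint range of nonces
    jobs = [("example-block-header", i * chunk, (i + 1) * chunk, 4)
            for i in range(8)]
    with ProcessPoolExecutor() as pool:
        for hit in pool.map(search_range, jobs):
            if hit:
                print("found:", hit)
                break
```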

GPU mining — beyond Bitcoin

A cryptocurrency mining rig built from graphics cards. Getty Images

While you can technically still mine Bitcoin with GPUs, application-specific integrated circuits (ASICs) and field-programmable gate arrays (FPGAs) can be purpose-built to compute specific hash algorithms, making them far more efficient and capable than GPUs at Bitcoin mining. Buying enough GPUs to match the hash rate of a single ASIC would be cost prohibitive, given the tremendous hardware and power costs involved.

An Nvidia RTX 3090 Founders Edition has a hash rate of around 150 million hashes per second (150MH/s) and costs anywhere from $1,500 to $3,000. A popular ASIC like the Bitmain Antminer T17+ originally cost a little over $800 and has a hash rate of 64 trillion hashes per second (64TH/s). It does consume a lot more power than an RTX 3090, around 10 times as much, but with its much greater hash rate, it is a far superior piece of hardware for mining Bitcoin.
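
Taking those figures at face value (they come from different workloads, so this is only a scale illustration), a quick back-of-the-envelope calculation shows why matching one ASIC with GPUs is impractical:

```python
gpu_hash_rate = 150e6    # RTX 3090: ~150 million hashes per second, as quoted above
asic_hash_rate = 64e12   # Antminer T17+: ~64 trillion hashes per second

gpus_needed = asic_hash_rate / gpu_hash_rate
print(f"GPUs needed to match one ASIC: {gpus_needed:,.0f}")  # roughly 427,000
```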

That said, other cryptocurrencies can still be profitable for GPU miners, including Monero and Dogecoin. A good way to determine which cryptocurrency to mine is to use a mining calculator. These typically let you input the cryptocurrency you want to mine, your mining hardware or its hash rate, and your power consumption and electricity cost. From there, you can see roughly how much you stand to make.
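
If you’re curious what such a calculator does under the hood, here’s a simplified sketch. The function name, parameters, and example numbers are all hypothetical; real calculators also account for pool fees, difficulty adjustments, and price swings.

```python
def daily_profit(hash_rate_hs, power_watts,
                 network_hash_rate_hs, block_reward_coins,
                 blocks_per_day, coin_price_usd, electricity_usd_per_kwh):
    """Rough expected profit: your share of the network's daily block
    rewards, minus your electricity bill for the day."""
    share = hash_rate_hs / network_hash_rate_hs
    revenue = share * block_reward_coins * blocks_per_day * coin_price_usd
    power_cost = (power_watts / 1000) * 24 * electricity_usd_per_kwh
    return revenue - power_cost

# Illustrative, made-up numbers only; plug in live network stats for a real estimate.
print(daily_profit(150e6, 300, 900e12, 2.0, 6500, 2000, 0.12))
```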

One of the more popular ways to mine solo is through software like NiceHash, which lets you rent your mining hardware out to others who use it to mine altcoins, with the proceeds paid to you in Bitcoin. Setting up NiceHash on your computer is pretty simple, making it a good fit for anyone who doesn’t want to mess with a command line.

That said, it may be worth either joining a mining pool or using cloud mining. A mining pool combines the hardware of multiple miners to increase their collective computational power, with rewards split in proportion to the work each member contributes. Cloud mining may be a decent option for those who don’t have the money to invest in their own hardware (especially these days). With cloud mining, you rent mining hardware from companies like HashFlare to do the heavy lifting for you, and you’re paid out in Bitcoin.
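
To illustrate how a pool divides its earnings, here’s a small hypothetical sketch: each miner’s payout is proportional to the shares (valid partial solutions) they submitted during the round, minus a pool fee. The function name, fee, and numbers are invented for the example; actual pools use payout schemes such as PPS or PPLNS that differ in the details.

```python
def split_reward(block_reward, shares, pool_fee=0.01):
    """Split one block's reward among miners in proportion to submitted shares."""
    total_shares = sum(shares.values())
    payable = block_reward * (1 - pool_fee)
    return {miner: payable * count / total_shares
            for miner, count in shares.items()}

# Hypothetical round: three miners contributed shares toward a 6.25-coin block.
print(split_reward(6.25, {"alice": 700, "bob": 250, "carol": 50}))
```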

The GPU shortage continues

There was a massive crypto boom starting around December 2020, when the prices of Bitcoin and Ethereum jumped dramatically. This led crypto miners to quickly snap up graphics cards, accounting for an estimated 25% of all GPU sales. Unfortunately, that buying spree has contributed to the current GPU shortage and led Nvidia to limit the hash rate of newer 30-series cards.

Despite the massive drawbacks to mining Bitcoin and Litecoin on graphics cards, the upside is that graphics cards can still be sold on the aftermarket because of their general-purpose usefulness. ASICs, by definition, cannot be used outside of their intended purpose, so they lose much of their value if the specific algorithm they’re built for falls out of use.

Additionally, a crash in crypto prices reduces the profitability of mining, which in turn cuts demand for new graphics cards and triggers a sell-off of mining GPUs that are no longer profitable. That can flood the second-hand market with used cards and crash prices there as well.
