
The Best Graphics Cards for 2021

If you're a PC gamer, or a content creator who lives and dies by the speed of your graphics-accelerated software, your video card is the engine that powers what you can do—or how lustily you can brag.

Our guide will help you sort through the best video-card options for your desktop PC, what you need to know to upgrade a system, and how to evaluate whether a particular card is a good buy.

We'll also touch on some upcoming trends—they could affect which card you choose.

After all, consumer video cards range from under $100 to $1,500 and beyond.

It's easy to overpay or underbuy.

(We won't let you do that, though.)


Who's Who in GPUs: AMD vs. Nvidia

First off, what does a graphics card do? And do you really need one?

Look at any given prebuilt desktop PC on the market, and unless it's a gaming-oriented machine, the maker will de-emphasize the graphics card in favor of promoting the CPU, RAM, or storage options.

Indeed, sometimes that's for good reason; a low-cost PC may not have a graphics card at all, relying instead on the graphics-accelerated silicon built into its CPU (an "integrated graphics processor," commonly called an "IGP").

There's nothing inherently wrong with relying on an IGP—most business laptops, inexpensive consumer laptops, and budget-minded desktops have them—but if you're a gamer or a creator, the right graphics card is crucial.

A modern graphics solution, whether it's a discrete video card or an IGP, handles the display of 2D and 3D content, drawing the desktop, and decoding and encoding video content in programs and games.

All of the discrete video cards on the consumer market are built around large graphics processing chips designed by one of two companies: AMD or Nvidia.

These processors are referred to as "GPUs," for "graphics processing units," a term that is also applied, confusingly, to the graphics card itself.

(Nothing about graphics cards...ahem, GPUs...is simple!)


The two companies work up what are known as "reference designs" for their video cards, a standardized version of a card built around a given GPU.

Sometimes these reference-design cards are sold directly by Nvidia (or, less often, by AMD).

Nvidia's own-brand cards are easily spotted by the "Founders Edition" badge. Until the release of Nvidia's latest GeForce RTX 3000 series, that badge didn't mean much more than slightly higher-than-stock clock speeds and sturdy build quality.

Often the Founders Edition cards are the most aesthetically consistent of any cards that come out during the lifetime of a particular GPU.

But their designs tend to be conservative, not as accommodating to aggressive overclocking or modification as some third-party options are.

Sometimes, reference cards are duplicated by third-party card makers (companies referred to in industry lingo as AMD or Nvidia "board partners"), such as Asus, EVGA, MSI, Gigabyte, Sapphire, XFX, and Zotac.

Depending on the graphics chip in question, these board partners may sell their own self-branded versions of the reference card (adhering to the design and specifications set by AMD or Nvidia), or they will fashion their own custom products, with different cooling-fan designs, slight overclocking done from the factory, or features such as LED mood illumination.

Some board partners will do both—that is, sell reference versions of a given GPU, as well as their own, more radical designs.


Who Needs a Discrete GPU?

We mentioned integrated graphics (IGPs) above.

IGPs are capable of meeting the needs of most general users today, with three broad exceptions...

Professional Workstation Users. These folks, who work with CAD software or in video and photo editing, will still benefit greatly from a discrete GPU.

Some of their key applications can transcode video from one format to another, or perform other specialized operations using resources from the GPU instead of (or in addition to) those of the CPU.

Whether this is faster will depend on the application in question, which specific GPU and CPU you own, and other factors.

Productivity-Minded Users With Multiple Displays. People who need a large number of displays can also benefit from a discrete GPU.

Desktop operating systems can drive displays connected to the IGP and discrete GPUs simultaneously.

If you've ever wanted five or six displays hooked up to a single system, you can combine an IGP and a discrete GPU to get there.

That said, you don't necessarily need a high-end graphics card to do that.

If you're simply displaying business applications, multiple browser windows, or lots of static windows across multiple displays (i.e., not demanding PC games), all you need is a card that supports the display specifications, resolutions, monitor interfaces, and number of panels you need.

If you're showing four web browsers across four display panels, a GeForce RTX 3080 card, say, won't confer any greater benefit than a GeForce GTX 1660 with the same supported outputs.

Gamers. And of course, there's the gaming market, for which the GPU is arguably the most important component.

RAM and CPU choices both matter, but if you had to pick between a top-end system circa 2018 paired with a 2021 GPU and a top-end system today using the highest-end GPU you could buy in 2018, you'd want the former.

Graphics cards fall into two distinct classes: consumer cards meant for gaming and light content creation work, and dedicated cards meant for professional workstations and geared toward scientific computing, calculations, and artificial intelligence work.

This guide, and our reviews, will focus on the former, but we'll touch on workstation cards a little bit, later on.

The key sub-brands you need to know across these two fields are Nvidia's GeForce and AMD's Radeon RX (on the consumer side of things), and Nvidia's Titan and Quadro, as well as AMD's Radeon Pro and Radeon Instinct (in the pro workstation field).

Nvidia continues to dominate the very high end of both markets.

For now, though, we'll focus on the consumer cards.

Nvidia's consumer card line in early 2021 is broken into two distinct classes, both united under the long-running GeForce brand: GeForce GTX, and GeForce RTX.

AMD's consumer cards, meanwhile, comprise the Radeon RX and (now fading) Radeon RX Vega families, as well as the end-of-life Radeon VII.

Before we get into the individual lines in detail, though, let's outline a few very important considerations you should make for any video-card purchase.


Target Resolution and Monitor Tech: Your First Considerations

Resolution is the horizontal-by-vertical pixel count at which your video card will drive your monitor.

This has a huge bearing on which card to buy, and how much you need to spend, when looking at a video card from a gaming perspective.

If you are a PC gamer, a big part of what you'll want to consider is the resolution(s) at which a given video card is best suited for gaming.

Nowadays, even low-end cards will display everyday programs at lofty resolutions like 3,840 by 2,160 pixels (a.k.a. 4K).

But for strenuous PC games, those cards will not have nearly the power to drive smooth frame rates at high resolutions like those.

In games, the video card is what calculates positions, geometry, and lighting, and renders the onscreen image in real time.

For that, the higher the in-game detail level and monitor resolution you're running, the more graphics-card muscle is required.

Resolution Is a Key Decision Point

The three most common resolutions at which today's gamers play are 1080p (1,920 by 1,080 pixels), 1440p (2,560 by 1,440 pixels), and 2160p or 4K (3,840 by 2,160 pixels).

Generally speaking, you'll want to choose a card suited for your monitor's native resolution.

(The "native" resolution is the highest supported by the panel, and the one at which the display looks the best.)

You'll also see ultra-wide-screen monitors with in-between resolutions (3,440 by 1,440 pixels is a common one); you can gauge these versus 1080p, 1440p, and 2160p by calculating the raw number of pixels for each (multiply the vertical number by the horizontal one) and seeing where that screen resolution fits in relative to the common ones.
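If you want to run that comparison yourself, the math is just multiplication. Here is a minimal Python sketch using the common resolutions named above (the labels are only for illustration):

# Rough pixel-count comparison for common gaming resolutions,
# using 1080p as the baseline.
resolutions = {
    "1080p (1,920 by 1,080)": (1920, 1080),
    "1440p (2,560 by 1,440)": (2560, 1440),
    "Ultra-wide (3,440 by 1,440)": (3440, 1440),
    "4K (3,840 by 2,160)": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the point of reference
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 1080p)")

Run it and you'll see, for example, that the 3,440-by-1,440 ultra-wide falls between 1440p and 4K in raw pixel load.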

(See our targeted roundups of the best graphics cards for 1080p play and the best graphics cards for 4K gaming.)

Why does this matter? Well, in the case of PC gaming, the power of the components inside your next PC—whether you are buying one, building one, or upgrading—should be distributed in a way that best suits the way you want to play.

Without getting too deep into the weeds, here's how it works: The frame rates you'll see when gaming at 1080p, even at the highest detail levels, are almost always down to some balance of CPU and GPU power, rather than either one being the outright determinant of peak frame rates.

Next is the 1440p resolution, which starts to split the load when you are playing at higher detail levels.

Some games start to ask more of the GPU, while others can still lean on the CPU for the heavy math.

(It depends on how the game has been optimized by the developer.) Then there's 4K resolution, where, in most cases, almost all of the heavy lifting is done by the GPU.

Consider how the most recent releases from Nvidia and AMD handled when tested in 4K resolution...

Look at the GeForce RTX 3080.

While the RTX 3080 posted a result of 93 frames per second (fps) in Far Cry 5 at 4K resolution (just over a 50% improvement over last generation's GeForce RTX 2080 Super at 61fps), its performance at 1080p improved by just 11%.

This is because at the extremes of 1080p gaming, the CPU can be more of a factor than it is at 4K.
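If you want to sanity-check generation-over-generation percentages like that yourself, it's simple ratio math. A quick sketch using the Far Cry 5 figures cited above:

# Percentage improvement between two benchmark results:
# (new_fps / old_fps - 1) * 100
def pct_improvement(new_fps, old_fps):
    return (new_fps / old_fps - 1) * 100

# GeForce RTX 3080 vs. GeForce RTX 2080 Super, Far Cry 5 at 4K (figures from above)
print(round(pct_improvement(93, 61)))  # ~52, i.e. "just over 50%"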

Now, of course, you can always dial down the detail levels for a game to make it run acceptably at a higher-than-recommended resolution, or dial back the resolution itself.

But to an extent, that defeats the purpose of a graphics card purchase.

The highest-end cards are meant for 4K play or for playing at very high refresh rates at 1080p or 1440p; you don't have to spend $1,000 or even $500 to play more than acceptably at 1080p.

In short: Always buy the GPU that fits the monitor you either play on today or plan to own in the near future.

There are plenty of midrange GPUs that can power 1440p displays at their peak, and 4K is still a fringe display resolution for the most active PC gamers if the Steam Hardware Survey is any indication.

(It saw less than 3% of users playing at 4K at the end of 2020.)

High-Refresh Gaming: Why High-End GPUs Matter

Another thing to keep abreast of is a trend in gaming that's gained major momentum in recent years: high-refresh gaming monitors.

For ages, 60Hz (or 60 screen redraws a second) was the panel-refresh ceiling for most PC monitors, but that was before the genre of esports really hit its stride.

Panels focused on esports and high-refresh gaming may support up to 144Hz, 240Hz, or even 360Hz for smoother gameplay.

What this means: If you have a video card that can consistently push frames in a given game in excess of 60fps, on a high-refresh monitor you may be able to see those formerly "wasted" frames in the form of smoother game motion.
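One way to picture that: a panel can only show as many unique frames per second as its refresh rate, so any frames the GPU renders beyond that ceiling go unseen. This small sketch (a simplification that ignores variable-refresh tech and screen tearing) illustrates the relationship for the refresh rates mentioned above:

# A panel displays at most refresh_hz unique frames per second;
# frames rendered beyond that ceiling are effectively "wasted".
def displayed_fps(gpu_fps, refresh_hz):
    return min(gpu_fps, refresh_hz)

for hz in (60, 144, 240, 360):
    frame_time_ms = 1000 / hz
    print(f"{hz}Hz panel: one refresh every {frame_time_ms:.1f} ms; "
          f"a card pushing 200fps shows {displayed_fps(200, hz)}fps of it")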

Powered by esports success stories (like 16-year-old Fortnite prodigy Bugha turning into a multi-millionaire overnight), the demand has surged in recent years for high-refresh monitors that can keep esports hopefuls playing at their peak.

And while 1080p is still overwhelmingly the preferred resolution for competitive players across all game genres, many are following the trends that monitors are setting first.

The number of players moving up to the 1440p bracket of graphical resolutions (played in either 16:9 aspect ratio at 2,560 by 1,440 pixels, or in 21:9 at 3,440 by 1,440) is growing faster than ever, thanks in no small part to recent game-monitor entries like the ViewSonic Elite XG270QG, which marries the worlds of high-refresh and high-quality panels.

To an extent, the cards and the panels are playing a game of leapfrog themselves.

Gaming at a higher resolution does have its benefits for those who want to hit their opponents with pixel-perfect precision, but just as many esports hopefuls and currently salaried pros still swear by playing at resolutions as low as 720p in games like Counter-Strike: Global Offensive.

So all told, your mileage may vary, depending on the way you prefer to play.

Most casual gamers won't care about extreme refresh rates, but the difference is marked if you play fast-action titles, and competitive esports hounds find the fluidity of a high refresh rate a competitive advantage.

(See our picks for the best gaming monitors, including high-refresh models.) In short: Buying a powerful video card that pushes high frame rates can be a boon nowadays even for play at a "pedestrian" resolution like 1080p, if paired with a high-refresh monitor.

HDR Compatibility

Finally, keep HDR compatibility in mind.

More and...
