Updated on March 22, 2022
The GPU supply and cost situation is anything but “normal” at the moment, as you can see from the Amazon prices above. Editor’s note: If you’re in the market for a new card, consult this buying guide for tips on how to get a good deal. To get the most out of your current GPU, check out this tutorial on how to maximise its performance.
Also, keep in mind that we’ve listed the best Nvidia and AMD graphics cards for each target gameplay resolution, in descending order of resolution, unless one card is the unequivocal choice. These picks are only a small sample of the third-party cards on the market, so it’s reasonable to read our endorsement of a given GPU family (such as the GeForce GTX 1660 Ti or Radeon RX 6700 XT) as a recommendation for every card built on that reference design in its class.
When it comes to PC gaming or video creation, the performance of your graphics-accelerated software is everything—and your video card is the engine that drives it.
With this guide, we’ll help you find the finest video-card options for your desktop PC, as well as how to determine whether a certain card is worth your money. Keep an eye out for emerging trends, too, since they may influence your decision about which card to buy. Consumer video cards can cost anywhere from as little as $100 to as much as $1,499 (and that’s just the MSRP). It’s easy to overpay or underbuy, but we’ll help you avoid both.
Who’s Who in GPUs: AMD vs. Nvidia
To begin, what is the purpose of a graphics card? And do you really need one at all?
Unless it is specifically designed for gaming, any prebuilt desktop PC on the market will put less emphasis on the graphics card than on CPU, RAM, or storage options. Indeed, a low-cost PC may not have a graphics card at all, relying instead on the graphics-acceleration hardware built into its CPU (an “integrated graphics processor,” commonly called an “IGP”). There’s nothing wrong with an IGP; most business laptops and budget-minded desktops have one. But when it comes to gaming or content creation, the correct graphics card is essential.
Whether via an IGP or a discrete video card, your graphics solution displays 2D and 3D content, draws the desktop, and decodes and encodes video in programmes and games. All consumer-level discrete video cards are built around large graphics-processor chips made by one of two companies: AMD or Nvidia. These processors are known as “GPUs,” short for “graphics processing units,” and the term is often used loosely to refer to the cards themselves. It’s not easy to keep graphics cards…er, GPUs…straight!
For their video cards, both companies provide “reference designs”: a standard version of the card based on a given GPU. Nvidia (or, less often, AMD) will occasionally sell these reference-design graphics cards directly to consumers.
With the GeForce RTX 3000 series, Nvidia’s “Founders Edition” labelling came to signify more than just higher clock rates and solid build quality. Founders Edition cards are often the most visually uniform of the cards released during the lifespan of a specific GPU, but their designs tend to be more conservative than those of third-party options, which are more tolerant of extreme overclocking or modification.
Nvidia’s new Founders Edition cards defy conventional wisdom in radical fashion: their PCBs (printed circuit boards) are roughly 50% smaller than those of the previous-generation RTX 20 Series cards, paired with the company’s new “push-pull” cooling design in each corresponding model. Nvidia’s engineering prowess is on full display in these cards, and while AMD has put up a decent fight on performance, the RTX 30 Series Founders Edition cards stand alone.
For the first time, Nvidia’s Founders Edition cards have been designed to be more compact, lighter, and faster than their predecessors. Third-party card manufacturers such as Asus, EVGA, MSI, and Gigabyte, known as AMD or Nvidia “board partners,” may sell their own branded versions of the reference card, or manufacture their own unique cards with features like LED mood illumination and different cooling-fan designs. Depending on the GPU, some board partners offer both the reference design and their own, more radical take.
In terms of design, cards like MSI’s GeForce RTX 3080 Gaming X Trio 10G and AMD’s Radeon RX 6000 Series reference cards are fine examples of classic GPU design, but they lack the kind of breakthroughs in cooling and power efficiency that we’ve seen in the current-generation Founders Edition cards.
Who Needs a Discrete GPU?
We discussed IGPs (integrated graphics processors) earlier. IGPs are capable of meeting the needs of the vast majority of modern computer users, with the exception of three key groups:
Professional workstation users. Even people who don’t work with CAD or video- and photo-editing software may benefit from a discrete GPU. When converting video formats or performing other specialised tasks, some of their most essential applications can use GPU resources instead of (or in addition to) CPU resources. How fast that is depends on a number of factors, including the specific GPU and CPU you have.
Productivity-focused multi-display users. Those who use a large number of monitors may also find a discrete GPU handy. Desktop operating systems can drive displays from the IGP and a discrete GPU at the same time, so five or six monitors can be connected to a single computer using the two in combination.
However, a top-of-the-line graphics card is not necessary. In order to run multiple monitors, you need a graphics card that can handle the display requirements of your business software, several browser windows, or even a huge number of static windows on many monitors at the same time. If you’re presenting three web browsers across three display panels, a GeForce RTX 3080 card has no advantage over a GeForce GTX 1660 card with the same supported outputs.
Gamers. The graphics processing unit (GPU) is the most critical component in a gaming system. A top-of-the-line system from 2018 fitted with a 2022 GPU will generally outperform the same system running the highest-end GPU available in 2018.
Dedicated workstation graphics cards are designed for scientific computing, mathematics, and artificial intelligence (AI) tasks, while consumer graphics cards are suited to gaming and light content creation. The emphasis of this guide and our reviews is on the latter, with a brief mention of workstation cards at the end. Nvidia’s Titan and Quadro (now RTX A-Series) lines and AMD’s Radeon Pro and Radeon Instinct lines are the important sub-brands to know in the pro workstation field. Nvidia continues to dominate that market at the high end.
For the time being, we’ll be focusing on consumer cards. As of early 2022, Nvidia’s consumer card range is divided into two distinct classes: GeForce GTX and GeForce RTX. Consumer cards from AMD include the Radeon RX and (now largely extinct) Radeon RX Vega lines, as well as the end-of-life Radeon VII. If you’re planning on purchasing a video card in the near future, there are several things you should keep in mind.
Target Resolution and Monitor Tech: Your First Considerations
Your video card’s output resolution is the number of pixels it drives horizontally and vertically (H by V) on your monitor. When considering a video card for gaming, this has a significant impact on which card to purchase and how much money you need to set aside.
For PC gamers, the resolutions at which a certain video card is best suited for gaming are an important consideration. Even low-end graphics cards can now display everyday programmes at high resolutions like 3,840 by 2,160 pixels (a.k.a. 4K). In demanding PC games, however, those cards will be unable to maintain smooth frame rates at such high resolutions. GPUs calculate and render on-screen images in real time, and the more in-game detail and the higher the resolution, the more graphics-card power is needed.
Choosing a resolution is a critical step in the decision-making process.
1080p, 1440p, and 2160p or 4K (3,840 by 2,160 pixels) are the three most popular screen resolutions among today’s PC gamers. In general, you’ll want a video card that can handle the native resolution of your monitor; for the best picture, run your display at that native resolution.
If you look around, you’ll also find ultra-wide screens in a range of resolutions (3,440 by 1,440 pixels is a frequent one); you can compare these to the more common 1080p, 1440p, and 2160p by tallying the raw number of pixels in each. If you plan to play games at 1080p or 4K, check out our lists of the best graphics cards for each resolution.
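Comparing raw pixel counts is simple multiplication, and it shows concretely why a card that cruises at 1080p can struggle at 4K: each frame at 4K contains exactly four times the pixels. A quick back-of-the-envelope sketch (illustrative only, not tied to any benchmark):

```python
# Raw pixel counts for common gaming resolutions.
# More pixels per frame means more rendering work for the GPU per refresh.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "Ultrawide (3440x1440)": (3440, 1440),
    "4K (3840x2160)": (3840, 2160),
}

base = 1920 * 1080  # use 1080p as the baseline workload
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1080p workload)")
```

Running this shows 1440p carries about 1.78x the 1080p pixel load, the 3,440-by-1,440 ultrawide about 2.39x, and 4K exactly 4x, which is why the ultrawide sits between 1440p and 4K in GPU demands.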
Why does this matter? Whether purchasing, building, or upgrading a PC, gamers should consider how the rendering workload will be split between the components inside it in order to get the most out of their gaming experience.
Here’s how it works, without getting too bogged down in the weeds: at 1080p, especially at maximum detail settings, it’s rarely the CPU or the GPU alone that determines the frame rates you’ll see; the two share the load.
Next up is the 1440p resolution, which begins to distribute the burden when you’re playing at a high level of detail. While some games begin to place greater demands on the GPU, others can continue to do the heavy math on the CPU. (This is dependent on the developer’s optimization techniques.) Then there’s 4K resolution, where the GPU is doing nearly all the heavy lifting.
You can, of course, lower the degree of detail in a game to make it run more smoothly at a higher resolution, or lower the resolution itself. However, this defeats the whole purpose of buying a graphics card. You don’t need to spend $1,000 or even half that much to play well at 1080p; the highest-end cards are designed for 4K play or for playing at extremely high refresh rates at 1080p or 1440p.
In a nutshell, always buy a graphics card suited to the display you anticipate using in the near future. According to the Steam Hardware Survey, the most popular display resolution among PC gamers remains 1080p, and 4K is still a distant dream for the majority: in early 2022, fewer than 3% were playing at 4K.
High-Refresh Gaming: Why High-End GPUs Matter
High-refresh gaming monitors, a recent trend in gaming, are another thing to keep an eye on. 60Hz (60 screen refreshes per second) was the standard for most PC monitors for a long time, but that was before esports really took off.
With high-refresh gaming and esports in mind, panels are now capable of refresh rates up to 360Hz. If your video card can consistently push frames above 60fps in a game, a high-refresh monitor lets you see the previously “wasted” frames as smoother in-game motion.
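The benefit of a high refresh rate is easiest to see in frame times: each step up in refresh rate shrinks the time the display holds a single frame. A small arithmetic sketch (not a benchmark) makes the scale of the difference concrete:

```python
# Time each frame stays on screen at common monitor refresh rates.
# A higher refresh rate means the display can show new frames sooner,
# which is what produces the smoother motion high-refresh panels are sold on.
for hz in (60, 144, 240, 360):
    frame_time_ms = 1000 / hz  # milliseconds per refresh interval
    print(f"{hz}Hz: {frame_time_ms:.2f} ms per frame")
```

At 60Hz each frame lingers for about 16.67ms, while a 360Hz panel can swap frames every 2.78ms, provided the GPU can actually deliver frames that fast.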
Recent years have seen a rise in demand for high-refresh monitors, spurred on by stories of esports successes (like the instant multi-millionaire status of Fortnite prodigy Bugha, then age 16). While 1080p is still the most popular resolution for competitive gamers across genres, it’s the displays that are setting the trend.
It’s no secret that more and more people are embracing 1440p resolutions, which can be played in either 16:9 or 21:9 aspect ratios at 2,560 by 1,440 pixels or 3,440 by 1,440 pixels. The cards and panels are, in a sense, playing a game of leapfrog with one another.
In games like Counter-Strike: Global Offensive, esports hopefuls and professional players alike swear by playing at resolutions as low as 720p, despite the benefits higher resolutions offer those who want to pick out opponents pixel-perfectly. Your mileage may vary depending on how you play and which games you choose.
The fluidity of a high refresh rate is a competitive advantage for esports players, but most casual gamers won’t notice the difference unless they play fast-action games. (For the greatest gaming displays, including high-refresh monitors, check out our top recommendations.) Even at a “pedestrian” resolution like 1080p, a strong video card that can sustain a high frame rate is a boon when paired with a high-refresh display.
Finally, don’t forget HDR compatibility. Nearly all of our Editors’ Choice picks among gaming monitors in recent years offer HDR to some degree. In our testing, displays at the HDR 10 and DisplayHDR 400 level made little difference to HDR image quality, but a monitor at the DisplayHDR 600 spec or beyond is worth factoring into your GPU decision, both as a gaming display and as one for HDR-enhanced content.
Make sure the HDR monitor you purchase can be driven at its full refresh rate and bit depth by the graphics card you pair with it. A dance, indeed, but one that can pay off handsomely for both content creation and gaming.
FreeSync vs. G-Sync: Jets! Sharks! Maria?
Buying a card depending on whether it supports one of these two venerable standards for easing gameplay? It all depends on your monitor.
AMD’s FreeSync and Nvidia’s G-Sync are two implementations of the same idea: adaptive sync. With adaptive sync, the monitor’s refresh rate fluctuates to match the video card’s frame output at any given moment, allowing for smooth gameplay. Without it, frame-rate wobbles can cause artefacts, visual stuttering, or screen tearing, in which mismatched halves of different frames appear on screen for a brief moment. With adaptive sync, the display draws a full frame only when the video card can deliver one.
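The mechanics can be sketched in a few lines: on a fixed-refresh display, a frame that isn’t ready at the scanout deadline is either repeated (stutter) or swapped mid-draw (tearing), while an adaptive-sync display simply starts its refresh when the frame arrives. The toy model below (an illustration with made-up frame times, not real driver behaviour) counts missed refreshes on a fixed 60Hz panel:

```python
# Toy model: GPU frame times (ms) fluctuating around the 60fps mark.
frame_times_ms = [14, 18, 16, 21, 15, 17, 19, 16]

# Fixed 60Hz display: scanout happens every ~16.67ms regardless of the GPU.
# A frame that takes longer than one refresh interval either repeats the
# previous frame (stutter) or tears if swapped mid-scanout.
refresh_ms = 1000 / 60
late_frames = sum(1 for t in frame_times_ms if t > refresh_ms)

# Adaptive sync: the display refreshes when each frame is ready, so within
# the panel's supported range no frame is repeated or torn.
print(f"Fixed 60Hz: {late_frames} of {len(frame_times_ms)} frames miss a refresh")
print("Adaptive sync: each frame is shown exactly once, as soon as it is ready")
```

In this contrived run, half the frames miss the fixed 60Hz deadline, each one a potential stutter or tear; adaptive sync absorbs the same variation invisibly.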
FreeSync or G-Sync may or may not be supported by your monitor. Because FreeSync is less expensive to implement, it is more widely used than G-Sync. You may decide to go with one or the other on that basis, but keep in mind that the tides are shifting: at CES 2019, Nvidia announced G-Sync compatibility for FreeSync monitors, and a fast-growing subset of FreeSync monitors has since been certified by Nvidia as such. This means you might not have to make a stark choice between black and white (or red and green).
Meet the Radeon and GeForce Families
To wrap up the background, let’s look at the competing card families the two camps have assembled in recent years. There is a wide range of GPUs available, from low-end models for low-resolution gaming all the way up to high-end models for 4K and high-refresh play. Let’s start with Nvidia. (Again, keep in mind that MSRPs are only “suggested” prices these days, and that today’s street prices are volatile compared with the list prices we mention!)
A Look at Nvidia’s Lineup
For the time being, the majority of Nvidia’s lineup consists of cards based on the last-generation “Turing” GPUs: the GeForce RTX 20-series and the GTX 1600 series. The most recent additions, the GeForce RTX 30 Series cards, are high-end cards based on GPUs using the “Ampere” architecture.
An overview of the current relevant card classes in the Turing, Ampere, and Pascal families, their rough cost, and their use cases appears below.
The Nvidia GeForce Card Lineup, Early 2022
This list does not include several of the older GTX Pascal cards like the GTX 1070 and GTX 1080, which have been around for a while. In 2022, their GeForce RTX counterparts have mainly replaced them on the second-hand market after being sold out. After the debut of the GeForce GTX 1660 and GTX 1660 Ti, the GeForce GTX 1060 also met its demise, while the GTX 1050 has been rendered obsolete by the GTX 1650 and the GTX 1650 Super.
Let’s begin with Turing. Nvidia’s 20-series GPU debut in September 2018 met with a mixed response. On one hand, the company was offering some of the most powerful GPUs ever seen, along with new and exciting technologies like ray tracing and DLSS. On the other hand, no games supported ray tracing or DLSS at the time of the Turing launch, and even now only a modest number of games support DLSS 2.0, on its own or in conjunction with ray tracing.
Nvidia also shifted the goalposts for high-end GPU pricing relative to previous generations. The company’s new flagship, the GeForce RTX 2080 Ti, pushed flagship pricing well past what came before, and the $699 GeForce RTX 2080 wasn’t all that much faster than the GTX 1080 Ti it replaced.
As a result, in 2019 Nvidia released the GeForce RTX 2060 Super, RTX 2070 Super, and RTX 2080 Super (up-ticked versions of the existing cards) at the same time AMD launched its midrange Radeon RX 5700 and RX 5700 XT graphics cards. Nvidia’s Super cards, available in both RTX and GTX flavours, improve on the specifications of the cards they replace (some more effectively than others).
The GeForce RTX 30 Series arrived in September 2020, when Nvidia introduced the new GeForce RTX 3070, 3080, and 3090. The RTX 3060 Ti, RTX 3070 Ti, and RTX 3080 Ti followed one after another, and together the 30 Series cards have earned their own section in the spec rundown. (There is no Founders Edition of the RTX 3060 or 3050, but there are third-party models.)
Nvidia GeForce RTX 30 Series: Selected Cards Compared
Built on Samsung’s 8nm process and using second-generation RT cores and third-generation Tensor cores, the cards are an evolutionary leap. The reworked PCBs forced a rethink of everything from the placement of various modules and chips to the inner workings of the new heatsink.
The GeForce GT 1030 and GTX 1050 remain the very low end, costing under $100 or a little more, regardless of the impact of Nvidia’s 30 Series on card prices up and down the Nvidia stack. Above them, Nvidia’s current low-to-midrange GPUs run from the GTX 1650/Super through the GTX 1660 Ti to the RTX 2060 Super and RTX 3060, with the last two costing a little more than $300.
With the advent of the GeForce RTX 30 Series, the middle and high end got a lot more crowded. If you’re looking for an “elite” high-end GPU, the GeForce RTX 3080 and RTX 3090 are the ones to go for, starting at $699 (MSRP!) and rising from there. This separates them from the more moderate (but still powerful) options like the GeForce RTX 2060, RTX 2070, RTX 2080, and RTX 3060, which typically start at $350 MSRP and can go as high as $650 MSRP depending on the model.
A Look at AMD’s Lineup
Against Nvidia’s low-end, mainstream, and high-end cards, AMD’s lineup is stronger in 2022 than it has been in a long time.
The AMD Radeon Card Lineup, Early 2022
For 1080p gaming, the Radeon RX 550 and RX 560 are the low end, while the old Radeon RX 570 through RX 590 are the former midrange; their time is limited, though, given the company’s latest additions to its 1080p arsenal, including the Radeon RX 5500 XT and RX 5600 XT as well as the even newer Radeon RX 6600 and RX 6500 XT. Cards like the Radeon RX 580, RX Vega 56, and RX Vega 64, while still capable at 1080p, are clearly from a different time.
There has been a significant shift in AMD’s 1080p and 1440p graphics cards. July 2019 brought the first of the company’s long-awaited line of 7nm “Navi” midrange graphics cards, based on a new architecture called Radeon DNA (RDNA). First came the Radeon RX 5700 and RX 5700 XT, followed by the limited-edition Radeon RX 5700 XT Anniversary Edition. All of these cards target the 1440p gaming market, and each can run demanding AAA titles at 60 frames per second at that resolution.
Late 2020 and the first half of 2021 saw AMD’s Radeon RX 6700 XT, RX 6800 XT, and RX 6900 XT take on Nvidia’s top end. These RDNA 2 cards are more cost-effective than the company’s previous-generation RDNA 1 cards, although they still trail Nvidia’s 30 Series Founders Edition cards in design, cooling, and driver stability.
AMD unveiled RDNA 2 in 2020. In discrete desktop graphics cards, RDNA 2 improves on what RDNA introduced while adding capabilities to compete with Nvidia, including support for Microsoft’s DX12 Ultimate API and dedicated ray-tracing compute cores; the same architecture also powers Microsoft’s Xbox Series X and Sony’s PlayStation 5. Some of the first RDNA 2 cards are compared below.
AMD Radeon RX 6000 Series: Selected Cards Compared