Updated on August 14, 2022
As you can see from the Amazon prices above, the current state of the GPU market is anything but “normal.” This buying-strategies guide will help you locate a good deal on a card if you’re thinking about making a purchase soon. To get the most out of your current GPU, check out this tutorial on how to maximise its performance.
Also keep in mind that our recommendations above are ordered by ideal gameplay resolution, in ascending order, with Nvidia and AMD GPUs selected for each use case (unless one is the unequivocal choice). Many more third-party cards are available than the ones we’ve included here; a single reference card (such as the GeForce GTX 1660 Ti or the Radeon RX 6700 XT) can be read as an endorsement of the entire GPU family it represents.
Whether you’re a PC gamer or a multimedia maker, your video card determines how much you can accomplish (and how lustily you can brag) with your graphics-accelerated software.
In this guide, we’ll show you how to find the best video card for your desktop PC, what you need to know about upgrading an existing system, and how to tell whether a particular card is a good buy. Also keep an eye out for emerging trends, since they may influence which card you choose. Consumer video cards range from as little as $100 to as much as $1,499 (and that’s just the MSRP). We won’t let you overpay or underbuy.
Who’s Who in GPUs: AMD vs. Nvidia
To begin, what is the purpose of a graphics card? And do you really need one of these?
As a general rule, PC manufacturers don’t put much emphasis on graphics cards in their prebuilt desktop PCs unless the machines are specifically designed for gaming. A low-cost PC may skip a discrete graphics card entirely to save money, utilising instead the graphics acceleration built into the CPU (an “integrated graphics processor,” commonly called an “IGP”). There is nothing intrinsically wrong with using an IGP (most business laptops, affordable consumer laptops, and budget-minded desktops have them), but the right graphics card is essential for gamers and creators.
Current graphics solutions handle displaying 2D and 3D content, drawing the desktop, and decoding and encoding video in programmes and games. Every consumer discrete video card on the market today is powered by a big graphics processor chip from AMD or Nvidia. These processors are known as “graphics processing units,” or GPUs, a term that is also applied, confusingly, to the graphics cards themselves.
For their video cards, the two companies create “reference designs,” a standardised version of the card built around a specific GPU. Nvidia sometimes sells these reference-design cards directly, and, less often, so does AMD.
To distinguish Nvidia-branded cards, look for the “Founders Edition” label. Up until the debut of Nvidia’s most recent GeForce RTX 3000 series, this label meant little more than slightly faster clock speeds and a more durable design. Founders Edition cards are often the most visually uniform of any cards released during the lifespan of a specific GPU. However, their designs tend to be more conservative than those of third-party cards, which are more tolerant of extreme overclocking or modification.
Nvidia’s newest Founders Edition cards defy conventional wisdom in the most radical way possible. Each RTX 30 Series Founders Edition model has a PCB (printed circuit board, or “guts”) half the size of the previous generation’s and is equipped with Nvidia’s unique “push-pull” cooling design. Nvidia has put its engineering prowess on full display in these cards, and while AMD has put up a decent fight on performance, the RTX 30 Series Founders Edition cards stand alone.
Thanks to this treatment, Nvidia’s Founders Edition cards are smaller, lighter, and faster than ever before. Sometimes third-party card producers (known as AMD or Nvidia “board partners”) make cards that are nearly identical to the official reference cards. Board partners may sell their own self-branded versions of the reference card (adhering to the design and specifications set by AMD or Nvidia), or they may fashion their own custom products, with different cooling-fan designs, slight factory overclocking, or features such as LED mood illumination. Some board partners sell both the reference design and their own, more radical takes on a given GPU.
Third-party cards like the MSI GeForce RTX 3080 Gaming X Trio 10G, as well as AMD’s Radeon RX 6000 Series reference cards, may look great, but they lack the kind of cooling and power-efficiency improvements found in the current generation’s Founders Edition cards.
Who Needs a Discrete GPU?
Earlier, we spoke of IGPs (integrated graphics processors). There are three major exceptions to the general rule that IGPs can suit the needs of today’s mainstream users…
Users of high-end workstations. Workers in CAD or in video and photo processing will find a discrete GPU an invaluable tool. Some of their most important applications can use the GPU’s resources instead of (or in addition to) the CPU’s to accelerate specific functions. How much faster that is depends on a variety of factors, including the precise GPU and CPU you have.
Productivity-focused users with many monitors. A discrete GPU can also help those who need to drive a large number of displays. In desktop operating systems, the IGP and a discrete GPU can drive displays simultaneously, so you can combine the two if you wish to connect five or six screens to the same computer.
A high-end graphics card is not required for this, however. For these non-demanding uses, all you need is a graphics card that supports the display specs, resolutions, monitor interfaces, and number of panels needed for your business apps, browser windows, or other static windows. There is no advantage to using a GeForce RTX 3080 over a GeForce GTX 1660 with the same supported outputs if you’re just running three web browsers on three displays.
Gamers. Finally, we have gamers, who place a high value on the graphics processing unit. If you had to choose between a high-end system from 2018 fitted with a 2022 GPU and a brand-new system fitted with the best GPU you could get in 2018, the former is the one to pick.
Consumer-grade graphics cards serve gaming and light content creation; professional workstations call for specialised cards designed for scientific computation and artificial intelligence. In this guide, as in our reviews, we’ll focus on the former, though we’ll touch on workstation cards later. The main sub-brands you need to know are Nvidia’s GeForce and AMD’s Radeon RX on the consumer side, and Nvidia’s Titan and Quadro (now RTX A-Series), along with AMD’s Radeon Pro and Radeon Instinct, in the pro workstation field. In both markets, Nvidia is the undisputed leader.
For the time being, we’ll be focusing on consumer cards. In early 2022, Nvidia’s consumer lineup is split into two distinct classes, the GeForce GTX and GeForce RTX cards, both under the long-running GeForce brand. AMD’s consumer cards, on the other hand, include the Radeon RX and RX Vega families, as well as the end-of-life Radeon VII. Before delving into each line, let’s take a look at some of the most critical factors to keep in mind when shopping for a video card.
Target Resolution and Monitor Tech: Your First Considerations
Resolution is the number of pixels, horizontal by vertical, that your video card renders and outputs to your monitor. How well a card handles a given gaming resolution is directly related to its price and performance.
One of the most important things to think about when purchasing a video card for a personal computer is what resolution it works best at. Even low-end graphics cards can display ordinary programmes at 3,840 by 2,160 pixels (a.k.a. 4K). However, these GPUs lack the capacity to maintain smooth frame rates at such high resolutions in demanding PC games. The video card renders the on-screen image in real time, calculating on-screen coordinates, geometry, and lighting, and you need additional graphics-card power to play at high in-game detail levels and monitor resolutions.
Resolution Is a Key Decision Point
1080p, 1440p, and 2160p or 4K (3,840 by 2,160 pixels) are the three most popular screen resolutions used by today’s video gamers. Generally speaking, you’ll want to get a graphics card that matches the native resolution of your monitor. If you want the best picture, you’ll want to run your display at that native resolution.
Ultra-wide-screen monitors offer in-between resolutions (3,440 by 1,440 pixels is a common one); you can compare them with the more common 1080p, 1440p, and 2160p by calculating raw pixel counts (multiply the horizontal number by the vertical number). If you’re looking for the finest graphics cards for 1080p or 4K gaming, check out our specific reviews.
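The raw-pixel-count arithmetic described above takes only a few lines to sanity-check (the resolutions shown are the common ones discussed in this guide):

```python
# Raw pixel counts for common gaming resolutions:
# multiply horizontal pixels by vertical pixels.
resolutions = {
    "1080p (16:9)": (1920, 1080),
    "1440p (16:9)": (2560, 1440),
    "ultrawide 1440p (21:9)": (3440, 1440),
    "2160p / 4K (16:9)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# Ultrawide 1440p pushes about 34% more pixels than 16:9 1440p,
# but still far fewer than 4K.
```

This makes clear why an ultrawide 1440p panel sits between standard 1440p and 4K in GPU demand: roughly 4.95 million pixels per frame versus 3.69 million and 8.29 million.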
What’s the big deal? When it comes to PC gaming, the power of the components within your next PC should be distributed in a way that best suits the way you want to play, whether you purchase one, construct one, or upgrade.
Here’s how it works, without getting too bogged down in the weeds: at 1080p, even at maximum detail settings, the high frame rates you’ll see are rarely the responsibility of the GPU alone; the CPU shares much of the load.
The 1440p resolution is next, which begins to divide the workload when playing at higher fidelity settings. While some games begin to place greater demands on the GPU, others can continue to do the heavy math on the CPU. (It all depends on the developer’s optimization of the game.) Then there’s 4K resolution, where the GPU is doing nearly all the heavy lifting.
If games run poorly at your monitor’s native resolution, you can always lower the detail settings or drop the resolution altogether to improve performance. However, that negates the purpose of purchasing a powerful graphics card to some extent. You don’t need to spend $1,000, or even half that, to play well at 1080p; the highest-end cards are designed for 4K play or for very high refresh rates at 1080p or 1440p.
In a nutshell, you should always acquire a graphics card that suits the display you anticipate using in the near future. According to the Steam Hardware Survey, most PC gamers still play at 1080p, with 1440p a distant second; 4K remains a distant dream for the majority. (At the beginning of 2022, the survey showed less than 3% of users playing in 4K.)
High-Refresh Gaming: Why High-End GPUs Matter
Another gaming trend to keep an eye on is high-refresh gaming monitors, which have gained significant traction in recent years. 60Hz (60 screen refreshes per second) was the standard for most PC monitors for a long time, but that was before esports took off.
If you’re into esports and high-refresh gaming, you can find panels that run at up to 360Hz. On a standard 60Hz display, any frames a high-end video card renders beyond 60 per second are effectively “wasted”; a high-refresh display lets you see those extra frames as smoother in-game motion.
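The refresh-rate math behind this is simple; the sketch below is just an illustration of the arithmetic, not a model of any particular monitor:

```python
# A monitor's refresh interval is 1000 ms divided by its refresh rate:
# the panel can show at most one new frame per interval.
for hz in (60, 144, 240, 360):
    print(f"{hz}Hz panel: new frame every {1000 / hz:.2f} ms")

# A card rendering 144 fps on a 60Hz panel effectively wastes
# 84 of those frames every second; the same card on a 144Hz panel
# can display every frame it draws.
fps, panel_hz = 144, 60
displayed = min(fps, panel_hz)
print(f"{fps} fps on a {panel_hz}Hz panel: about {displayed} frames shown per second")
```

In other words, a 360Hz panel refreshes roughly every 2.78ms, versus 16.67ms at 60Hz, which is why only a card pushing very high frame rates can exploit it.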
As a result of esports success stories like Bugha becoming a multi-millionaire overnight, demand for high-refresh monitors has increased significantly in recent years. While 1080p is still the most popular choice for competitive gamers across all genres, many are now following the lead of monitor manufacturers and upgrading to higher resolutions.
Many gamers are now opting for higher resolutions like 1440p (in either 16:9 or 21:9 aspect ratios), thanks to monitors like the ViewSonic Elite XG270QG that combine high refresh rates with high-quality panels. This is helping to fuel the growth of the 1440p gaming market. The cards and the panels are playing a game of leapfrog with each other.
If you want to hit your opponents with pinpoint accuracy, a higher resolution can help. However, many aspiring esports players and working professionals still swear by playing Counter-Strike: Global Offensive at resolutions as low as 720p. Your mileage may vary depending on how you play and which games you play.
Extreme refresh rates aren’t a big deal for casual gamers, but if you’re playing a fast-paced game or competing in esports, they can make a big difference. For the greatest gaming displays, including high-refresh monitors, check out our top recommendations. Even at a “pedestrian” resolution like 1080p, a fast video card that can push high frame rates to a high-refresh panel can be an advantage.
In the end, don’t forget HDR compatibility. Even our Editors’ Choice picks for best gaming displays of the last several months all support HDR to some extent. In our tests, HDR10 and DisplayHDR 400 monitors show little real improvement in HDR image quality, while monitors rated DisplayHDR 600 and above can double as both gaming displays and HDR-enhanced content displays.
For HDR output, make sure the monitor you select supports a refresh rate and bit depth that your new card can drive. It’s a tricky dance, but it can pay off handsomely for both content creation and gaming.
FreeSync vs. G-Sync: Jets! Sharks! Maria?
Which of these two venerable specifications should you look for in a graphics card if you want a smoother gaming experience? The answer may depend on your monitor.
Both FreeSync (from AMD) and G-Sync (from Nvidia) are variations on adaptive sync technology. With adaptive sync, the monitor’s refresh rate fluctuates to match the video card’s output at any given moment, allowing for smooth gameplay. Without it, frame-rate wobbles can lead to artefacts, visual stuttering, or screen tearing, in which mismatched screen halves appear for a brief moment. With adaptive sync, the monitor draws a full frame only when the video card can deliver one.
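A rough sketch of the idea, using a hypothetical 48–144Hz adaptive-sync window (real ranges vary by monitor): the panel can only refresh in lockstep with the card while the frame rate stays inside that window.

```python
def in_adaptive_sync_range(fps: float, low_hz: float = 48, high_hz: float = 144) -> bool:
    """Return True if a monitor with the given adaptive-sync range
    can refresh in lockstep with this frame rate.

    Below the range, monitors typically fall back to techniques such as
    low-framerate compensation or plain vsync; above it, frames are
    capped or may tear.
    """
    return low_hz <= fps <= high_hz

for fps in (30, 60, 120, 200):
    state = "synced" if in_adaptive_sync_range(fps) else "outside range"
    print(f"{fps} fps -> {state}")
```

The 48Hz floor and 144Hz ceiling here are illustrative assumptions; check a specific monitor’s published adaptive-sync range before buying.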
Your display may support FreeSync or G-Sync, or neither. FreeSync is far more prevalent than G-Sync because it doesn’t increase a monitor’s manufacturing costs. You might prefer one GPU maker’s products over the other’s on this basis, but keep in mind that the landscape is shifting: a driver update announced by Nvidia at CES 2019 lets FreeSync-compatible monitors use adaptive sync with late-model GeForce cards, and Nvidia has certified a growing number of FreeSync monitors as “G-Sync Compatible.” As a result, the decision may not be as cut-and-dried as it once was.
Even if you’re a professional CS:GO or Overwatch player, you may not notice any difference between the two on the latest monitors. For many years, screen tearing was a difficult problem to overcome, but nowadays only an expert eye can detect the difference between G-Sync and FreeSync monitors.
Meet the Radeon and GeForce Families
Now that we’ve spoken about how these two warring gangs have come together in recent years, let’s talk about the differences between them. GPU lines from the two major graphics-chip vendors are constantly changing, ranging from low-end models for low-resolution gaming up to high-end models for 4K gaming and/or very high refresh rates. First, let’s take a look at what Nvidia has to offer. (Again, keep in mind that MSRPs are only “recommended” prices these days; note the volatility of today’s street prices versus the list prices we mention!)
A Look at Nvidia’s Lineup
As of late 2020, the company’s main card stack was divided between GeForce RTX 20-series cards and the newer GTX 1600-series cards, both based on the “Turing” architecture (the GTX cards lack Turing’s dedicated ray-tracing hardware). The GeForce RTX 30-Series cards, the most recent additions, are high-end cards based on GPUs using the “Ampere” architecture.
An overview of the Turing, Ampere, and Pascal families’ current relevant card classes, their rough cost, and their use cases may be seen below…
The Nvidia GeForce Card Lineup, Early 2022
If you’ve been following the market for a while, you’ll notice that many older GeForce GTX Pascal cards, such as the GTX 1070 and GTX 1080, aren’t included here. Thanks to the success of their GeForce RTX 20-series successors, these cards are no longer in production and can be found only on the secondary market. With the debut of the GeForce GTX 1660 and GTX 1660 Ti, the GeForce GTX 1060 suffered a similar demise, and the GTX 1050 has since been supplanted by the GTX 1650 and GTX 1650 Super.
Here is a brief introduction to Turing. Nvidia’s 20-series GPU debut in September 2018 met with a mixed response. The company was touting some of the most powerful GPUs available, as well as cutting-edge features like real-time ray tracing and Deep Learning Super Sampling (DLSS). To be fair, no games supported ray tracing or DLSS when the Turing chips launched, and even now, games that support DLSS 2.0 on its own or combined with ray tracing remain a small percentage of the market.
In comparison with previous generations, however, Nvidia shifted the goalposts for high-end GPU pricing. Its new flagship graphics card, the GeForce RTX 2080 Ti, listed for almost $1,000, and the GeForce RTX 2080 was not much cheaper at $699.
In mid-2019, when AMD launched its midrange Radeon RX 5700 and Radeon RX 5700 XT GPUs, Nvidia responded by releasing the GeForce RTX 2060 Super, RTX 2070 Super, and RTX 2080 Super. Nvidia’s Super cards, available for both RTX and GTX models, improve on the specifications of the cards they replace in the lineup (some more effectively than others).
All of this brings us to the GeForce RTX 30 Series, which debuted in September 2020 with the GeForce RTX 3070, GeForce RTX 3080, and GeForce RTX 3090. The RTX 3080 Ti, RTX 3070 Ti, and RTX 3060 Ti followed one after another, and they deserve their own breakout. (Note: The RTX 3060 and RTX 3050 are available only in third-party variants, with no Founders Editions.)
Nvidia GeForce RTX 30 Series: Selected Cards Compared