Updated on August 14, 2022
Best Graphics Card for Budget 1080p Gaming (AMD or Nvidia)

Nvidia GeForce GTX 1650 Super
Best Graphics Card for High-Refresh 1080p Gaming (AMD or Nvidia)

Nvidia GeForce RTX 3050
As you can see from the Amazon prices above, the current state of the GPU market is anything but “normal.” This buying-strategies guide will help you locate a good deal on a card if you’re thinking about making a purchase soon. To get the most out of your current GPU, check out this tutorial on how to maximise its performance.
Also keep in mind that our recommendations above are ordered by your ideal gameplay resolution, with an Nvidia and an AMD GPU selected for each use case (unless one is the unequivocal choice). There are many more third-party cards available than we’ve included here; where we name a single reference card (such as the GeForce GTX 1660 Ti or the Radeon RX 6700 XT), take it as an endorsement of that GPU family as a whole.
Whether you’re a PC gamer or a multimedia creator, your video card determines how much you can accomplish (and how impressively you can brag) with your graphics-accelerated software.
In this guide, we’ll show you how to pick the best video card for your desktop PC, what you need to know about upgrading a system, and how to judge whether a particular card is a decent purchase. Also keep an eye out for emerging trends, since they may influence which card you buy. Consumer video cards can list anywhere from $100 to $1,499 (and that’s just the MSRP). We won’t let you overpay or underbuy.
Who’s Who in GPUs: AMD vs. Nvidia
To begin, what is the purpose of a graphics card? And do you really need one of these?
As a general rule, PC manufacturers don’t put much emphasis on graphics cards in their prebuilt desktop PCs unless those machines are specifically designed for gaming. A low-cost PC may omit a discrete graphics card entirely to save money, relying instead on the graphics acceleration built into the CPU (an “integrated graphics processor,” commonly called an “IGP”). There is nothing intrinsically wrong with using an IGP—most business laptops, affordable consumer laptops, and budget-minded desktops have them—but the correct graphics card is essential for gamers and creators.
Current graphics solutions handle displaying 2D and 3D content, drawing the desktop, and decoding and encoding video in programmes and games. Every consumer discrete video card on the market today is powered by a large graphics processor chip from AMD or Nvidia. These processors are known as “GPUs” (“graphics processing units”), a term that is also used, loosely, for the cards themselves.
For each GPU, the two companies create a “reference design,” a standardised version of the card. Nvidia sometimes sells these reference-design cards directly; AMD does so less often.
To distinguish Nvidia-branded cards, look for the “Founders Edition” label. Until the debut of Nvidia’s most recent GeForce RTX 30 Series, this label meant little more than slightly faster clock speeds and a more durable design. Founders Edition cards are often the most visually uniform of the cards released during the lifespan of a given GPU. However, their designs tend to be more conservative than those of third-party options, which leave more room for extreme overclocking or modification.
Nvidia’s new Founders Edition cards defy conventional thinking in the most radical way possible. Each RTX 30 Series Founders Edition has a PCB (printed circuit board, or “guts”) half the size of the previous generation’s and is equipped with Nvidia’s unique “push-pull” cooling design. In these cards, Nvidia has put its engineering prowess on full display, and while AMD has put up a decent fight on performance, the RTX 30 Series Founders Edition cards stand alone.
The result is Founders Edition cards that are smaller, lighter, and faster than ever before. Meanwhile, third-party card producers (known as AMD or Nvidia “board partners”) sometimes produce cards that are nearly identical to the official reference cards. Board partners may sell self-branded versions of the reference card (adhering to the design and specifications set by AMD or Nvidia), or they may fashion custom products with different cooling-fan designs, slight factory overclocks, or features such as LED mood lighting. Some board partners sell both the reference design and their own, more radical takes on a given GPU.
Third-party cards like the MSI GeForce RTX 3080 Gaming X Trio 10G, and AMD’s Radeon RX 6000 Series reference cards, may look great, but they lack the cooling and power-efficiency improvements found in the current generation’s Founders Edition cards.
Who Needs a Discrete GPU?
Earlier, we spoke of IGPs (integrated graphics processors). There are three major exceptions to the general rule that IGPs can suit the needs of today’s mainstream users…
Users of high-end workstations. Workers in CAD or in video and photo processing will find a discrete GPU an invaluable tool. In some of their most important applications, they can use the GPU’s resources instead of (or in addition to) the CPU’s to accomplish specific functions. How much of a speedup that delivers depends on a variety of factors, such as the precise GPU and CPU you have.
Users focused on multi-monitor productivity. A discrete GPU can also help those who need to drive a large number of displays. In desktop operating systems, the IGP and a discrete GPU can drive displays simultaneously, so you can combine the two if you want to connect five or six screens to the same computer.
A high-end graphics card is not required for this, however. If you’re displaying business apps, browser windows, or other static windows rather than demanding PC games, all you need is a card that supports the resolutions, monitor interfaces, and number of panels you require. There is no advantage to using a GeForce RTX 3080 over a GeForce GTX 1660 with the same supported outputs if you’re just running three web browsers on three displays.
Gamers. Finally, there are gamers, for whom the graphics processing unit (GPU) matters most of all. If you had to choose between a high-end system from 2018 with a 2022 GPU and a brand-new system with the highest-end GPU you could get in 2018, the former would be the better gaming rig.
Consumer-grade graphics cards serve gaming and light content creation; professional workstations call for specialised cards designed for scientific computation and artificial intelligence. In this guide, as in our reviews, we’ll focus on the former, though we’ll touch on workstation cards later. The main sub-brands to know are Nvidia’s GeForce and AMD’s Radeon RX on the consumer side, and Nvidia’s Titan and Quadro (now RTX A-Series), along with AMD’s Radeon Pro and Radeon Instinct, in the pro workstation field. In both markets, Nvidia is the undisputed leader.
For the time being, we’ll be focusing on consumer cards. In early 2022, Nvidia’s consumer lineup is split into two distinct classes, GeForce GTX and GeForce RTX, both under the long-running GeForce brand. AMD’s consumer cards, on the other hand, comprise the Radeon RX and RX Vega families, as well as the end-of-life Radeon VII. Before delving into each line, let’s take a look at some of the most critical factors to keep in mind when shopping for a video card.
Target Resolution and Monitor Tech: Your First Considerations
Resolution is the number of pixels, horizontal by vertical, that your video card draws to your monitor. How well a card can sustain gameplay at a given resolution is directly related to its price and performance.
One of the most important things to consider when buying a video card is the resolution it works best at. Even low-end graphics cards can display ordinary programmes at 3,840 by 2,160 pixels (a.k.a. 4K). What those GPUs lack is the muscle to maintain smooth frame rates at such high resolutions in demanding PC games. The video card renders the on-screen image in real time, calculating on-screen coordinates, geometry, and lighting; the higher the in-game detail level and monitor resolution, the more graphics-card power you need.
Resolution Is a Key Decision Point
1080p, 1440p, and 2160p (a.k.a. 4K, 3,840 by 2,160 pixels) are the three most popular screen resolutions among today’s PC gamers. Generally speaking, you’ll want a graphics card that matches your monitor; for the best picture, run the display at its “native” resolution.
Ultra-wide-screen monitors use in-between resolutions (3,440 by 1,440 pixels is a common one); you can compare them with the more common 1080p, 1440p, and 2160p by calculating their raw pixel counts (multiply the horizontal number by the vertical one). If you’re looking for the finest graphics cards for 1080p or 4K gaming, check out our specific roundups.
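To make that comparison concrete, here’s a quick sketch (in Python, using the resolution figures cited above) of how the raw pixel counts stack up:

```python
# Raw pixel counts for common gaming resolutions: horizontal x vertical.
resolutions = {
    "1080p (1,920 by 1,080)": (1920, 1080),
    "1440p (2,560 by 1,440)": (2560, 1440),
    "ultrawide (3,440 by 1,440)": (3440, 1440),
    "4K (3,840 by 2,160)": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    pixels = width * height
    # Express each count relative to 1080p to show how the GPU load scales.
    print(f"{name}: {pixels:,} pixels ({pixels / (1920 * 1080):.2f}x 1080p)")
```

That ultrawide 3,440-by-1,440 panel works out to roughly 2.4 times the pixels of 1080p, which is why it sits between 1440p and 4K in GPU demand (4K is a full 4x 1080p).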
What’s the big deal? When it comes to PC gaming, the power of the components within your next PC should be distributed in a way that best suits the way you want to play, whether you purchase one, construct one, or upgrade.
Here’s how it works, without getting too bogged down in the weeds: at 1080p, especially at maximum detail settings, the high frame rates you’ll see are rarely the work of the GPU alone; the CPU shoulders much of the load.
The 1440p resolution is next, which begins to divide the workload when playing at higher fidelity settings. While some games begin to place greater demands on the GPU, others can continue to do the heavy math on the CPU. (It all depends on the developer’s optimization of the game.) Then there’s 4K resolution, where the GPU is doing nearly all the heavy lifting.
To get better performance out of games at a higher-than-recommended resolution, you can always lower the detail levels, or lower the resolution altogether. However, that negates the purpose of buying a powerful graphics card to some extent. You don’t need to spend $1,000, or even half that, to play well at 1080p; the highest-end cards are designed for 4K play or for very high refresh rates at 1080p or 1440p.
In a nutshell, you should acquire a graphics card suited to the display you anticipate using in the near future. According to the Steam Hardware Survey, 1080p is still the most popular display resolution among PC gamers, and 4K remains a distant dream for the majority. (At the beginning of 2022, the survey showed fewer than 3% of users playing in 4K.)
High-Refresh Gaming: Why High-End GPUs Matter
Another gaming trend to keep an eye on is high-refresh gaming monitors, which have garnered significant traction in recent years. 60Hz (60 screen refreshes per second) was the standard for most PC monitors for a long time, but that was before esports took off.
If you’re into esports and high-refresh gaming, you can find panels that refresh at up to 360Hz. On a 60Hz display, any frames a high-end video card renders beyond 60fps are effectively wasted; a high-refresh display lets you see them, in the form of smoother in-game motion.
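Another way to see what a high-refresh panel asks of a GPU: each step up in refresh rate shrinks the time budget the card has to deliver a frame. A minimal sketch (the refresh rates below are common panel tiers, not figures from any specific monitor):

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds the GPU has to deliver each frame to keep pace with a panel."""
    return 1000.0 / refresh_hz

# Common panel tiers, from the old 60Hz standard up to esports panels.
for hz in (60, 144, 240, 360):
    print(f"{hz}Hz panel: {frame_budget_ms(hz):.2f}ms per frame")
```

At 360Hz the card has under 3ms per frame, versus almost 17ms at 60Hz, which is why esports-grade refresh rates demand so much more GPU (and CPU) headroom.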
As a result of esports success stories like Bugha becoming a multi-millionaire overnight, the demand for high refresh monitors has increased significantly in recent years. While 1080p is still the most popular choice for competitive gamers across all genres, many are now following the lead of monitor manufacturers and upgrading to higher resolutions.
Many gamers are now opting for higher resolutions like 1440p (played in either 16:9 or 21:9 aspect ratios), thanks to monitors such as the ViewSonic Elite XG270QG that combine high refresh rates with high-quality panels. This is helping to fuel the growth of the 1440p gaming market; the cards and the panels are playing a game of leapfrog with each other.
A higher resolution can help you pick out opponents with pinpoint accuracy. That said, many aspiring esports players, and plenty of working professionals, still swear by playing Counter-Strike: Global Offensive at resolutions as low as 720p. Your mileage may vary depending on how you play and the games you favour.
Extreme refresh rates aren’t a big deal for casual gamers, but in fast-paced games and esports they can make a real difference. Check out our top recommendations for the best gaming displays, including high-refresh monitors. Even at a “pedestrian” resolution like 1080p, pairing a high-refresh panel with a fast video card that can push high frame rates can be an advantage.
HDR Compatibility
Finally, don’t forget HDR compatibility. Even our recent Editors’ Choice picks among gaming displays all support HDR to some extent. In our tests, HDR10-only and DisplayHDR 400 monitors show little real benefit to HDR image quality, while monitors rated DisplayHDR 600 or above are worth considering as both gaming displays and HDR-enhanced content displays.
For HDR, make sure the monitor you select supports a refresh rate and colour depth your new card can drive. Lining the two up is a bit of a dance, but it can pay off handsomely for both content creation and gaming.
FreeSync vs. G-Sync: Jets! Sharks! Maria?
Which of these two venerable specifications should you look for in a graphics card if you want a smoother gaming experience? Your monitor may decide for you.
Both FreeSync (from AMD) and G-Sync (from Nvidia) are variations on adaptive sync technology. With adaptive sync, the monitor’s refresh rate fluctuates to match the video card’s output at any given moment, allowing for smooth gameplay. Without it, frame-rate wobbles can lead to artefacts, visual stuttering, or screen tearing, in which mismatched screen halves appear for a brief moment. With adaptive sync, the monitor draws a full frame only when the video card can deliver it.
Your display may support FreeSync or G-Sync, or neither. FreeSync is far more prevalent than G-Sync because it doesn’t increase a monitor’s manufacturing cost. Based on that, you might prefer one GPU maker’s products over the other’s, but keep in mind that the landscape is shifting. Thanks to a driver update Nvidia announced at CES 2019, FreeSync-compatible monitors can now use adaptive sync with late-model Nvidia GeForce cards, and Nvidia has certified a growing number of FreeSync monitors as “G-Sync Compatible.” As a result, the decision may not be as cut-and-dried as it once was.
Even if you’re a professional CS:GO or Overwatch player, you may not notice any meaningful difference between the two on the latest monitors. Screen tearing was a stubborn problem for years, but these days only an expert eye can tell a G-Sync panel from a FreeSync one.
Meet the Radeon and GeForce Families
Now that we’ve covered how these two warring gangs have converged in recent years, let’s talk about the differences between their lineups. GPU lines from the two major graphics-chip vendors are constantly changing, ranging from low-end models for low-resolution gaming up to high-end models for 4K gaming and/or very high refresh rates. First, let’s take a look at what Nvidia has to offer. (Again, keep in mind that MSRPs are only “recommended” prices these days, and that today’s street prices can diverge wildly from the list prices we mention!)
A Look at Nvidia’s Lineup
As of late 2020, the company’s main card stack was divided between the GeForce RTX 20-series cards and the GTX 16-series cards, both based on the “Turing” architecture. The GeForce RTX 30 Series cards, the most recent additions, are high-end cards based on GPUs using the “Ampere” architecture.
An overview of the Turing, Ampere, and Pascal families’ current relevant card classes, their rough cost, and their use cases may be seen below…
The Nvidia GeForce Card Lineup, Early 2022
NORMAL RANGE (MSRP) | FAMILY | VR SUITABILITY | GAMING RESOLUTION SUITABILITY | |
---|---|---|---|---|
GeForce GT 1030 | $70 to $100 | Pascal | No | Low-detail 720p gaming; multi-display situations |
GeForce GTX 1050 | $125 to $160 | Pascal | No | 720p or low-detail 1080p |
GeForce GTX 1050 Ti | $150 to $200 | Pascal | Borderline (Rift); No (Vive) | Moderate-detail 1080p |
GeForce GTX 1650 | $150 to $180 | Turing | No | Moderate-detail 1080p |
GeForce GTX 1650 Super | $160 to $180 | Turing | No | Moderate-detail 1080p |
GeForce GTX 1660 | $220 to $250 | Turing | Yes | High-detail 1080p |
GeForce GTX 1660 Super | $210 to $290 | Turing | Yes | High-detail 1080p, moderate-detail 1440p |
GeForce GTX 1660 Ti | $280 to $330 | Turing | Yes | High-detail 1080p; moderate-detail 1440p |
GeForce RTX 2060 | $350 to $470 | Turing | Yes (VirtualLink on some cards) | High-detail 1080p or 1440p |
GeForce RTX 2060 Super | $379 to $499 | Turing | Yes (VirtualLink on some cards) | High-detail 1080p or 1440p; moderate-detail 4K |
GeForce RTX 2070 | $500 to $700 (end of life) | Turing | Yes (VirtualLink) | High-detail 1080p or 1440p; moderate-detail 4K |
GeForce RTX 2070 Super | $499 to $699 | Turing | Yes (VirtualLink) | High-detail 1080p or 1440p; moderate-detail 4K |
GeForce RTX 2080 | $700 to $900 (end of life) | Turing | Yes (VirtualLink) | High-detail 1080p, 1440p, or 4K |
GeForce RTX 2080 Super | $699 | Turing | Yes (VirtualLink) | High-detail 1080p, 1440p, or 4K |
GeForce RTX 2080 Ti | $1,000 to $1,500 | Turing | Yes (VirtualLink) | High-detail 1080p, 1440p, or 4K |
GeForce RTX 3050 | $249 to $489 | Ampere | Yes | High-detail 1080p, medium-detail 1440p |
GeForce RTX 3060 | $299 to $399 | Ampere | Yes | High-detail 1080p, 1440p, or low-detail 4K |
GeForce RTX 3060 Ti | $399 to $499 | Ampere | Yes | High-detail 1080p, 1440p, or low-detail 4K |
GeForce RTX 3070 | $499 to $649 | Ampere | Yes | High-detail 1080p, 1440p, or 4K |
GeForce RTX 3070 Ti | $599 to $899 | Ampere | Yes | High-detail 1080p, 1440p, or 4K |
GeForce RTX 3080 | $699 to $799 | Ampere | Yes | High-detail 1080p, 1440p, or 4K |
GeForce RTX 3080 Ti | $1,199 to $1,699 | Ampere | Yes | High-detail 1080p, 1440p, or 4K |
GeForce RTX 3090 | $1,499 to $1,799 | Ampere | Yes | High-detail 1080p, 1440p, or 4K |
If you’ve been following the market for a while, you’ll notice that many of the older GeForce GTX Pascal cards, such as the GTX 1070 and GTX 1080, aren’t included here. With the success of their GeForce RTX 20-series successors, those cards are no longer in production and can be found only on the secondary market. The GeForce GTX 1060 met a similar demise with the debut of the GeForce GTX 1660 and GTX 1660 Ti, while the GTX 1050 has since lost its relevance to the GTX 1650 and GTX 1650 Super.
Here is a brief introduction to Turing. Nvidia’s RTX 20-series GPU debut in September 2018 drew a mixed response. The company was touting some of the most powerful GPUs available, along with cutting-edge features like real-time ray tracing and Deep Learning Super Sampling (DLSS). To be fair, no games supported ray tracing or DLSS when the Turing chips launched, and even now, games that support DLSS 2.0 and/or ray tracing remain a small percentage of the market.
However, compared with previous generations, Nvidia shifted the goalposts for high-end GPU pricing. Its new flagship, the GeForce RTX 2080 Ti, listed for almost $1,000, and the GeForce RTX 2080 was not much cheaper at $699.
In mid-2019, when AMD launched its Radeon RX 5700 and Radeon RX 5700 XT midrange GPUs, Nvidia countered by releasing the GeForce RTX 2060 Super, RTX 2070 Super, and RTX 2080 Super. Nvidia’s Super cards, which exist in both the RTX and GTX lines, improve on the specifications of the cards they replace in the lineup (some more effectively than others).
All of this leads to the GeForce RTX 30 Series, which debuted in September 2020 with the GeForce RTX 3070, GeForce RTX 3080, and GeForce RTX 3090. The RTX 3060 Ti, RTX 3060, RTX 3070 Ti, RTX 3080 Ti, and RTX 3050 followed one after another. They deserve their own breakout. (Note: The RTX 3060 and RTX 3050 are available only as third-party cards; there are no Founders Editions of those two.)
Nvidia GeForce RTX 30 Series: Selected Cards Compared
Nvidia GeForce RTX 3060 Ti Founders Edition | Nvidia GeForce RTX 3070 Founders Edition | Nvidia GeForce RTX 3080 Founders Edition | Nvidia GeForce RTX 3090 Founders Edition | |
---|---|---|---|---|
Architecture | Ampere | Ampere | Ampere | Ampere |
Manufacturing Process / Die Size | 8nm / 392.5mm2 | 8nm / 392.5mm2 | 8nm / 628.4mm2 | 8nm / 628.4mm2 |
Transistor Count | 17.4 billion | 17.4 billion | 28.3 billion | 28.3 billion |
Processing Cores | 4,864 | 5,888 | 8,704 | 10,496 |
GPU Boost Clock | 1,665MHz | 1,730MHz | 1,710MHz | 1,700MHz |
RT Cores (2nd Gen) | 38 | 46 | 68 | 82 |
Tensor Cores (3rd Gen) | 152 | 184 | 272 | 328 |
Video Memory | 8GB GDDR6 | 8GB GDDR6 | 10GB GDDR6X | 24GB GDDR6X |
Memory Interface / Bandwidth | 256-bit / 448GBps | 256-bit / 448GBps | 320-bit / 760GBps | 384-bit / 936GBps |
Board Power | 200 watts | 220 watts | 320 watts | 350 watts |
Power Connectors | One eight-pin (via 12-pin adapter) | One 12-pin | One 12-pin | One 12-pin |
Launch Price | $399 | $499 | $699 | $1,499 |
The RT cores are now in their second generation and the Tensor cores in their third, and on the higher-end cards the memory type has been upgraded from GDDR6 to GDDR6X; the GPUs themselves are made on Samsung’s 8nm process. As a result, everything from the placement of various modules and chips on the board to the workings of an entirely new heatsink has been rethought on these reworked boards.
As for the 30 Series’ impact on Nvidia’s card pricing: we still consider the GeForce GT 1030 and GTX 1050 the relative low end, at around $100 or so. With list prices ranging from about $150 to $400, Nvidia’s current low-to-midrange lineup comprises the GTX 1650 and GTX 1650 Super, the RTX 3050, the RTX 3060, and the RTX 2060 Super.
The GeForce RTX 30 Series has added a new layer of complexity to the midrange and high-end GPU market. If you’re looking for an “elite” high-end GPU, the GeForce RTX 3080 and RTX 3090 are the ones to get, starting at $699 (MSRP!) and rising from there. That separates them from more moderate (but still powerful) options like the GeForce RTX 2060, RTX 2070, RTX 2080, and RTX 3060. In the present pricing structure, those cards typically start around $350 MSRP and can reach $650 MSRP depending on the model.
A Look at AMD’s Lineup
As for AMD’s card classes, in 2022 the company is stronger than it has been for some time, competing ably enough with Nvidia’s low-end, mainstream, and high-end cards.
The AMD Radeon Card Lineup, Early 2022
NORMAL RANGE (MSRP) | FAMILY | VR SUITABILITY | GAMING RESOLUTION SUITABILITY | |
---|---|---|---|---|
Radeon RX 550 | $85 to $140 (end of life) | Polaris | No | 720p or low-detail 1080p |
Radeon RX 560 | $100 to $170 (end of life) | Polaris | No | 720p or low-detail 1080p |
Radeon RX 570 | $150 to $210 (end of life) | Polaris | No | 720p or moderate-detail 1080p |
Radeon RX 5500 XT | $170 to $200 | Navi | Yes | Moderate-detail 1080p, low-detail 1440p |
Radeon RX 580 | $170 to $220 (end of life) | Polaris | Yes | High-detail 1080p; moderate-detail 1440p |
Radeon RX 590 | $190 to $240 (end of life) | Polaris | Yes | High-detail 1080p; moderate-detail 1440p |
Radeon RX 5600 XT | $280 to $340 | Navi | Yes | High-detail 1080p; moderate-detail 1440p |
Radeon RX Vega 56 | $280 to $440 (end of life) | Vega | Yes | High-detail 1080p or 1440p |
Radeon RX 5700 | $349 | Navi | Yes | High-detail 1080p or 1440p |
Radeon RX Vega 64 | $380 to $600 (end of life) | Vega | Yes | High-detail 1080p/1440p; moderate-detail 4K |
Radeon RX 5700 XT | $399 | Navi | Yes | High-detail 1080p/1440p; moderate-detail 4K |
Radeon RX 6500 XT | $199 and up | Big Navi | Yes | High-detail 1080p, low-detail 1440p |
Radeon RX 6600 | $329 and up | Big Navi | Yes | High-detail 1080p or 1440p |
Radeon RX 6600 XT | $459 and up | Big Navi | Yes | High-detail 1080p or 1440p |
Radeon RX 6700 XT | $479 and up | Big Navi | Yes | High-detail 1080p/1440p; moderate-detail 4K |
Radeon RX 6800 | $449 and up | Big Navi | Yes | High-detail 1080p/1440p; moderate-detail 4K |
Radeon RX 6800 XT | $549 and up | Big Navi | Yes | High-detail 1080p/1440p; moderate-detail 4K |
Radeon RX 6900 XT | $999 and up | Big Navi | Yes | High-detail 1080p/1440p; moderate-detail 4K |
Radeon VII | $549 to $699 (end of life) | Vega | Yes | High-detail 1080p/1440p; moderate-detail 4K |
AMD’s 1080p lineup has been bolstered by the newer Radeon RX 5500 XT and RX 5600 XT, and the still newer Radeon RX 6600 and RX 6500 XT. Its older RX 570 through RX 590 cards remain serviceable for 1080p gaming, but their time is running out. The RX 580 is still a decent deal at 1080p, and the RX Vega 56 and RX Vega 64 can handle both 1080p and 1440p, but these cards are clearly of a bygone era.
Indeed, AMD’s 1080p and 1440p cards have undergone a major overhaul. The first of the company’s long-awaited 7nm “Navi” midrange cards, built on its new Radeon DNA (RDNA) architecture, arrived in July 2019. The Radeon RX 5700, Radeon RX 5700 XT, and Radeon RX 5700 XT Anniversary Edition were the first three cards released, all aimed at the 1440p gaming market; each can deliver AAA games at better than 60 frames per second at that resolution.
Across late 2020 and the first half of 2021, AMD’s Radeon RX 6700 XT, RX 6800, RX 6800 XT, and RX 6900 XT took the fight to Nvidia. These RDNA 2-based cards offer better value for the money than the previous RDNA generation, although they lag behind Nvidia’s 30 Series Founders Edition cards in design, cooling, and driver stability.
AMD pulled the curtain off RDNA 2 in late 2020. The architecture powers AMD’s discrete desktop graphics cards as well as the Sony PS5 and Xbox Series X consoles, and it aims to keep the firm in close competition with Team Green by enhancing many of the capabilities that debuted in RDNA, including support for Microsoft’s DX12 Ultimate API and dedicated ray-tracing hardware. The first batch of RDNA 2 cards is compared below…
AMD Radeon RX 6000 Series: Selected Cards Compared
AMD Radeon RX 6800 | AMD Radeon RX 6800 XT | AMD Radeon RX 6900 XT | |
---|---|---|---|
Architecture | RDNA 2 | RDNA 2 | RDNA 2 |
Manufacturing Process / Die Size | 7nm / 519mm2 | 7nm / 519mm2 | 7nm / 519mm2 |
Transistor Count | 26.8 billion | 26.8 billion | 26.8 billion |
Processing Cores | 3,840 | 4,608 | 5,120 |
GPU Boost Clock | 2,105MHz | 2,250MHz | 2,250MHz |
RT Cores | 60 | 72 | 80 |
Video Memory | 16GB GDDR6 | 16GB GDDR6 | 16GB GDDR6 |
Memory Interface / Bandwidth | 256-bit / 512GBps | 256-bit / 512GBps | 256-bit/512GBps |
Board Power | 250 watts | 300 watts | 300 watts |
Power Connectors | Two eight-pin | Two eight-pin | Two eight-pin |
Launch Price | $579 | $649 | $999 |

Graphics Card Basics: Understanding the Core Specs
Our comparison charts above should help you narrow down your options based on your monitor’s resolution, the card families you’re interested in, and your budget. When comparing individual cards, though, the graphics processor’s clock speed, the amount of onboard VRAM (video memory), and, of course, the price are all important considerations.
Clock Speed
When comparing cards built around the same GPU from the same series, higher base clock speeds (the speed at which the graphics core runs) and more cores indicate a faster card. But only cards from the same product family, using the same GPU, can be compared this way. For example, the Founders Edition GeForce RTX 3080 has a rated boost clock of 1,710MHz, while MSI’s Gaming X Trio version of the RTX 3080 (using the same chip) is rated at 1,815MHz in its out-of-the-box Gaming mode.
That base clock measurement doesn’t include the graphics chip’s “boost clock,” the speed the chip can momentarily reach when it is under load and thermal conditions permit. The boost clock, too, can differ from one card in a family to the next: a card’s cooling hardware and factory settings both affect it, and boost clocks for a given GPU tend to be higher on high-end partner cards with multi-fan coolers.
That’s not even counting AMD’s own GPU speed metric, the “game clock.” AMD defines it as the average clock speed gamers should expect to see across a wide range of titles; the company’s engineers derived it by testing 25 different games on its RDNA and RDNA 2 cards. Just don’t weigh game clocks against boost or base clocks, because the game clock is neither.
Understanding Onboard Video-Card Memory
Onboard video memory (also referred to by the old-school name “frame buffer”) is usually matched to the needs of the games or programmes the card is built to run. As long as you’re playing PC games at resolutions and detail levels appropriate to the card, you can assume it has enough VRAM to keep up. If a card maker overprovisioned a card with more memory than it could realistically use, the card’s price and competitiveness would suffer. There are, however, some kinks in this plan.
Generally speaking, a card built for 1080p gaming these days has 4GB or 6GB of VRAM, whereas cards geared to 1440p or 4K gaming have 8GB or more. For the most part, the amount of memory on cards based on a given GPU is the same across the board.
There are a few wrinkles: in some isolated but crucial cases, card manufacturers offer variations of a card with the same GPU but different amounts of VRAM. Cards based on the Radeon RX 5500 XT, for example, come in 4GB and 8GB configurations. So keep an eye on how much memory a given card carries; the less expensive versions offer less of it.
A video card with at least 4GB of memory should be considered mandatory if you’re after an all-out 1080p experience, and AMD and Nvidia now include more VRAM than that on their $200-plus-MSRP GPUs. (Nvidia’s GeForce RTX 3090 has 24GB of GDDR6X memory, while AMD’s RX-series cards range from 8GB up to 16GB on the top-of-the-line models.) Any card with less than 4GB should be reserved for low-resolution gaming and basic or older games that don’t demand much of the hardware.
It’s a whole other ballgame for creators. In many 3D rendering programmes (as well as VFX work, modelling, and video editing), the amount of onboard VRAM matters more than the boost clock. In most circumstances, the more VRAM, the faster the memory, and the wider the bandwidth pipe a card has, the better it is at rendering out a complicated VFX scene with thousands, if not millions, of distinct elements.
Memory bandwidth is another of the many specifications you'll come across. It's a measure of how fast data can flow into and out of the graphics processing unit (GPU). Because AMD and Nvidia use different designs that, at times, demand different amounts of memory bandwidth, these figures can't be compared directly between the two companies' cards.
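For the curious, peak memory bandwidth is simple arithmetic: the per-pin data rate multiplied by the memory bus width, divided by eight bits per byte. A quick sketch, using the RTX 3080's published figures (19Gbps GDDR6X on a 320-bit bus) as the example:

```python
def memory_bandwidth_gbps(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, over 8 bits/byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

# 19Gbps GDDR6X on a 320-bit bus works out to 760 GB/s of peak bandwidth.
rtx_3080_bw = memory_bandwidth_gbps(19, 320)
```

This is why a wider bus can matter as much as faster memory chips: either factor scales the result.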
The type of memory in your next GPU is also a vital consideration. Even though you won't really have a choice within a given card line, understanding which memory type you're buying into matters, depending on the kinds of games or programmes you want to run.
Take AMD's Radeon VII: it can theoretically keep up with an RTX 2080 in games, and it also carries 16GB of HBM2 VRAM for content creators to take advantage of. HBM2's enormous bandwidth makes it attractive for tasks like video editing in Adobe Premiere Pro, but it is expensive to manufacture, and the cheaper GDDR memory standards have since overtaken it in popularity.
Modern GPUs have relied on GDDR6 memory for the last few generations, from the RTX 2080 Ti and RTX Titan all the way down to the AMD Radeon RX 5600 XT. If you're looking for a VRAM solution that can run even the most demanding AAA games at high resolution for a reasonable price, GDDR6 is an excellent choice. AMD's Radeon RX 6000 Series GPUs use GDDR6, which would seem like a solid upgrade point if Nvidia's most recent cards didn't already go one step further.
Nvidia has begun using GDDR6X memory in its GeForce RTX 30 Series graphics cards. It substantially increases the original GDDR6 design's possible bandwidth without running into the signal deterioration or crosstalk issues that a simple clock-speed increase would bring.
Upgrading a Pre-Built Desktop With a New Graphics Card
For the most part, pre-built computers these days have the cooling capacity to accommodate a new discrete GPU without any issues, but you'll still want to check a few things before you buy.
Before purchasing or upgrading a graphics card, measure the available space in your chassis. It's not uncommon for a gap to exist between the motherboard's far edge and the hard drive bays, and in some cases, you may have only a few millimetres to spare on your GPU's total length. In rare circumstances, extremely long cards can be a concern. (Our top picks for small-form-factor graphics cards may be found here.)
Next, consider the height of your graphics card. Card partners often fit their own coolers that deviate from the standard AMD and Nvidia reference designs. If the card you choose has an ornate cooler design, make sure it isn't too tall for your case to close.
Finally, we come to the power supply unit (PSU). You’ll need a power supply capable of supporting a new graphics card in your system. If you’re upgrading a pre-built PC that came with a low-end video card or no card at all to a high-end video card, be very cautious about this. The odds are against you if you’re looking at a low-cost or business-oriented PC, as their power supplies tend to be underpowered or underequipped.
Aside from the maximum wattage of your power supply, the number of six-pin and eight-pin connectors on your PSU is also an important consideration. A six-pin power connector for a video card is a common feature in most modern systems, including those sold by OEMs like Dell, HP, and Lenovo, and some have both a six-pin and an eight-pin connector.
A six-pin cable, an eight-pin cable, or a mix of the two is required to power midrange and high-end graphics cards. (Except for the lowest-end cards, which rely entirely on the PCI Express slot for power.) Make sure you know what kind of connectors your card requires before you buy it.
We've noticed some changes here of late: Some GeForce RTX cards require a special adapter (it comes in the box) to turn two eight-pin PSU connectors into a single 12-pin one card-side, and MSI's 12.7-inch GeForce RTX 3080 Gaming X Trio (along with a few other high-end monsters) now requires three eight-pin connectors to suck down its required juice. And that isn't the only new RTX card demanding a third PSU connector; there are others as well.
The recommended power supply wattage for each of Nvidia's and AMD's graphics-card families is listed on their respective websites. Take these figures seriously, but keep in mind that they are merely guidelines, and generally conservative ones. Do not use a 300-watt power supply if AMD or Nvidia says you need at least a 500-watt unit to run a specific GPU, but also know that you do not need an 800-watt power supply to guarantee ample headroom. Always check the power-supply recommendation for the exact card you are looking at before purchasing a third-party version of a given GPU.
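As a rough illustration of how those conservative guidelines come about, you can sum the big power draws, budget a flat figure for the rest of the system, and apply a headroom factor. This is a hypothetical back-of-the-envelope sketch; the 1.4 headroom factor and component wattages are illustrative assumptions, not vendor specs.

```python
import math

# Common retail PSU capacities to round up to (illustrative list).
COMMON_PSU_WATTS = [450, 550, 650, 750, 850, 1000]

def suggest_psu_watts(cpu_w: int, gpu_w: int, rest_w: int = 75,
                      headroom: float = 1.4) -> int:
    """Suggest a PSU size: total estimated draw times a headroom factor,
    rounded up to a common retail capacity."""
    target = (cpu_w + gpu_w + rest_w) * headroom
    for size in COMMON_PSU_WATTS:
        if size >= target:
            return size
    # Beyond the list, fall back to the next 50W step.
    return int(math.ceil(target / 50) * 50)

# e.g. a 105W CPU paired with a 320W GPU:
suggestion = suggest_psu_watts(105, 320)
```

The headroom accounts for transient power spikes and PSU efficiency curves; always defer to the card maker's own recommendation when it's higher.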
SLI, CrossFireX, and NVLink: Fading for Gamers
Both AMD and Nvidia have been moving away from supporting dual-, tri-, or even quad-card systems over the last few generations. These have traditionally been a pricey, but rather simple, approach to boost performance. However, the value proposition is no longer there (especially for PC gamers).
When we added a second RTX 2080 Ti to our testing in 2019, we discovered mixed results, to put it bluntly. Today's games rarely make use of multiple GPUs (via Nvidia's SLI and NVLink, or AMD's CrossFireX), and those that do don't show consistent speed improvements as a result. Several games actually run slower because of poor engine optimization for multi-card setups.
It's a different scenario, however, when it comes to content creation. NVLink card pairing is mainly useful to professional creators, which is why Nvidia's GeForce RTX 3090 is the only card in its current portfolio that supports it. For those users, a second card can be a good investment.
To sum it up? It is almost always better to buy the most powerful single card you can afford than a less expensive one you plan to pair with a second card in the future.
Ports and Preferences: What Connections Should My Graphics Card Have?
DVI, HDMI, and DisplayPort are the three most prevalent types of ports found on the back of today’s video cards. DVI is still used in some systems and monitors, but it is the oldest of the three standards and no longer appears on high-end graphics cards today.
There are usually three DisplayPorts (sometimes four) and one HDMI port on most graphics cards, and there are various distinctions between the two standards. The first requirement: If you intend to use a 4K display today or in the near future, your graphics card must support at least HDMI 2.0a or DisplayPort 1.2/1.2a. Without that, smooth 4K playback or gaming will be a challenge; HDMI 2.0b or DisplayPort 1.4 is better still. (The most recent generation of cards from both manufacturers will suffice.)
All of Nvidia's GeForce RTX 30 Series cards support the latest HDMI 2.1 specification, which raises the bandwidth limit from HDMI 2.0's 18Gbps to 48Gbps. That's enough to drive an 8K display at a refresh rate of up to 60Hz, or 4K at up to 120Hz.
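You can sanity-check those bandwidth figures by computing the raw pixel data rate a given display mode requires. This is a simplified sketch: it ignores blanking intervals and link-encoding overhead, which is why real HDMI links need more headroom than the raw number suggests.

```python
def raw_video_gbps(width: int, height: int, refresh_hz: int,
                   bits_per_pixel: int = 24) -> float:
    """Uncompressed pixel data rate in Gbps (ignores blanking and
    link-encoding overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 60Hz with 8-bit RGB needs ~11.9Gbps of raw pixel data, which
# (with overhead) fits inside HDMI 2.0's 18Gbps link. 4K at 120Hz needs
# double that, pushing it firmly into HDMI 2.1 territory.
uhd60 = raw_video_gbps(3840, 2160, 60)
uhd120 = raw_video_gbps(3840, 2160, 120)
```

Higher bit depths (10-bit HDR, for instance) scale the requirement up further, which is part of why HDMI 2.1's 48Gbps ceiling matters.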
Nvidia's 20-series GeForce RTX Turing cards introduced a connector called VirtualLink, which looks and functions like a USB Type-C port and supports DisplayPort over USB-C. The port was really built for virtual-reality (VR) headsets, offering enough power and bandwidth to meet the demands of VR head-mounted displays (HMDs) over a single cable. It's a wonderful feature in theory, but it isn't present on the Founders Edition cards of the RTX 30 series, so its future is uncertain.
Looking Forward: Graphics Card Trends
For the first time in years, Nvidia and AMD are engaged in a true head-to-head battle over consumer video cards, stirring things up between the two major players.
Nvidia DLSS, Super Resolution, and Image Sharpening Technology
The arrival of image-sharpening technologies such as AMD's Radeon Image Sharpening (RIS), FidelityFX with CAS (also from AMD), and Nvidia's Freestyle has had a significant impact on the gaming landscape in the last year or two. So what are these tools, and how can they benefit gamers shopping for a card on a tight budget?
Render scaling is at the heart of it all. Most recent games offer a graphics option to modify the size of the game's rendered area. You might, for example, keep the game's output resolution at 2,560 by 1,440 pixels while reducing the "render" resolution to something like 2,048 by 1,152.
Then again, who would intentionally make their game look worse? Anyone using a game sharpener. Lowering a game's render resolution increases the frame rate (lower resolutions take less GPU muscle to draw), while an image-sharpening technology cleans things up afterward at a minimal performance cost.
A sharpening filter restores detail to the downsampled image as part of that "cleaning things up" process. Set it just right (an 85% down-render with a 35% sharpen scale is a popular ratio), and you can theoretically gain a large amount of performance without sacrificing much visual clarity. What's the significance of this? If you can lower your game's render resolution without sacrificing visual quality, you can get by with a cheaper video card.
During our testing, we determined that the practical limit for down-sampling is roughly 30%. Render a game about 30% below its native resolution, sharpen it back up with one of the aforementioned tools, and you can get close to the same high-definition experience as native rendering—on a card that's nearly a third cheaper than the one you were originally looking at.
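The render-scaling arithmetic above is worth seeing in numbers: because pixel count scales with the square of the linear scale factor, a modest-sounding down-render saves a disproportionate share of GPU work. A minimal sketch, using the article's own 2,560-by-1,440 example:

```python
def scaled_render(width: int, height: int, scale: float) -> tuple[int, int, float]:
    """Return the down-rendered resolution and the fraction of pixels saved.

    `scale` is the linear render-scale factor (e.g. 0.8 for an 80% render).
    """
    w, h = int(width * scale), int(height * scale)
    saved = 1 - (w * h) / (width * height)
    return w, h, saved

# Rendering a 2,560-by-1,440 game at 80% linear scale yields
# 2,048 by 1,152 -- the GPU draws 36% fewer pixels per frame.
w, h, saved = scaled_render(2560, 1440, 0.8)
```

That squared relationship is why even an 85% render scale can deliver a meaningful frame-rate bump.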
Nvidia's DLSS, or "deep-learning supersampling," tackles a related problem: smoothing out the jagged polygon edges around characters and objects. Anti-aliasing is one of the most computationally intensive tasks a graphics card performs, and it has been employed since the dawn of 3D games toward the same goal: making jagged things look smoother.
Nvidia makes use of artificial intelligence to power DLSS. It's expensive for the time being (i.e., it requires a GeForce RTX card) because the work isn't done on regular CUDA cores but on a specialist core called a Tensor core. The Tensor core is to Nvidia's AI neural-network math what the RT core is to ray tracing.
Because so few games support DLSS and its follow-up technology DLSS 2.0, there is a lot of room for growth. With the addition of DLSS 2.0 support to Unreal Engine 4, this is beginning to change, although it will still be some time before it is widely adopted by game developers across all genres and platforms. In our testing, PC Labs observed that Death Stranding, one of the best-known DLSS-capable games, benefited greatly from both DLSS and CAS. Visual quality was actually better with DLSS than with CAS, despite some noticeable artifacts while characters were moving. GeForce RTX owners could benefit greatly if more games adopt DLSS—but that's a significant "if."
AMD threw some punches of its own in 2021 with its FidelityFX Super Resolution technology. Only a handful of games supported the upsampling technology when it launched in mid-2021, but more than 30 titles have been added to the compatibility list since we tested it that June. It works on recent video cards from both Nvidia and AMD, allowing a far wider range of compatibility than DLSS, which requires a GeForce RTX graphics card.
In the first half of 2022, AMD launched its next upscaling weapon: Radeon Super Resolution. It is based on the same principles and algorithms as FidelityFX Super Resolution, but implemented at the driver level. That restricts the GPUs it runs on (only AMD Radeon RX 5000 and RX 6000 models) but opens up thousands of games to the feature.
VR: New Interfaces, New HMDs?
However, two new headsets released in 2019 upped the ante. The Oculus Rift S has a total resolution of 2,560 by 1,440 pixels across both eyes, while the $1,000 Valve Index packs 1,440 by 1,600 pixels per eye (2,880 by 1,600 combined). Neither headset uses VirtualLink, and the newest generation of top-tier GPUs from Nvidia and AMD has dropped the VirtualLink connector from the card's backside.
If you opt to spend the money, you'll need a graphics card that can keep up with the rigorous demands of these newest headsets (80Hz refresh on the Rift S, and 144Hz on the Index). That means a machine that can render two above-1080p views simultaneously, at up to 144fps, in a graphically intense game like Half-Life: Alyx. For the best experience, Valve recommends at least an RTX 2070 Super or an AMD Radeon RX 5700 XT. To summarise: Read the specifications of any headset you are contemplating and strictly adhere to its GPU recommendations. A headache, and even nausea, are all you'll get from a subpar VR experience.
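To see why VR is so much harder on a GPU than flat-screen gaming, compare pixel throughput: total pixels times refresh rate. A quick sketch using the headset figures above:

```python
def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    """Total pixels the GPU must deliver each second for a display."""
    return width * height * refresh_hz

# The Valve Index's combined 2,880-by-1,600 panels at 144Hz demand over
# five times the pixel throughput of a standard 1080p monitor at 60Hz.
index_load = pixels_per_second(2880, 1600, 144)
monitor_load = pixels_per_second(1920, 1080, 60)
ratio = index_load / monitor_load
```

And that's before VR-specific overhead like supersampling and lens-distortion correction, which push the real rendering cost higher still.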
So, Which Graphics Card Should I Buy?
As 2022 wears on and GPUs remain in short supply, the answer to this question is more muddled than ever. New technologies like image sharpening and upscaling are complicating the calculus for high-end, midrange, and budget gamers alike (and opening up possible discounts).
At the high end of the market, Nvidia's RTX 3080, RTX 3080 Ti, and RTX 3090 currently offer the strongest price-to-performance ratios at MSRP. If you're searching for maximum power, Nvidia is your best bet.
As for AMD, the company's Radeon RX 6000 Series brings it back into the high-end market, although it's not quite as competitive as Nvidia's current offerings. And some games still had driver issues with the RX 5000 line a year after the RX 5700 and RX 5700 XT were released, which could be a red flag for certain customers.
Lower down the line, Nvidia and AMD butt heads regularly, each with their own answers to sub-$1,000 PC builders that span a huge number of different card models and available options.
The GPUs we recommend at the top and bottom of this article span the spectrum of budget to high end, representing the full range of the best cards that are “available now.” Note, though: Those quote marks are very intentional.
The Elephant in the Room: Graphics Card Pricing (and Availability) in 2022
Because of a supply crunch that began in 2020 and continued into 2022, discrete graphics cards are only now becoming widely available at all price points, from enterprise to consumer.
This is all you need to know, condensed: First and foremost, it is not the “fault” of any one entity. Not AMD’s, not Nvidia’s, and certainly not Bitcoin’s (on its own). Several elements have come together to create the pricing situation that we are familiar with at the time of this writing in early 2022. Even before the outbreak of the pandemic, experts in the field predicted that we would be heading in this direction.
Every industry that relies on semiconductors, from graphics cards to automobiles, is currently experiencing supply pressure, and the situation is arguably worst in GPUs. The card scarcity of 2017-2018 was driven by the initial surge in cryptocurrency prices; this time around, a crypto boom has coincided with tariffs and a rise in bots designed to get around the shopping-cart captcha limits at retailers like Newegg and Amazon. Cards now sell out within minutes of going on sale, and once stripped of their original packaging, they're resold at a significant markup on eBay and elsewhere—a price some buyers are evidently ready to pay.
Many in the sector expect this particular squeeze to continue for some time, and as long as supply bottlenecks persist, demand for discrete desktop GPUs will outstrip supply. Even as interest in last-generation GPUs fades ahead of a new generation's arrival, prices have stayed stubbornly high.
In the past year, we’ve seen GPUs like the GTX 1080 Ti sell for $100 more on eBay than they did at MSRP when they were first released.
GPUs on eBay at Different Prices
Folks, it’s a wild place out there.
The point is that none of the MSRP-level pricing we've discussed matches the current reality for buyers. Use this guide as a benchmark for comparing the relative performance of the cards you're considering. In the actual market, though? Depending on what time of day you browse Best Buy, Newegg, eBay, or Amazon, you may come across a winning lottery ticket or a losing one. There are no assurances in today's market; whether you'll get the card you want, and at what price, is anybody's guess. The best you can hope for is to land a card for less than the MSRP using our methods. It is doable, but it is not simple.
As a result, "The Best Graphics Cards for 2022" are, to some extent, any cards you can find at a decent price. And if you're intending to wait it out, the next generation of Nvidia GeForce RTX GPUs is expected later in 2022, which should help with supply.