Updated on August 14, 2022
When it comes to buying a new graphics card right now, things aren’t exactly what they used to be. If you’re in the market for a new card, consult this buying guide for tips on how to get a good deal. If you don’t mind waiting a little longer, take a look at our guide on how to get the most out of your existing GPU.
Note that we’ve picked out the best Nvidia and AMD cards at each resolution you might plan to play at (in ascending order), unless one is an unequivocal winner. We’ve included only a small number of third-party cards in this analysis; there are plenty more out there. Because of that, consider our pick of, say, the GeForce GTX 1660 Ti or the Radeon RX 6700 XT as an endorsement of that class of GPU as a whole.
When it comes to PC gaming or video creation, the performance of your graphics-accelerated software is everything—and your video card is the engine that drives it.
Our guide will help you find the best video card for your desktop PC, explain what you need to know about upgrading a system, and show you how to evaluate whether a certain card is a worthwhile buy. Also, keep an eye out for emerging trends, since they may influence which card you choose. Video cards for home use can cost anywhere from $99 all the way up to over $1,499 (and that’s just the MSRP…more on that later). It’s easy to overpay or underbuy, and we won’t let you do either.
Who’s Who in GPUs: AMD vs. Nvidia
Current graphics solutions handle displaying 2D and 3D content, drawing the desktop, and decoding and encoding video in programs and games. Every consumer discrete video card on the market today is powered by a big graphics processing chip from AMD or Nvidia. These processors are known as GPUs (“graphics processing units”), and, confusingly, the term is often applied to the graphics card itself. Graphics cards (or is it GPUs?) aren’t easy to keep straight!
Both companies develop “reference designs” for their GPUs, standardized versions of cards built around a given chip. Nvidia occasionally sells these reference-design cards directly from its own website (AMD does so less often).
Before the GeForce RTX 30 series, the “Founders Edition” branding on Nvidia’s own graphics cards meant little more than slightly elevated clock speeds and sturdy build quality. Early in a new GPU generation, cards tend to hew closely to the reference design, which is why certain third-party solutions look more conservative at first.
Nvidia’s latest Founders Edition GPUs break with that convention in the most radical of ways. The PCB (printed circuit board, or “guts”) inside each RTX 30 Series Founders Edition card is half the size of the previous generation’s and features Nvidia’s exclusive “push-pull” cooling arrangement. AMD may yet match Nvidia’s performance, but for the time being, the RTX 30 Series Founders Edition cards remain the clear frontrunners in design.
So far, we’ve only seen this treatment on Nvidia’s Founders Edition-badged cards, and it shrinks their size and weight to unprecedented levels. AMD and Nvidia “board partners” (companies like Asus, EVGA, MSI, Gigabyte, Sapphire, XFX, and Zotac) also produce cards based on the reference designs. These board partners may sell their own branded versions of the reference card, or they may create custom products with different cooling-fan designs, slight factory overclocks, or features like LED mood lighting. Some partners do both, selling the reference version of a given GPU alongside their own, more radical designs.
Third-party cards like the MSI GeForce RTX 3080 Gaming X Trio 10G, while impressive in their own right, stick to classic GPU design and lack the kind of leaps in cooling and power efficiency we’ve seen in the current-generation Founders Edition cards.
Who Needs a Discrete GPU?
Earlier, we spoke of IGPs (integrated graphics processors). An IGP can suit most users’ everyday needs, but there are three major exceptions to that rule:
Users of high-end workstations. Workers in CAD or in video and photo processing will find a discrete GPU an invaluable tool. One of their primary uses is transcoding video from one format to another, or performing other specialized operations, using GPU resources instead of (or in addition to) CPU resources. Whether that ends up faster depends on a variety of things, including the application in question and the GPU and CPU you currently have.
Multi-Display Users Who Focus on Productivity. A discrete GPU can also help those who need a large number of screens. Desktop operating systems can drive IGP-connected and discrete-GPU-connected monitors at the same time, so you can combine an IGP and a discrete GPU if you want to connect five or six screens to one computer.
You don’t need a high-end graphics card for this, however. If you’re merely displaying business apps, several browser windows, or a mass of static windows across many monitors, all you need is a card that supports the resolutions, monitor interfaces, and number of panels you require. A GeForce RTX 3080, for example, offers no more benefit than a GeForce GTX 1660 with the same supported outputs if you’re showing three web browsers across three display panels.
Gamers. Then there’s the gaming market, where the GPU is unquestionably the most critical component. A top-of-the-line 2018 system fitted with a 2022 GPU would be preferable to the same 2018 system running the highest-end GPU available in 2018.
Graphics cards divide into two distinct categories: consumer cards for gaming and light content creation, and dedicated cards for professional workstations, geared toward scientific computing, calculations, and artificial-intelligence work. This guide and our reviews focus on the former, though we’ll touch on workstation cards a little later on. Nvidia’s GeForce and AMD’s Radeon RX are the key consumer-facing sub-brands, while Nvidia’s Titan and Quadro (now RTX A-Series) and AMD’s Radeon Pro and Radeon Instinct are the key professional-level ones (in the pro workstation field). In both markets, Nvidia is the undisputed leader.
For the time being, we’ll focus on consumer cards. As of early 2022, Nvidia’s consumer graphics cards fall into two distinct classes, both sold under the long-standing GeForce brand: GeForce GTX and GeForce RTX. AMD’s consumer graphics cards, meanwhile, comprise the Radeon RX and (now mostly gone) Radeon RX Vega families, along with the Radeon VII, which is also still around. Here are a few things to keep in mind when buying a new video card, before we get into the specific lines.
Target Resolution and Monitor Tech: Your First Considerations
Resolution is the number of horizontal by vertical pixels at which your video card drives your monitor. It has a direct bearing on how much card you need, and how much you’ll spend, for gaming.
One of the most important things to consider when buying a video card is the resolution you’ll play at. These days, even basic graphics cards can drive everyday software at high resolutions like 3,840 by 2,160 pixels (a.k.a. 4K). In demanding PC games, however, those cards won’t be able to maintain smooth frame rates at such high resolutions. That’s because in games, the video card renders the on-screen image in real time, calculating on-screen coordinates, geometry, and lighting; higher in-game detail levels and monitor resolutions demand more graphics-card power.
Resolution Is a Key Decision Point
The three most popular screen resolutions among today’s gamers are 1080p (1,920 by 1,080 pixels), 1440p (2,560 by 1,440 pixels), and 2160p, or 4K (3,840 by 2,160 pixels). Generally speaking, you’ll want a graphics card that matches your monitor’s native resolution. (The “native” resolution is the highest a panel supports, and the one at which the display looks best.)
You’ll also come across displays with in-between and ultra-wide resolutions (3,440 by 1,440 pixels is a common one); you can compare them to 1080p, 1440p, or 2160p by multiplying the horizontal pixel count by the vertical one to see how many total pixels the card has to push. If you’re looking for the best graphics cards for 1080p or 4K gaming specifically, we’ve got you covered.
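If it helps to see that math, here’s a minimal sketch in Python (the resolutions are the ones named above; using 1080p as the comparison baseline is our own choice for illustration) that totals up the pixels a card has to push at each size:

```python
# Total pixels a GPU must render per frame at common gaming resolutions.
resolutions = {
    "1080p (1,920 x 1,080)": (1920, 1080),
    "1440p (2,560 x 1,440)": (2560, 1440),
    "Ultrawide (3,440 x 1,440)": (3440, 1440),
    "4K (3,840 x 2,160)": (3840, 2160),
}

baseline = 1920 * 1080  # use 1080p as the reference workload

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x the 1080p workload)")
```

Run it and a 3,440-by-1,440 ultrawide lands at roughly 2.4 times the 1080p pixel count: about a third more work than standard 1440p, and about 60% of the 4K workload.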
What’s the point of all of this? When it comes to PC gaming, the power of the components inside your next PC should be balanced to suit the way you want to play, whether you buy a system, build one, or upgrade.
Here’s how it works, without going into too much detail: When playing at 1080p, even at the highest detail levels, frame rates are nearly always determined by a combination of CPU and GPU capabilities, rather than each one being solely responsible.
Next up is 1440p, which begins to shift the burden when you’re playing at high detail levels. Some games start placing greater demands on the GPU, while others can still do the heavy math on the CPU; how well the developer has optimized the game is a factor here. At high resolutions like 4K, the graphics card handles practically all of the processing.
You can always lower the detail levels, or the resolution itself, to get better performance out of games at a higher-than-ideal resolution, but that goes some way toward defeating the point of buying a new graphics card. Unless you want to play at high refresh rates at 1080p or 1440p, you don’t need to spend $1,000, or even half that, to play more than acceptably at 1080p.
As a general rule of thumb, you should buy a graphics card suited to the display you plan to game on. According to the Steam Hardware Survey, 1080p remains the most popular display resolution among PC gamers, and 4K is still a distant dream for the majority. (At the beginning of 2022, the survey showed fewer than 3% of users playing at 4K.)
High-Refresh Gaming: Why High-End GPUs Matter
High-refresh gaming monitors, a recent trend, are another thing to keep an eye on. Until the esports scene really took off, 60Hz (60 screen refreshes per second) was the common ceiling for PC monitors.
Esports-oriented, high-refresh gaming panels can run at 144Hz, 240Hz, or even 360Hz, allowing for a more fluid gaming experience. If your video card can consistently push frames in a game above 60fps, a high-refresh monitor lets you see those previously “wasted” frames as smoother in-game motion.
Demand for high-refresh monitors, which can keep esports hopefuls performing at their peak, has grown in recent years alongside esports success stories (like the instant multi-millionaire status of 16-year-old Fortnite star Bugha). While 1080p is still the most popular choice for competitive gamers across all genres, many are now following the lead of monitor manufacturers and moving up to higher resolutions.
Recent gaming monitors like the ViewSonic Elite XG270QG combine the best of high-refresh and high-quality panels to attract more gamers to the 1440p resolution (in either a 16:9 aspect ratio at 2,560 by 1,440 pixels or a 21:9 aspect ratio at 3,440 by 1,440 pixels). To a degree, the cards and the panels are playing off each other here.
In games like Counter-Strike: Global Offensive, esports hopefuls and professional players alike swear by playing at resolutions as low as 720p, despite the benefits of playing at higher resolutions for those who wish to hit their opponents pixel-perfectly. That said, your mileage may vary based on how you choose to play, as well as the games you prefer to play.
Extreme refresh rates aren’t a big deal for casual gamers, but if you’re playing fast-paced games or competing in esports, they can make a big difference. For the greatest gaming displays, including high-refresh monitors, check out our top recommendations. With a high-refresh monitor and a strong video card, gaming at a “pedestrian” resolution like 1080p can still benefit from high frame rates.
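To put those refresh rates in perspective, here’s a quick back-of-the-envelope sketch (Python, using the refresh rates mentioned above) of how little time the GPU gets to finish each frame if it’s going to feed every panel refresh:

```python
# Frame-time budget: milliseconds available to render each frame if the GPU
# delivers a new frame for every refresh of the panel.
for refresh_hz in (60, 144, 240, 360):
    frame_time_ms = 1000 / refresh_hz
    print(f"{refresh_hz}Hz panel: {frame_time_ms:.2f} ms per frame")
```

A 360Hz panel leaves the card less than 3ms per frame, versus almost 17ms at 60Hz, which is why high-refresh gaming leans so heavily on GPU (and CPU) muscle even at 1080p.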
HDR Compatibility
Finally, keep HDR compatibility in mind. More and more monitors these days, including nearly all of our recent Editors’ Choice gaming monitor picks, offer HDR. And while in our testing HDR 10 and HDR 400 displays don’t often make much of a difference in HDR image quality, any monitor beyond the HDR 600 spec should weigh into your GPU decision, both as a display for gaming and as one for HDR-enhanced video.
Make sure any HDR monitor you buy is matched by a graphics card that can keep up with its refresh rate and bitrate. It’s a bit of a dance, but one that can pay off handsomely for content creation and gaming alike.
G-Sync vs. FreeSync: Jets! Sharks! Maria?
Which of these two venerable specifications should you look for in a graphics card if you want a more fluid gaming experience? It all depends on your monitor.
Nvidia’s G-Sync and AMD’s FreeSync are two sides of the same adaptive-sync coin. Adaptive sync lets the video card control the monitor’s refresh rate, so the display’s refresh varies along with the card’s output throughout the game. Without it, frame-rate wobbles can lead to artifacts, visual stuttering, or screen tearing, in which mismatched halves of frames appear onscreen for a brief moment. With adaptive sync, the monitor draws a full frame only when the video card can deliver a full frame.
Any given monitor may support FreeSync or G-Sync, or neither; far more support FreeSync, because it’s cheaper to implement than G-Sync. That may sway you toward one GPU maker or the other, but keep in mind that the landscape is changing. Nvidia has certified an increasing number of FreeSync monitors as “G-Sync Compatible,” supplying a driver that lets its cards run adaptive sync on those displays. The decision may no longer be as simple as black and white (or red and green).
In recent testing with games like CS:GO and Overwatch, you’re unlikely to notice any difference between the two. Only the most discerning gamers can tell today’s G-Sync Compatible displays apart from monitors released with earlier, full implementations of FreeSync and G-Sync when it comes to reducing screen tearing.
Meet the Radeon and GeForce Families
Now that we’ve discussed how these two warring gangs have converged in recent years, let’s talk about their differences. Each offers a vast range of GPUs, from low-end models for low-resolution gaming to high-end ones for 4K and high-refresh-rate play. Our first stop is Nvidia. (Given how volatile current pricing is relative to list prices, keep in mind that MSRPs are only “recommended” prices these days.)
Our Experts Dig Deep into the Nvidia Product Portfolio
Until late 2020, Nvidia’s GPU stack was divided between older “Turing”-generation GeForce RTX 20-series cards and the newer GTX 1600 series, both powered by the Turing architecture. Most of the latest additions, in the GeForce RTX 30 Series, are higher-end cards that use the “Ampere” GPU architecture.
Below is a brief overview of the key card classes in the “Pascal” (Turing’s predecessor), Turing, and Ampere families, their costs, and their use cases…
The Nvidia GeForce Card Lineup, Early 2022
Older cards like the GeForce GTX 1070 and GTX 1080 aren’t included in this list because they’re no longer in production; with their GeForce RTX replacements long since on the market, by 2022 they can be found mostly secondhand. The GeForce GTX 1060 and GTX 1050 were likewise superseded, by the GTX 1650 and GTX 1660 families, following those cards’ introductions.
But first, a few words on Turing. When Nvidia’s 20-series GPUs were unveiled in September 2018, the response was mixed. They introduced two cutting-edge graphics technologies, DLSS and real-time ray tracing, and were among the most powerful graphics cards on the market at the time. At launch, however, no games supported ray tracing or DLSS, and even two years later, relatively few games used DLSS 2.0 alone or in conjunction with ray tracing.
Nvidia also raised high-end GPU prices significantly compared with previous generations. Its new flagship, the GeForce RTX 2080 Ti, carried a list price of almost $1,000, and the GeForce RTX 2080 wasn’t much cheaper at $699.
Nvidia reacted in 2019 by producing upticked “Super” versions of its existing cards to compete with AMD’s midrange Radeon RX 5700 and Radeon RX 5700 XT graphics cards. These Super cards boost the performance of the card each is meant to replace in the stack, across both the RTX and GTX segments (some more effectively than others).
All of this led up to the GeForce RTX 30 Series, which debuted in September 2020. Nvidia introduced the new GeForce RTX 3070, 3080, and 3090 GPUs, followed one after another by the RTX 3080 Ti, RTX 3070 Ti, and RTX 3060 Ti, each getting its own breakout in the chart below. There is no Founders Edition of the RTX 3060 or RTX 3050, but third-party versions are available instead.
Nvidia GeForce RTX 30 Series: Selected Cards Compared
They represent a generational leap, bringing the RT cores to their second generation and the Tensor cores to their third, moving the memory from GDDR6 to GDDR6X on the top-end cards, and building everything on Samsung’s 8nm manufacturing process. The PCBs were reworked, too, forcing a redesign of everything from the placement of various modules and chips to the inner workings of a brand-new heatsink.
The GeForce GT 1030 and GTX 1050 remain very low-end cards, costing under $100 or a little more, whatever the impact of Nvidia’s 30 Series on prices up and down the stack. Nvidia’s current low-to-midrange, in other words, runs from the GTX 1650 and GTX 1650 Super up through the GTX 1660 Ti, RTX 2060 Super, and RTX 3060.
The GeForce RTX 30 Series has added a new layer of complexity to the midrange and high-end GPU market. The GeForce RTX 3080 and RTX 3090 belong to the “elite” high-end pricing tier, starting at $699 (again: MSRP!) and going up from there. Below them sit the more affordable (but still powerful) RTX 2060, RTX 2070, RTX 2080, RTX 2080 Ti, and RTX 3070, with MSRPs starting around $350 and running all the way up to $650, depending on the specific model.
A Look at AMD’s Lineup
As for AMD’s card classes, in 2022 the company is stronger than it has been for some time, competing ably enough with Nvidia’s low-end, mainstream, and high-end cards.
The AMD Radeon Card Lineup, Early 2022
For 1080p gaming, the Radeon RX 550 and RX 560 are the low end, while the older Radeon RX 570 through RX 590 are the former midrange. Their time is limited, though, given the company’s newer additions to its 1080p arsenal: the Radeon RX 5500 XT and RX 5600 XT, as well as the even newer Radeon RX 6600 and RX 6500 XT. The RX 580 is still hard to argue with as a deal for 1080p gaming, but the other two, although still capable of handling 1080p and even 1440p, are clearly yesterday’s models.
Indeed, AMD’s 1080p and 1440p cards have undergone a major overhaul. In July 2019, AMD launched Radeon DNA (RDNA), a brand-new graphics architecture underpinning its 7nm “Navi” midrange graphics cards. The first three cards were the Radeon RX 5700, the Radeon RX 5700 XT, and the RX 5700 XT Anniversary Edition. All of them are aimed at the 1440p gaming market, and each can run demanding AAA titles at 60 frames per second at that resolution.
The Radeon RX 6700 XT and RX 6800 XT were AMD’s response to Nvidia’s dominance at the top of the market in the first half of 2021. These RDNA 2-based cards offer a better price-to-performance ratio than the company’s previous-generation RDNA 1 cards, though they still trail Nvidia’s 30 Series Founders Edition cards by a few percentage points in design, cooling, and driver stability.
RDNA 2 was unveiled to the public in 2020. Found both in discrete desktop graphics cards and in Sony’s PS5 and Microsoft’s Xbox Series X, RDNA 2 improves on the features introduced in RDNA and adds new ones to keep AMD competitive with Team Green’s offerings, including support for Microsoft’s DX12 Ultimate API and ray-tracing compute cores. The first few RDNA 2 cards are compared below…
AMD Radeon RX 6000 Series: Selected Cards Compared

Graphics Card Basics: Understanding the Core Specs
Now that you’ve seen our comparison charts, you should have a decent notion of which card families to look at based on the display and resolution you want to achieve (and your budget). The graphics processor’s clock speed, onboard VRAM (i.e. how much video memory it has), and—of course!—price are all important considerations when comparing cards, though.
Understanding GPU Clock Speeds
When comparing GPUs from the same series, higher clock speeds (the speed of the graphics core) and more cores indicate a faster GPU. That’s only a valid comparison across cards in the same product line based on the same GPU, though. As an example, the Founders Edition GeForce RTX 3080 has a rated boost clock of 1,710MHz, whereas MSI’s Gaming X Trio edition of the RTX 3080 runs at 1,815MHz in its factory-overclocked Gaming Mode.
Distinct from the base clock is the graphics chip’s boost clock: under certain thermal conditions, the graphics processor can opportunistically run above its base clock, up to the boost clock. This can differ from card to card even within a single family, depending on the card’s cooling hardware and how aggressive the manufacturer’s factory settings are. The highest boost clocks for a given GPU are typically found on high-end partner cards with massive multi-fan coolers.
AMD, meanwhile, has its own GPU speed measure, the “game clock.” The company says that, based on testing 25 different games on its RDNA and RDNA 2-based cards, the game clock represents the average clock speed gamers should expect to see across a wide range of titles. Just don’t compare game clocks to boost or base clocks, because the game clock is neither.
Understanding Video Card Memory
Typically, the amount of onboard video memory (known in old-school terms as the “frame buffer”) is matched to the needs of the games or programs the card is designed to run. As long as you’re playing PC games at the resolutions and detail levels appropriate for the card, it should have enough memory to keep up; a card maker is unlikely to outfit a card with more memory than it realistically needs, since doing so would drive up the price and make the card less competitive. However, there are a few caveats.
Modern graphics cards optimized for 1080p play typically have 4GB or 6GB of RAM, while those geared toward 1440p or 3,840-by-2,160 (4K) play tend to have 8GB or more, simply because those higher resolutions demand it. In most cases, all cards based on a given GPU ship with the same amount of memory.
That isn’t always true, though, and it’s crucial to note: some GPUs come in cards with varying amounts of VRAM. Cards based on the Radeon RX 5500 XT, for example, are available in 4GB and 8GB configurations. Keep an eye on the memory amount with GPUs like these, which are common in midrange cards; the lower-priced models will have less.
A video card with at least 4GB of memory is a must if you want to play games at full 1080p resolution. Both AMD and Nvidia now include more VRAM than that in their $200-MSRP-plus GPUs. (Many of AMD’s RX-series cards now carry 8GB of memory, while Nvidia’s most powerful card, the GeForce RTX 3090, packs 24GB.) Sub-4GB cards are best reserved for secondary systems, low-resolution gaming, or basic or older games that don’t need much in the way of hardware resources.
For creators, it’s a whole other ballgame. In many 3D rendering programs (as well as VFX work, modeling, and video editing), the amount of onboard VRAM matters more than the boost clock speed. For a task like building a complex VFX scene with thousands, if not millions, of distinct elements to calculate simultaneously, more and faster VRAM is generally better, so long as the card’s bandwidth pipe is large enough.
Memory bandwidth is another metric to be aware of. It refers to the speed at which data can move between the GPU and its onboard memory. In general, more is better, but because AMD’s and Nvidia’s architectures differ, and their memory-bandwidth needs differ with them, these values can’t be compared directly between the two.
The type of memory in your next GPU is also a vital consideration. You won’t really have a choice within a given card line, but understanding which type you’re buying into matters, depending on the games or programs you plan to run.
HBM2: AMD went all-in on this memory type with the launch of the Radeon VII, which carries an impressive 16GB of HBM2 VRAM aimed at content creators. While the Radeon VII could theoretically keep up with an RTX 2080 in gaming, it found particular favor for video editing in Adobe Premiere Pro. With the rise of GDDR6, though, HBM2 is now considered obsolete in consumer cards, in part because of its production costs.
GDDR6: This type of memory is found in practically every current GPU, from the RTX 2080 Ti and RTX Titan down to the AMD Radeon RX 5600 XT. Newer VRAM types like GDDR6 offer more performance for the money and can feed even the most demanding AAA games at high resolutions. GDDR6 is a dependable, highly adaptable VRAM solution that suits every market SKU. AMD went with GDDR6 in its latest Radeon RX 6000 Series models, which doesn’t seem like a bad call, though Nvidia’s latest cards have already moved beyond it (more on that next).
GDDR6X: Nvidia began using GDDR6X in its GeForce RTX 30 Series graphics cards. It more than doubles the bandwidth possible with the original GDDR6 design without running into the signal-degradation or interference issues that plagued prior incarnations.
Upgrading a Pre-Built Desktop With a New Graphics Card
A discrete GPU can be added to most pre-built computers these days, as long as the chassis is large enough.
Measure the available card space in your chassis before buying or upgrading a GPU. Sometimes the gap between the far edge of the motherboard and the hard drive bays leaves you only a few millimeters to spare on a GPU’s total length, and particularly long cards can be a problem in some small cases. (For more information, see our picks for the best graphics cards for small PCs.)
Also check the height of your graphics card. Outside of AMD and Nvidia reference designs, board partners often use their own coolers, and a card with an ornate cooler design may be too tall for your case to close.
Finally, there’s the power supply unit (PSU). Adding a new video card requires a PSU capable of powering it. Be cautious about installing a high-end video card in a pre-built PC that shipped with a low-end card or none at all; the odds are against you with low-cost or business-oriented PCs, as their power supplies tend to be underpowered or short on connectors.
The two most critical considerations are the number of six- and eight-pin cables your power supply offers and the unit’s maximum wattage. A PSU suited to powering a video card usually has at least one six-pin connector, and some include both a six-pin and an eight-pin connector; check what you’re working with, especially in systems sold by OEMs like Dell, HP, and Lenovo.
A six-pin cable, an eight-pin cable, or a mix of the two is required to power midrange and high-end graphics cards. (The most basic PCI Express cards draw all of their power from the slot.) Make sure you know exactly what connectors your card requires.
We’ve seen some changes recently: some Founders Edition cards require a special adapter (included in the box) that turns two eight-pin PSU connectors into a single 12-pin card-side connector, while the massive 12.7-inch MSI GeForce RTX 3080 Gaming Trio (and a few other high-end monsters) now requires a whopping three eight-pin connectors to draw the juice it needs. It’s not the only new RTX card that requires three PSU cables, either.
Both Nvidia and AMD provide a recommended power-supply wattage for each of their graphics-card generations. Keep in mind that these are merely suggestions, and conservative ones at that. If AMD or Nvidia recommends a 500-watt PSU to run a certain GPU, don’t risk it with the 300-watter you may have installed, but remember that you don’t need an 800-watt PSU to guarantee enough headroom, either. And always check the power-supply recommendation for the specific third-party version of a GPU you’re buying, because it may differ somewhat from AMD’s or Nvidia’s guidance for the reference card.
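As a rough illustration of that headroom math, here’s a minimal sketch; the wattages below are hypothetical placeholders rather than measured figures for any particular card or CPU, and the 30% cushion is just a common rule of thumb, so substitute the numbers from your own parts’ spec sheets and the vendor’s official recommendation:

```python
# Rough PSU sizing: sum the board power of the major components, then add
# headroom so the supply isn't running near its limit under load.
# All wattages here are hypothetical examples -- check your own parts' specs.
component_watts = {
    "GPU (board power)": 320,
    "CPU": 125,
    "Motherboard, RAM, drives, fans": 75,
}

total_draw = sum(component_watts.values())
headroom_factor = 1.3  # ~30% cushion, a common rule of thumb
suggested_psu = total_draw * headroom_factor

print(f"Estimated peak draw: {total_draw} W")
print(f"Suggested PSU rating: ~{round(suggested_psu / 50) * 50} W")
```

With these example numbers, you land at roughly a 700-watt unit: comfortably above the real peak draw, but nowhere near the oversized 800-watt-plus supplies some builders reflexively reach for.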
SLI, CrossFireX, and NVLink: Fading for Gamers
Both AMD and Nvidia have been moving away from supporting dual-, tri-, or even quad-card systems over the last few generations. These have generally been a pricey, but easy, way to boost performance. For PC gamers, though, the value proposition is no longer there.
When we tested two RTX 2080 Ti cards together at PC Labs in 2019, adding the second card produced…mixed results. Most of today’s games aren’t designed to take advantage of multiple cards, and those that are don’t see performance scale in proportion. (CrossFireX is AMD’s multi-card technology; SLI and NVLink are Nvidia’s.) It’s all about engine optimization, and some games genuinely run slower.
Content creation, though, is an entirely different ballgame. Nvidia currently supports NVLink card pairing only on the GeForce RTX 3090, so you’ll need to be a professional-level creator to see a return on that kind of investment.
What’s the gist? In virtually all circumstances these days, it’s better to buy the best card you can afford than to buy a lesser card now and plan to add a second one later.
Ports and Preferences: What Connections Should My Graphics Card Have?
DVI, HDMI, and DisplayPort are all commonly found on the back of modern graphics cards. As the oldest of the three standards, DVI is still used by a few computers and monitors, but it is no longer used on modern, high-end graphics cards.
DisplayPorts (typically three) and HDMI ports (commonly one) are standard on most cards. There are various distinctions between HDMI and DisplayPort. At the very least, you’ll need a graphics card that supports HDMI 2.0a or DisplayPort 1.2/1.2a to use a 4K display; for smooth 4K playback or gaming, a GPU that supports HDMI 2.0b or DisplayPort 1.4 is better still. The most recent generation of cards from both manufacturers will be fine on this front.
All of Nvidia’s GeForce RTX 30 Series cards support the latest HDMI 2.1 specification, which raises the bandwidth ceiling from HDMI 2.0’s 18Gbps to 48Gbps. That makes 8K resolution possible at refresh rates up to 60Hz, and 4K at up to 120Hz.
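For a sense of why those bandwidth ceilings matter, here’s a simplified calculation (Python) of the raw, uncompressed data rate for a few display modes. It assumes 8 bits per color channel and ignores blanking intervals, encoding overhead, and compression, so real-world requirements differ somewhat, but the proportions hold:

```python
# Approximate uncompressed video data rate: width x height x refresh x bits per pixel.
# Assumes 24 bits per pixel (8 per channel); ignores blanking and encoding overhead.
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

modes = [
    ("4K @ 60Hz", 3840, 2160, 60),
    ("4K @ 120Hz", 3840, 2160, 120),
    ("8K @ 60Hz", 7680, 4320, 60),
]

for name, w, h, hz in modes:
    print(f"{name}: ~{data_rate_gbps(w, h, hz):.1f} Gbps of raw video data")
```

Even this simplified math shows 4K at 120Hz (roughly 24Gbps raw) blowing past HDMI 2.0’s 18Gbps ceiling while fitting inside HDMI 2.1’s 48Gbps, with 8K at 60Hz pressing right up against that limit.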
Nvidia’s GeForce RTX Turing cards (the 20 Series) carry a port called VirtualLink. This USB Type-C port looks and functions like any other, and it supports DisplayPort over USB-C. The port was really built for virtual-reality headsets, offering enough power and bandwidth to meet the demands of VR head-mounted displays (HMDs). It’s a fine feature in theory, but it isn’t present on the RTX 30 Series Founders Edition cards, so its future is unclear.
The Future of Graphics Cards
Over the previous 18 months, more cards have been sold than at any other time in recent memory, and Nvidia’s hold on the consumer video card market is facing fresh pressure.
Image Sharpening, Nvidia DLSS, and Super Resolution
The rise of image-sharpening technologies such as AMD’s Radeon Image Sharpening (RIS), FidelityFX with CAS (also from AMD), and Nvidia’s Freestyle has had a significant impact on the gaming landscape in the last year or two. So what are these technologies, and how can they benefit gamers on a tight budget?
It all comes down to render scaling. In most recent titles, you’ll find a setting in the graphics options that lets you alter the game’s render scale. If you have the game set to 2,560 by 1,440 pixels, for example, lowering the render scale shrinks the resolution the GPU actually renders to something like 2,048 by 1,152 pixels (an 80% scale, in this example).
But hold on…who would deliberately make their game look worse? That’s where the sharpeners come in. Image-sharpening technologies let you reduce the resolution of the rendered image to increase a game’s frame rate, since drawing fewer pixels requires less GPU muscle.
The down-rendered image is then sharpened to clean things up. Set it just right (an 85% down-render with a 35% sharpen scale is a popular ratio), and you can theoretically gain a large amount of performance without sacrificing much visual clarity. What’s the big deal? If you can render fewer pixels without sacrificing visual quality, you can save money on your video card while still getting the performance you’re after.
In our testing, the practical ceiling for down-sampling is around 30 percent. In other words, you can buy a card nearly a third cheaper than the one you were originally looking at, down-render by up to 30 percent, sharpen the image back up with one of the aforementioned tools, and still get close to the same experience as running the game at its native resolution without render scaling.
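If you want to see what that down-render actually saves, here’s a small sketch (Python) using the example figures from above: a native resolution of 2,560 by 1,440 and a few render-scale settings (the 70% step is our own addition for comparison):

```python
# How the render-scale setting cuts the GPU's per-frame pixel workload.
native_w, native_h = 2560, 1440  # example native resolution from the text

for render_scale in (1.00, 0.85, 0.70):
    w = int(native_w * render_scale)
    h = int(native_h * render_scale)
    pixels = w * h
    saving = 1 - pixels / (native_w * native_h)
    print(f"{render_scale:.0%} scale -> {w} x {h} "
          f"({pixels:,} pixels, {saving:.0%} fewer than native)")
```

Note that the popular 85% linear render scale cuts the pixel count by close to 28%, right around the “30 percent” sweet spot mentioned above; the sharpening pass then masks most of the visual difference.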
It’s a challenge as old as 3D-capable video cards themselves: how to smooth out the jagged edges of a character or object with as little performance cost as possible. This process, commonly known as anti-aliasing, is one of the most computationally demanding operations in video games, and a wide range of technologies has been developed to accomplish the same goal of smoothing out jagged edges. DLSS, short for “deep learning super sampling,” is Nvidia’s newest answer.
Nvidia leans on artificial intelligence for DLSS. A GeForce RTX card is required to use it, because DLSS runs on the cards’ dedicated Tensor cores rather than on ordinary CUDA cores. The Tensor cores handle the complex equations of Nvidia’s AI neural network, much as the RT cores handle ray tracing.
DLSS and DLSS 2.0 hold a lot of promise, but not many games use them yet. DLSS 2.0’s incorporation into Unreal Engine 4 is changing this, though it will be some time before it’s widely or universally adopted by developers. Death Stranding, one of the best-known DLSS-capable games, always benefited from either DLSS or CAS in our testing at PC Labs; visual quality was actually better with DLSS than with CAS, despite some noticeable twitching while characters were moving. If more games implement it, DLSS could be a significant advantage for GeForce RTX card owners. There’s a lot of “if” in that sentence.
AMD threw some punches in this fight in 2021, too, with its FidelityFX Super Resolution. Only a few games supported the upsampling tech at its mid-2021 introduction, but the open-source approach to adoption made it far more accessible, and more than 30 games have joined the list of compatible titles since we tried the feature in June 2021. It also works with many more current video cards, from both Nvidia and AMD; DLSS, by contrast, is available only on GeForce RTX cards.
In the first half of 2022, AMD launched its next upscaling weapon: Radeon Super Resolution. Built on the same ideas and methods as FidelityFX Super Resolution, Radeon Super Resolution is implemented at the driver level. Only AMD Radeon RX 5000 and RX 6000 models support it, but the number of titles that can benefit from the feature expands into the tens of thousands. Check out our detailed explanation of how the technology works.
VR: New Interfaces, New HMDs?
VirtualLink, which we mentioned earlier, is another factor to keep in mind. Virtual reality headsets have slightly different hardware requirements than traditional monitors. In the mainstream tethered VR HMD market, both the original HTC Vive and the Oculus Rift have an effective resolution of 2,160 by 1,200 pixels across both eyes. Because those resolutions are well below 4K, older midrange GPUs such as AMD’s Radeon RX 5700 XT and Nvidia’s GeForce GTX 1660 Super handle VR well.
Where conventional games can get by with lower frame rates, virtual reality (VR) demands higher ones: drop below 90 frames per second and the VR experience can turn downright unpleasant. GPUs that cost $300 or more are likely to deliver the best VR experiences today, but current-generation headsets can also run on lower-end cards that aren’t up to 4K gaming.
Two newer headsets released in 2019, however, raised the demands a bit. The Oculus Rift S upped the bar to a resolution of 2,560 by 1,440 pixels, while the mega-enthusiast-level $1,000 Valve Index pushed its numbers to 1,440 by 1,600 pixels per eye, or 2,880 by 3,200 pixels in total. In the years since those two headsets were released, though, GPUs haven’t kept up the VirtualLink trend; in fact, most of the top models from Nvidia and AMD in 2021 and 2022 don’t include a VirtualLink connection at all.
Investing in one of these newer headsets requires a graphics card that can handle their demands (80Hz refresh on the Rift S, and up to 144Hz on the Index). Driving the Index at full tilt in a game like Half-Life: Alyx is akin to running the game at 144 frames per second across two 1080p-or-better monitors at once; Valve recommends at least an RTX 2070 Super or an AMD Radeon RX 5700 XT for the best experience. In short, examine the technical specifications and recommended GPU for any headset you’re considering. A subpar VR experience will get you nothing but a headache, and maybe nausea.
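To gauge just how demanding these headsets are, here’s a rough sketch (Python) that multiplies each display’s total resolution by its refresh rate, using the figures cited above, and compares the result against a single 1080p monitor at 60Hz (our own baseline for illustration):

```python
# Pixel throughput a GPU must sustain to keep each display fed at full refresh.
displays = [
    ("1080p monitor @ 60Hz", 1920, 1080, 60),
    ("Original Rift/Vive @ 90Hz", 2160, 1200, 90),
    ("Oculus Rift S @ 80Hz", 2560, 1440, 80),
    ("Valve Index @ 144Hz", 2880, 3200, 144),
]

baseline = 1920 * 1080 * 60  # 60Hz 1080p monitor as the reference point

for name, w, h, hz in displays:
    throughput = w * h * hz
    print(f"{name}: {throughput / 1e6:.0f} megapixels/sec "
          f"({throughput / baseline:.1f}x the baseline)")
```

Driving the Index at its full 144Hz works out to more than ten times the pixel throughput of a 60Hz 1080p monitor, which helps explain Valve’s RTX 2070 Super / RX 5700 XT recommendation.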
So, Which Graphics Card Should I Buy?
As 2022 begins and GPUs remain in short supply, that answer is more complicated than ever. New technology continues to muddy the waters at the high end, while image-sharpening tech (and possible discounts) keeps both the midrange and the low end in a state of flux.
On MSRP price-to-performance alone, cards like the Nvidia GeForce RTX 3060 Ti can’t be beaten by anything AMD has to offer, so if you’re looking for the most power for the sake of power, Nvidia remains the go-to choice at this time.
Nvidia is still firmly entrenched in the high-end market, but AMD’s debut of the Radeon RX 6000 Series has pushed the company back into contention there. That said, a year after the RX 5700 and RX 5700 XT launched, driver implementations in certain games remained buggy on the RX 5000 series, which may put some buyers off.
Toward the bottom of the food chain, Nvidia and AMD butt heads regularly, each with its own answers for sub-$1,000 PC builders spanning a large number of card types and options.
The GPUs we recommend at the top and bottom of this article span the spectrum of budget to high end, representing the full range of the best cards that are “available now.” Please take note, however, that the use of quotation marks is deliberate.
The Elephant in the Room: Graphics Card Pricing (and Availability) in 2022
Due to supply-side issues that began in 2020 and rolled into 2022, discrete graphics cards have been in short supply at all levels, from business to consumer.
Here’s a quick rundown of what you need to know. To begin with, no single party is to blame: not Nvidia, not AMD, not even Bitcoin (on its own). As of this writing in early 2022, the current pricing situation is the result of a convergence of events; even before the pandemic, analysts in the space had forecast that we were on a collision course.
Every industry that relies on semiconductors, from graphics cards to automobiles, is experiencing supply pressure right now, and it’s arguably worst in GPUs. Cryptocurrency prices have risen at the same time that tariffs have taken effect, on top of a spike in bots designed to bypass Newegg’s and Amazon’s shopping-cart captcha restrictions. As a result, cards sell out within minutes, if not seconds, of going on sale, and enough buyers have proved willing to pay multiples of the list price for cards repackaged and resold on eBay and elsewhere.
Many in the sector believe the current crunch will persist through at least 2022. And as long as supply bottlenecks persist and demand for discrete desktop GPUs keeps climbing, prices will stay elevated, even for last-generation GPUs ahead of the arrival of a new generation.
Cards like the GTX 1080 Ti are already showing signs of this, selling for $100 more on eBay than they did on launch day…in 2017.
GPU Pricing on eBay
People, it’s a crazy world out there.
The point is that none of the MSRP pricing we’ve mentioned matches the current reality for buyers. Take this guide as your best objective benchmark for how the card you want compares with other options in terms of relative performance. But in the actual world? Shopping at Best Buy, Newegg, eBay, or Amazon at any given time of day is a lottery ticket. There are no guarantees in today’s market, and it’s anyone’s guess when (or if) you’ll get what you’re looking for, and at what price. Follow our advice and you can hope to land a card at a price close to MSRP. It’s doable, but it’s not simple.
As a result, “any card you can find at a decent price” is, to some extent, the real “Best Graphics Card for 2022.” And if you’re planning to wait it out, the arrival of the next generation of GPUs, such as Nvidia’s upcoming GeForce RTX 4090, should eventually help with supply.