Updated on May 3, 2023
Our evaluations are reliable.
Present supply and cost conditions for GPUs are anything but “normal,” as the prices shown above make clear. Editor’s note: If you’re planning to make a purchase in the near future, use our advice to find a decent price on a graphics card. Also check out our guide on how to get the most performance out of your existing GPU.
Our top-rated Nvidia and AMD graphics cards are arranged by target gameplay quality (unless one is an unequivocal top choice). There are many more third-party cards out there than we’ve listed here; consider a pick of a reference card (such as the GeForce GTX 1660 Ti or Radeon RX 6700 XT) an endorsement of the entire family of cards built around that GPU.
Playing on your own terms means having a graphics card that can keep up with the demands of your favourite PC games and other graphically intensive software.
In this guide, you’ll learn about the finest desktop PC video card options, how to upgrade a system, and how to decide whether a given card is a worthwhile purchase. Stay with us if you want to learn about forthcoming trends, too. With prices ranging from $100 to over $1,500 (and that’s just at MSRP), there is something for every budget. It’s easy to overpay or underbuy, and this guide will help you avoid doing either.
Who’s Who in GPUs: AMD vs. Nvidia
What is the purpose of a graphics card? Is it truly necessary?
As a general rule, PC manufacturers don’t put much emphasis on graphics cards in their prebuilt desktop PCs unless the machines are specifically designed for gaming. A low-cost PC may not have a graphics card at all, instead relying on the graphics acceleration built into the CPU (an “integrated graphics processor,” commonly called an “IGP”). There’s nothing wrong with depending on an IGP (most business laptops, low-cost consumer laptops, and low-cost desktops use them), but if you’re a gamer or a creator, the right graphics card is critical to your enjoyment and productivity.
Whether it’s an IGP or a discrete video card, the graphics solution displays 2D and 3D data, draws the desktop, and decodes and encodes video in programmes and games. Every consumer discrete video card on the market today is powered by a big graphics processor chip from AMD or Nvidia. These processors are called “GPUs” (graphics processing units), a term that is also often applied to the graphics card itself. (Nothing about graphics cards…er, GPUs…is simple!)
Both companies develop “reference designs” for their graphics cards: standardised versions of cards built around a given GPU. In several cases, reference-design cards are sold by Nvidia itself (or, less often, by AMD).
Before Nvidia’s newest GeForce RTX 30 series, “Founders Edition” branding meant little more than slightly faster-than-stock clock speeds and solid build quality on Nvidia’s own brand of cards. As far as looks go, Founders Edition cards are the most uniform of each GeForce generation. Their designs, however, tend to be more conservative than those of third-party options, which are more tolerant of extreme overclocking or modification.
Nvidia’s new Founders Edition cards throw that common thinking out the window. Each model in the RTX 30 series has a PCB (printed circuit board, or “guts”) half the size of the previous generation’s and is equipped with Nvidia’s unique “push-pull” cooling design. It’s hard to argue with Nvidia’s engineering on performance, and the RTX 30 series Founders Edition cards really stand out when it comes to industrial design.
Only cards bearing Nvidia’s Founders Edition branding get this treatment, even though they are smaller, lighter, and quicker than any cards before them. Many third-party card makers (known as “board partners”), including big names like Gigabyte and Sapphire, build cards around the same reference designs. These board partners may sell their own branded versions of the reference card, or they may manufacture customised cards with different cooling-fan designs, mild factory overclocking, or LED mood lighting. Some board partners sell both the reference version of a given GPU and their own, more radical designs.
Cards like MSI’s Nvidia GeForce RTX 3080 Gaming X Trio 10G and AMD’s Radeon RX 6000 series reference designs are classic examples of GPU design, stylish and efficient, but they don’t boast the kind of technological leaps we’ve seen in the current-generation Founders Edition cards.
Who Needs a Discrete GPU?
We discussed integrated graphics processors (IGPs) above. As a general rule, IGPs can meet the demands of the majority of everyday users, with three notable exceptions:
Professional workstation users. Dedicated graphics cards remain valuable for people who work with CAD software or who edit video and photos. Some of their most critical programmes can employ GPU resources instead of (or in addition to) CPU resources, and the balance between GPU and CPU demand varies widely from application to application.
Multi-display users who value productivity. Separate GPUs are still useful if you run a lot of displays. Desktop operating systems can drive screens attached to the IGP and a discrete GPU at the same time, and depending on the specifications of each, up to six screens can be connected.
For this use, you don’t need the most up-to-date graphics card. PC use that isn’t graphically intensive only requires a card that supports the monitors you plan to utilise, at their resolutions, and offers as many outputs as you have panels. If you’re running three browser windows on three monitors, a GeForce RTX 3080 has no advantage over a GeForce GTX 1660 with the same number of outputs.
Gamers. Gaming systems lean on the graphics processing unit (GPU) more than any other component; it defines the gaming ability of everything from budget builds to the most expensive high-end systems.
Graphics cards can be divided into two categories: consumer cards for gaming and basic content creation, and workstation cards for more complicated computation and AI work. Consumer cards are the centre of this guide and our reviews, though we’ll touch on workstation cards later. Nvidia’s pro line is now known as the RTX A-Series, while Radeon Pro and Radeon Instinct are the key AMD brands to be aware of in the pro workstation field. Nvidia continues to rule the high-end markets.
Introducing the Radeon RX 6700 XT!
(Illustration: Zlata Ivleva)
Nvidia’s consumer cards are sold under the GeForce brand, in GeForce GTX and GeForce RTX lines. AMD’s consumer cards fall under the Radeon RX family, with the older Radeon RX Vega and Radeon VII lines no longer in production. The following are some considerations to bear in mind when shopping for a new video card.
Target Resolution and Monitor Tech: Your First Considerations
Resolution is the number of horizontal by vertical pixels that make up the image on your screen. From a gaming perspective, it has a significant impact on which video card to buy and how much you’ll need to pay for it.
A video card’s performance at different resolutions is an important consideration for PC gamers. In recent years, even the most basic graphics cards have been able to display software at high resolutions like 3,840 by 2,160 pixels (a.k.a. 4K). But displaying a desktop and rendering demanding PC games at smooth frame rates are very different tasks, and not every GPU is up to the latter. Graphics processing units render game scenes in real time, and higher levels of in-game detail and monitor resolution necessitate a more powerful graphics card.
Making a decision starts with choosing a target resolution.
In today’s video games, 1080p (1,920 by 1,080 pixels), 1440p (2,560 by 1,440 pixels), and 2160p or 4K (3,840 by 2,160 pixels) are the most common screen resolutions. Choose a graphics card with your display in mind: you’ll get the best picture using your display at its native resolution.
Ultra-widescreen monitors push resolutions as high as 3,440 by 1,440 and 3,840 by 1,440 pixels, with 2,560 by 1,080 being the most popular. To see how these displays compare with 1080p, 1440p, and 2160p displays, multiply the horizontal pixel count by the vertical one to get the total number of pixels the card must drive. Our picks cover graphics cards for everything from 1080p up to 4K gaming.
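That pixel arithmetic can be sketched in a few lines. This is purely illustrative; the resolutions are the common examples named above, and the "load" ratio is only the raw pixel count relative to 1080p, not a measured benchmark:

```python
# Rough pixel-count arithmetic for common gaming resolutions.
# A card pushing 4K must render 4x as many pixels per frame as at 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
    "ultrawide 1440p": (3440, 1440),
}

base = 1920 * 1080  # 1080p as the reference point

for name, (w, h) in resolutions.items():
    pixels = w * h  # horizontal count times vertical count
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 1080p)")
```

Running this shows, for example, that 1440p is roughly 1.78 times the pixel count of 1080p, while 4K is exactly 4 times, which is why the jump to 4K is so much harder on a GPU than the jump to 1440p.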
Why does this matter so much? For PC gaming, you need to make sure that the power of the components in your next PC is distributed in a way that best matches how you like to play.
Here’s how it works, without getting too bogged down in the weeds: when playing at 1080p, even at the highest detail levels, frame rates are nearly always determined by a combination of CPU and GPU capability rather than by the GPU alone.
For those who prefer to play at a higher degree of detail, 1440p is a good option. Some games begin to place greater demands on the GPU at this resolution, while others can continue to lean on the CPU for the more computationally intensive tasks. (This depends on the developer’s optimisation techniques.) Finally, at 4K, the GPU shoulders nearly all of the work in most circumstances.
To get better performance out of games at a higher-than-recommended resolution, you can always lower the detail levels, or lower the resolution itself. But, to some extent, that undermines the purpose of buying a new graphics card. To play at 1080p or 1440p at a high refresh rate, you’ll want a powerful graphics card, but you don’t need to spend $1,000, or even half that, to play well.
In a nutshell: always acquire a graphics card matched to the display you anticipate using in the near future. Many midrange GPUs can drive 1440p panels at their best, and if the Steam Hardware Survey is any indication, 4K is still a fringe resolution even among the most active PC gamers. (By early 2022, just about 3% of surveyed users were running 4K.)
High-Refresh Gaming: The Importance of High-End GPUs
Another gaming trend to keep an eye on is high-refresh gaming monitors, which have garnered significant traction in recent years. For a long time, most PC displays had a panel refresh rate of 60Hz (or 60 screen redraws per second), but that was before the esports genre truly took off.
If you’re interested in esports and high-refresh gaming, there are panels that can refresh at up to 360Hz. On a high-refresh monitor, frames your video card previously “wasted” show up as smoother gameplay.
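To put those refresh rates in perspective, here is a tiny sketch of the per-frame time budget they imply, assuming the card can keep pace with the monitor (the function name is ours, for illustration only):

```python
# How much time a GPU has to render each frame at a given refresh rate,
# assuming the card keeps pace with the monitor.
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available per frame at the given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240, 360):
    print(f"{hz}Hz -> {frame_budget_ms(hz):.2f} ms per frame")
```

At 60Hz the card has roughly 16.7ms to finish each frame; at 360Hz that budget shrinks to under 3ms, which is why high-refresh gaming demands so much more GPU horsepower at a given resolution.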
The need for high-refresh monitors has grown in recent years alongside the rising number of esports success stories (like the instant multi-millionaire status of 16-year-old Fortnite star Bugha). For competitive gamers across all genres, 1080p is by far the most popular resolution, and high-refresh screens are leading the way there.
And as more gamers move up to 1440p, the number of displays like the ViewSonic Elite XG270QG, which combine high refresh rates with high resolutions and high-quality panels, is growing faster than ever. The cards and the panels feed each other’s advance.
Many esports hopefuls and currently salaried pros still swear by playing games like Counter-Strike: Global Offensive at resolutions as low as 720p, trading image quality for the highest possible frame rates. Your results may vary depending on your personal preferences and the games you choose to play.
The fluidity of a high refresh rate is a competitive advantage for esports players, but most casual gamers won’t notice the difference unless they play fast-action games. Check out our top recommendations for the best gaming displays, including high-refresh monitors. The upshot: with a high-refresh display and a powerful video card, you can benefit even when playing at a “pedestrian” resolution like 1080p.
Compatibility with HDR
Finally, keep in mind whether or not your display is HDR-compatible. Our recent Editors’ Choice picks for best gaming displays all support HDR to some extent. In our tests, HDR 10 and DisplayHDR 400 monitors don’t show much of an HDR benefit, but if your monitor is rated DisplayHDR 600 or above, HDR support should factor into your GPU choice.
The 65-inch HP Omen Emperium (Photo: Zlata Ivleva)
Also make sure the monitor you purchase supports HDR transmission at a refresh rate and bit depth that your new card can handle. It’s a delicate dance, but the results can be spectacular for both content creation and gaming displays.
FreeSync vs. G-Sync: Let the Battle Commence
Buying a card based solely on whether it supports one of these two venerable gameplay-smoothing standards is a bad idea; it all depends on the screen you’re using.
FreeSync (AMD’s solution) and G-Sync (Nvidia’s) are two takes on the same technology: adaptive sync. With adaptive sync, the monitor’s refresh rate fluctuates to match the video card’s output at any given moment, allowing for smooth gameplay. It’s essential because frame-rate wobbles otherwise show up as artifacts, stuttering, and tearing of the on-screen action; the monitor draws a full frame only when the video card can deliver one.
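A toy model shows why this matters. On a fixed 60Hz panel (with traditional v-sync on), a frame that misses the roughly 16.7ms refresh deadline must wait for the next refresh, doubling its on-screen delay, while an adaptive-sync panel simply redraws when the frame is ready. The render times below are made-up illustrations, and the model ignores the panel's supported refresh range:

```python
import math

REFRESH_MS = 1000.0 / 60  # one 60Hz refresh interval, ~16.7 ms

def fixed_sync_display_time(render_ms: float) -> float:
    """With v-sync on a fixed 60Hz panel, the frame appears at the
    next refresh boundary after rendering finishes."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def adaptive_sync_display_time(render_ms: float) -> float:
    """With adaptive sync, the panel refreshes as soon as the frame
    is ready (within its supported range, ignored in this toy model)."""
    return render_ms

for render in (12.0, 17.0, 25.0):
    print(f"render {render:.0f} ms -> fixed: {fixed_sync_display_time(render):.1f} ms, "
          f"adaptive: {adaptive_sync_display_time(render):.1f} ms")
```

Note the 17ms frame: it barely misses one refresh, so the fixed panel holds it a full extra interval (about 33.3ms total), which the player perceives as stutter; the adaptive-sync panel shows it at 17ms.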
The HP Omen 25f
(Chris Stobing, photographer)
Your monitor may support FreeSync or G-Sync, or neither. FreeSync keeps a monitor’s manufacturing costs down, while G-Sync requires specific hardware in the display. Based on this, you might prefer one GPU maker’s products over the other’s, but keep in mind that the landscape is shifting. At CES 2019, as part of its “G-Sync Compatible” certification, Nvidia released a driver update that lets FreeSync monitors use adaptive sync with late-model Nvidia GeForce cards. The decision may not be as clear-cut as it was in the past.
Even professional CS:GO or Overwatch players may not notice a meaningful difference between the two on the latest monitors. Screen tearing was a difficult problem to overcome for many years, but nowadays it takes an expert eye to tell G-Sync and FreeSync monitors apart.
Find out more about the Radeon and GeForce families
To wrap up this section, let’s talk about the differences between the two competing card families. There is a wide range of GPUs available, from low-end models for low-resolution gaming all the way up to high-end models for 4K and high-refresh-rate gaming. Let’s look at Nvidia’s products first. (Again, keep in mind that MSRPs are only “recommended” prices these days; remember the volatility of today’s street prices versus the list prices we mention!)
An Examination of Nvidia’s Product Lineup
Until late 2020, Nvidia’s main card stack was split between cards based on the “20-series” Turing GPUs and the subsequent GeForce GTX 16-series cards, also built on the Turing architecture. The most recent additions, the GeForce RTX 30-series cards, are high-end cards based on GPUs using the “Ampere” architecture.
Founders Edition Nvidia GeForce RTX 3070 (Photo: Zlata Ivleva)
Below is a brief overview of the key card classes in the “Pascal” (Turing’s predecessor), Turing, and Ampere families, their cost, and their use cases…
A number of aged GeForce GTX Pascal cards, such as the GTX 1070 and GTX 1080, are not included in the list above. By 2022 they had sold out and been largely replaced on the second-hand market by their GeForce RTX counterparts. The GeForce GTX 1060 met its demise after the debut of the GeForce GTX 1660 and GTX 1660 Ti, while the GTX 1050 has been rendered obsolete by the GTX 1650 and GTX 1650 Super.
First, though, a word on Turing. Nvidia’s 20-series GPU debut in September 2018 drew a mixed response. The company was touting some of the most powerful GPUs available, as well as cutting-edge features like real-time ray tracing and Deep Learning Super Sampling (DLSS). On the other hand, no games supported ray tracing or DLSS at the time of the Turing launch, and even two years later, the number of games supporting DLSS 2.0 on its own or alongside ray tracing remained small.
The Founders Edition of the GeForce RTX 2080 Ti (Photo: Zlata Ivleva)
Around the same time, Nvidia raised high-end GPU pricing compared with previous generations. If you wanted to get your hands on the GeForce RTX 2080 Ti or the GeForce RTX 2080 at all, you’d have to pay top dollar: more than $1,000 in the case of the 2080 Ti.