The Intel Core i5-14600K is not the kind of processor you're really going to want to upgrade to, despite technically offering the best value of any processor I've tested.
First, the good. This is one of the best processor values you're going to find on the market, no matter what happens with the price of its predecessor. Currently, it has the best performance for its $319 price tag (about £255/AU$465), and AMD's competing Ryzen 5 7600X isn't all that close. If you're looking to get the most bang for your buck today, then the Intel Core i5-14600K is it.
In terms of performance, this isn't a bad chip at all; I'd even say it's a great one if you take its predecessor out of the running, which will inevitably happen as its last remaining stock gets bought up. It doesn't have the performance of the Intel Core i7-14700K, but that's a workhorse chip, not the kind that's meant to power the best computers for the home or the best budget gaming PCs as these chips start making their way into prebuilt systems in the next couple of months.
For a family computer or one that's just meant for general, everyday use, this chip is more than capable of handling whatever you'll need it for. It can even handle gaming fairly well thanks to its strong single-core performance. So, on paper at least, the Core i5-14600K is the best Intel processor for the mainstream user as far as performance goes.
The real problem with the i5-14600K, and one that effectively sinks any reason to buy it, is that its performance is tragically close to the Core i5-13600K's. And even though the MSRP of the Intel Core i5-13600K is technically higher than that of the Core i5-14600K, it's not going to remain that way for very long at all.
As long as the i5-13600K is on sale, it will be the better value, and you really won't even notice a difference between the two chips in terms of day-to-day performance.
That's because there's no difference between the specs of the 14600K and the 13600K, other than a slightly faster turbo clock speed for the 14600K's six performance cores.
While this does translate into some increased performance, it comes at the cost of higher power draw and temperature. During testing, this chip hit a maximum temperature of 101°C, which is frankly astounding for an i5. And I was using one of the best CPU coolers around, the MSI MAG Coreliquid E360 AIO, which should be more than enough to keep temperatures in check and prevent throttling.
(Image credit: Future / Infogram)
Looking at the chip's actual performance, the Core i5-14600K beats the AMD Ryzen 5 7600X and the Intel Core i5-13600K in single-core, multi-core, and productivity workloads, on average. Other than its roughly 44% better average multi-core performance against the Ryzen 5 7600X, though, the Core i5-14600K is within 3% to 4% of its competing chips.
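For readers curious how averaged comparisons like these are derived, the sketch below shows the basic arithmetic: compute the percentage difference per benchmark, then average the results. The benchmark names and scores here are hypothetical placeholders, not my actual test data.

```python
# Hypothetical example of averaging per-benchmark deltas between two chips.
# The scores below are made-up placeholders, not measured results.
scores_14600k = {"single-core": 2150, "multi-core": 24500, "productivity": 11200}
scores_7600x = {"single-core": 2080, "multi-core": 17000, "productivity": 10800}

def percent_delta(a: float, b: float) -> float:
    """Percentage by which score a exceeds score b (higher is better)."""
    return (a / b - 1.0) * 100.0

deltas = {name: percent_delta(scores_14600k[name], scores_7600x[name])
          for name in scores_14600k}

for name, delta in deltas.items():
    print(f"{name}: {delta:+.1f}%")
print(f"average: {sum(deltas.values()) / len(deltas):+.1f}%")
```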
(Image credit: Future / Infogram)
In creative workloads, the Core i5-14600K again manages to outperform the Ryzen 5 7600X by about 31% on average, but it's just 2.4% better than its predecessor, and none of these chips are especially great at creative content work. If you're messing around with family albums or cutting up TikTok videos, any one of these chips could do that fairly easily. For heavier-duty workloads like video encoding and 3D rendering, the Intel chips hold up better than the mainstream Ryzen 5, but these chips really aren't practical for that purpose.
(Image credit: Future / Infogram)
On the gaming front, it's more of the same, though now at least the Ryzen 5 7600X is back in the mix. Overall, the Core i5-14600K beats its 13th-gen predecessor and AMD's rival chip by about 2.1% and 3.2% respectively.
(Image credit: Future / Infogram)
All of this comes at the cost of higher power draw and hotter CPU temperatures, though, which isn't a good trade for so little in return. What you really have here is an overclocked i5-13600K, and you can do that overclocking yourself and save some money by buying the 13600K when it goes on sale, which it will.
(Image credit: Future / John Loeffler)
Intel Core i5-14600K: Price & availability
How much does it cost? US MSRP $319 (about £255/AU$465)
When is it out? October 17, 2023
Where can you get it? You can get it in the US, UK, and Australia
The Intel Core i5-14600K is available in the US, UK, and Australia as of October 17, 2023, for an MSRP of $319 (about £255/AU$465).
This is a slight $10 price drop from its predecessor, which is always a good thing, and it comes in about $20 (about £15/AU$30) more than the AMD Ryzen 5 7600X, so it's fairly middle of the pack price-wise.
In terms of actual value, as it goes to market, this chip has the highest performance for its price of any chip in any product tier, but only by a thin margin, and one that is sure to disappear very quickly once the price of the 13600K drops by even a modest amount.
Intel Core i5-14600K: Specs
Intel Core i5-14600K: Verdict
Best performance for the price of any chip tested...
...but any price drop in the Core i5-13600K will put the 14600K in second place
Not really worth upgrading to with the Core i7-14700K costing just $90 more
(Image credit: Future / Infogram)
Ultimately, the market served by this chip specifically is incredibly narrow, and like the rest of the Raptor Lake Refresh line-up, this is the last hurrah for the Intel LGA 1700 socket.
That means if you go out and buy a motherboard and CPU cooler just for the 14th-gen, it's a one-time thing, since another generation on this platform isn't coming. It doesn't make sense to do that, so if you're upgrading from anything earlier than 12th-gen, it makes much more sense to wait for Meteor Lake to land in several months' time and possibly get something really innovative.
If you're on a 12th-gen chip and you can't wait for Meteor Lake next year, the smartest move is to buy the i7-14700K instead, which at least gives you i9-13900K-levels of performance for just $90 more than the i5-14600K.
Ultimately, this chip is best reserved for prebuilt systems like the best all-in-one computers at retailers like Best Buy, where you will use the computer for a reasonable amount of time, and then when it becomes obsolete, you'll go out and buy another computer rather than attempt to upgrade the one you've got.
In that case, buying a prebuilt PC with an Intel Core i5-14600K makes sense, and for that purpose, this will be a great processor. But if you're looking to swap out another Intel LGA 1700 chip for this one, there are much better options out there.
Should you buy the Intel Core i5-14600K?
Buy the Intel Core i5-14600K if...
Don't buy it if...
Also Consider
If my Intel Core i5-14600K review has you considering other options, here are two processors to consider...
How I tested the Intel Core i5-14600K
I spent nearly two weeks testing the Intel Core i5-14600K
I ran comparable benchmarks between this chip and rival midrange processors
I gamed with this chip extensively
Test System Specs
These are the specs for the test system used for this review:
I spent about two weeks testing the Intel Core i5-14600K and its competition, primarily for productivity work, gaming, and content creation.
I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops.
I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Intel Arc A770 has had quite a journey since its release back on October 12, 2022, and fortunately, it has been a positive one for Intel despite a somewhat rocky start.
Right out of the gate, I'll say that if you are looking for one of the best cheap graphics cards for 1440p gaming, this card definitely needs to be on your list. It offers great 1440p performance in most of the modern PC titles we're actually playing, and it's priced very competitively against its rivals.
Where the card falters, much as with the Intel Arc A750 I reviewed earlier this year, is with older DirectX 9 and DirectX 10 titles, and this really does hurt its overall score in the end. That's a shame, since for games released in the last five or six years, this card is going to surprise a lot of people who might have written it off even six months ago.
Intel's discrete graphics unit has been working overtime on its driver for this card, providing regular updates that continue to improve performance across the board, though some games benefit more than others.
Naturally, a lot of emphasis is going to be put on more recently released titles. And even though Intel has also been paying attention to shoring up support for older games as well, if you're someone with an extensive back catalog of DX9 and DX10 titles from the mid-2000s that you regularly return to, then this is not the best graphics card for your needs. Nvidia and AMD drivers carry a long legacy of support for older titles that Intel will honestly never be able to match.
But if what you're looking for is the best 1440p graphics card to play the best PC games of the modern era but you're not about to plop down half a grand on a new GPU, then the Intel Arc A770 is going to be a very solid pick with a lot more to offer than many will probably realize.
(Image credit: Future / John Loeffler)
Intel Arc A770: Price & availability
How much is it? US MSRP for 16GB card: $349 (about £280/AU$510); for 8GB card: $329 (about £265/AU$475)
When was it released? It went on sale on October 12, 2022
Where can you buy it? Available in the US, UK, and Australia
The Intel Arc A770 is available now in the US, UK, and Australia, with two variants: one with 16GB GDDR6 VRAM and an official US MSRP of $349 (about £280/AU$510), and one with 8GB GDDR6 VRAM and an official MSRP of $329 (about £265/AU$475).
Those are the launch MSRPs from October 2022, of course; the cards have come down considerably in price in the year since their release, and you can get either card for about 20% to 25% less than that. This is important, since the Nvidia GeForce RTX 4060 and AMD Radeon RX 7600 are very close to the 16GB Arc A770 in terms of current prices, and they offer distinct advantages that will tempt potential buyers away from Intel's card.
But those decisions are not as cut and dried as you might think, and Intel's Arc A770 holds up very well against modern midrange offerings, despite really being a last-gen card. And, currently, the 16GB variant is the only 1440p card you're going to find at this price, even among Nvidia and AMD's last-gen offerings like the RTX 3060 Ti and AMD Radeon RX 6750 XT. So for 1440p gamers on a very tight budget, this card fills a vital niche, and it's really the only card that does so.
Price score: 4/5
(Image credit: Future / John Loeffler)
Intel Arc A770: Design
Intel's Limited Edition reference card is gorgeous
Will fit most gaming PC cases easily
Intel Arc A770 Limited Edition Design Specs
Slot size: Dual slot
Length: 11.02 inches | 280mm
Height: 4.53 inches | 115mm
Cooling: Dual fan
Power Connection: 1 x 8-pin and 1 x 6-pin
Video outputs: 3 x DisplayPort 2.0, 1 x HDMI 2.1
The Intel Arc A770 Limited Edition that I'm reviewing is Intel's reference model that is no longer being manufactured, but you can still find some stock online (though at what price is a whole other question).
Third-party partners include ASRock, Sparkle, and Gunnir. Interestingly, Acer also makes its own version of the A770 (the Acer Predator BiFrost Arc A770), the first time the company has dipped its toe into the discrete graphics card market.
All of these cards will obviously differ in terms of their shrouds, cooling solutions, and overall size, but as far as Intel's Limited Edition card goes, it's one of my favorite graphics cards ever in terms of aesthetics. If it were still easily available, I'd give this design five out of five, hands down, but most purchasers will have to opt for third-party cards which aren't nearly as good-looking, as far as I'm concerned, so I have to dock a point for that.
It's hard to convey from just the photos of the card, but the black finish on the plastic shroud of the card has a lovely textured feel to it. It's not quite velvety, but you know it's different the second you touch it, and it's something that really stands out from every other card I've reviewed.
(Image credit: Future / John Loeffler)
The silver trim on the card and the more subtle RGB lighting against a matte black shroud and fans bring a bit of class compared to the RGB-heavy graphics cards I typically see. The twin fans aren't especially loud (no more so than other dual-fan cards, at least), and the card feels thinner than most other similar cards I've reviewed and used, whether or not it actually is.
The power connector is an 8-pin and 6-pin combo, so you'll have a pair of cables dangling from the card which may or may not affect the aesthetic of your case, but at least you won't need to worry about a 12VHPWR or 12-pin adapter like you do with Nvidia's RTX 4000-series and 3000-series cards.
You're also getting three DisplayPort 2.0 outputs and an HDMI 2.1 output, which puts it in the same camp as Nvidia's recent GPUs, but it can't match AMD's recent move to DisplayPort 2.1, which enables faster 8K video output. As it stands, the Intel Arc A770 is limited to 8K at 60Hz, just like Nvidia. Will you be doing much 8K gaming on a 16GB card? Absolutely not, but as more 8K monitors arrive next year, it'd be nice to have an 8K desktop running at 165Hz. That's a very speculative prospect at this point, though, so it's probably not something anyone looking at the Arc A770 needs to be concerned about.
Design Score: 4 / 5
(Image credit: Future / John Loeffler)
Intel Arc A770: Specs & features
Good hardware AI cores for better XeSS upscaling
Fast memory for better 1440p performance
Intel's Xe HPG architecture inside the Arc A770 introduces a whole new way to arrange the various co-processors that make up a GPU, adding a third, not-easily-comparable set of specs to the already head-scratching differences between Nvidia and AMD architectures.
Intel breaks its architecture up into "render slices", each containing four Xe cores; each Xe core in turn contains 128 shaders, a ray tracing unit, and 16 matrix processors (the latter directly comparable to Nvidia's vaunted tensor cores, at least), which handle graphics upscaling and machine learning workloads. Both the 8GB and 16GB versions of the A770 contain eight render slices, for a total of 4,096 shaders, 32 ray tracing units, and 512 matrix processors.
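Those totals follow directly from the per-slice layout; here's the quick arithmetic, using only the figures above:

```python
# Arc A770 shader/RT/matrix totals from the per-slice layout described above.
render_slices = 8
xe_cores = render_slices * 4        # 32 Xe cores
shaders = xe_cores * 128            # 4,096 shaders
rt_units = xe_cores * 1             # 32 ray tracing units
matrix_engines = xe_cores * 16      # 512 matrix processors
print(xe_cores, shaders, rt_units, matrix_engines)
```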
The ACM-G10 GPU in the A770 runs at a 2,100MHz base frequency with a 2,400MHz boost frequency, and the 16GB variant gets a slightly faster memory clock (2,184MHz) than the 8GB variant's 2,000MHz. This works out to an effective memory speed of 16 Gbps for the 8GB card and 17.5 Gbps for the 16GB card.
Paired with a 256-bit memory bus, this gives the Arc A770 a much wider lane for high-resolution textures, reducing bottlenecks and enabling faster performance when gaming at 1440p and higher resolutions, thanks to 512 GB/s and 559.9 GB/s of memory bandwidth for the 8GB and 16GB cards, respectively.
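If you want to sanity-check those bandwidth figures yourself, the math is straightforward: multiply the effective per-pin speed by the bus width in bytes. A quick sketch:

```python
# Peak memory bandwidth = effective per-pin data rate (Gbps) x bus width (bits) / 8.
def peak_bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    return effective_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(16.0, 256))   # 8GB Arc A770:  512.0 GB/s
print(peak_bandwidth_gbs(17.5, 256))   # 16GB Arc A770: 560.0 GB/s (about the 559.9 GB/s quoted, depending on the exact clock)
```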
All of this does require a good bit of power, though, and the Arc A770 has a TDP of 225W, which is higher than most 1440p cards on the market today.
(Image credit: Future / John Loeffler)
As far as the features all this hardware empowers go, there's a lot to like here. The matrix cores are leveraged to great effect by Intel's XeSS graphics upscaling tech, found in a growing number of games, and this hardware advantage generally lets it outperform AMD's FSR 2.0, which is strictly a software-based upscaler.
XeSS does not have frame generation though, and the matrix processors in the Arc A770 are not nearly as mature as Nvidia's 3rd and 4th generation tensor cores found in the RTX 3000-series and RTX 4000-series, respectively.
The Arc A770 also has AV1 hardware-accelerated encoding support, meaning that streaming videos will look far better than those with only software encoding at the same bitrate, making this a compelling alternative for video creators who don't have the money to invest in one of Nvidia's 4000-series GPUs.
Specs & features: 3.5 / 5
(Image credit: Future / John Loeffler)
Intel Arc A770: Performance
Great 1440p performance
Intel XeSS even allows for some 4K gaming
DirectX 9 and DirectX 10 support lacking, so older games will run poorly
Resizable BAR is pretty much a must
At the time of this writing, Intel's Arc A770 has been on the market for about a year, and I have to admit, had I gotten the chance to review this card at launch, I would probably have been as unkind as many other reviewers were.
As it stands, though, the Intel Arc A770 fixes many of the issues I found when I reviewed the A750, though some problems still hold this card back somewhat. For starters, if you don't enable Resizable BAR in your motherboard's BIOS settings, don't expect this card to perform well at all. It's an easy enough fix, but one that is likely to be overlooked, so it's important to know that going in.
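Resizable BAR itself is toggled in your motherboard's BIOS/UEFI, but you can confirm it actually took effect from the operating system. Below is a minimal sketch of one way to do that on Linux, assuming the lspci utility is installed; the parsing heuristic is mine, not an official Intel tool, and on Windows Intel's own driver software can flag this for you instead.

```python
# Minimal sketch (Linux): with Resizable BAR active, the GPU exposes a large
# prefetchable memory region (roughly the VRAM size, e.g. 16G) instead of a
# small 256M window. This shells out to lspci and lists those region sizes.
# Assumes lspci is installed; the parsing here is heuristic, not an official check.
import re
import subprocess

def prefetchable_bar_sizes() -> list[str]:
    out = subprocess.run(["lspci", "-v"], capture_output=True, text=True).stdout
    # Matches lines such as: "Memory at 6000000000 (64-bit, prefetchable) [size=16G]"
    return re.findall(r", prefetchable\) \[size=(\w+)\]", out)

print(prefetchable_bar_sizes())  # a 16G entry on a 16GB card suggests ReBAR is active
```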
(Image credit: Future / Infogram)
In synthetic benchmarks, the A770 performed fairly well against the current crop of graphics cards, despite effectively being a last-gen card. It's particularly strong competition for the Nvidia RTX 4060 Ti across multiple workloads, and it even beats the 4060 Ti in a couple of tests.
Its Achilles' heel, though, is revealed in the PassMark 3D Graphics test. Whereas 3DMark tests DirectX 11 and DirectX 12 workloads, PassMark's test also runs DirectX 9 and DirectX 10 workloads, and here the Intel Arc A770 simply can't keep up with AMD and Nvidia.
(Image credit: Future / Infogram)
In non-ray-traced and native-resolution gaming benchmarks, the Intel Arc A770 managed to put up some decent numbers against the competition. At 1080p, the Arc A770 manages an average of 103 fps with an average minimum fps of 54. At 1440p, it averages 78 fps, with an average minimum of 47, and even at 4K, the A770 manages an average of 46 fps, with an average minimum of 27 fps.
(Image credit: Future / Infogram)
Turn on ray tracing, however, and these numbers understandably tank, as they do for just about every card below the RTX 4070 Ti and RX 7900 XT. Still, even here, the A770 manages an average of 41 fps (with an average minimum of 32 fps) at 1080p with ray tracing enabled, which is technically still playable. Once you move up to 1440p and 4K, however, your average title isn't going to be playable at native resolution with ray tracing enabled.
(Image credit: Future / Infogram)
Enter Intel XeSS. When set to "Balanced", XeSS turns out to be a game changer for the A770, getting it an average framerate of 66 fps (with an average minimum of 46 fps) at 1080p, an average of 51 fps (with an average minimum of 38 fps) at 1440p, and an average 33 fps (average minimum 26 fps) at 4K with ray tracing maxed out.
While the 26 fps average minimum at 4K means it's really not playable at that resolution even with XeSS turned on, with settings tweaks or more modest ray tracing you could probably bring that up into the low to high 30s, making 4K games playable on this card with ray tracing turned on.
That's something the RTX 4060 Ti can't manage thanks to its smaller frame buffer (8GB VRAM), and while the 16GB RTX 4060 Ti could theoretically perform better (I haven't tested the 16GB model, so I can't say for certain), it still has half the memory bus width of the A770, leading to much lower bandwidth for larger texture files to pass through.
This creates an inescapable bottleneck that the RTX 4060 Ti's much larger L2 cache can't adequately compensate for, and it takes the card out of the running as a 4K option. When tested, very few games managed to maintain playable frame rates even without ray tracing unless you dropped the settings so low as to not make it worth the effort. The A770 16GB, meanwhile, isn't technically a 4K card, but it can still dabble at that resolution with the right settings tweaks and still look reasonably good.
(Image credit: Future / John Loeffler)
(Image credit: Future / Infogram)
All told, then, the Intel Arc A770 turns out to be a surprisingly good graphics card for modern gaming titles that can sometimes even hold its own against the Nvidia RTX 4060 Ti. It can't hold a candle to the RX 7700 XT or RTX 4070, but it was never meant to, and given that those cards cost substantially more than the Arc A770, this is entirely expected.
Its maximum observed power draw of 191.909W is pretty high for the kind of card the A770 is, but it's not the most egregious offender in that regard. All this power means keeping it cool is something of a struggle, with its maximum observed temperature hitting about 74°C.
Among all the cards tested, the Intel Arc A770 landed near the bottom of the list alongside the RX 6700 XT, so the picture for this card might have been very different had it launched three years ago, when it would have had to compete with the RTX 3000-series and RX 6000-series exclusively. In the end, this card performs like a last-gen card, because it is one.
Despite that, it still manages to be a fantastic value on the market right now given its low MSRP and fairly solid performance, rivaling the RTX 4060 Ti on the numbers. In reality though, with this card selling for significantly less than its MSRP, it is inarguably the best value among midrange cards right now, and it's not even close.
Performance score: 3.5 / 5
(Image credit: Future / John Loeffler)
Should you buy the Intel Arc A770?
Buy the Intel Arc A770 if...
Don't buy it if...
Also Consider
If my Intel Arc A770 review has you considering other options, here are two more graphics cards for you to consider.
How I tested the Intel Arc A770
I spent several days benchmarking the card, with an additional week using it as my primary GPU
I ran our standard battery of synthetic and gaming benchmarks
Test Bench
These are the specs for the test system used for this review:

CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench
I spent about two weeks with the Intel Arc A770 in total, with a little over half that time using it as my main GPU on my personal PC. I used it for gaming, content creation, and other general-purpose use with varying demands on the card.
I focused mostly on synthetic and gaming benchmarks since this card is overwhelmingly a gaming graphics card. Though it does have some video content creation potential, it's not enough to dethrone Nvidia's 4000-series GPUs, so it isn't a viable rival in that sense and wasn't tested as such.
I've been reviewing computer hardware for years now, with an extensive computer science background as well, so I know how graphics cards like this should perform at this tier.
• Original review date: June 2023
• Launch price: MSRP at $299 / £289 / AU$479
• Lowest price now: $445.60 / £279 / AU$543
Update – April 2025: With the recent release of Nvidia's 50 series graphics cards—and the mixed reviews they've received from both reviewers and customers alike—the Nvidia RTX 4060 would look like the best graphics card to buy for upgrading your 1080p GPU under normal circumstances, but unfortunately this card is becoming increasingly hard to find on store shelves.
Worse still, those places that do still have the card in stock are selling it for much more than the card's MSRP in the US, making it a much less attractive option given that the Nvidia GeForce RTX 5060 will launch at the same MSRP in a few weeks (UK and Australian shoppers have an easier time finding it at or near RRP). Even with the current US market's price inflation, the RTX 5060 will likely sell for the same amount as the inflated RTX 4060 prices I'm seeing online today.
If you can find this card for its MSRP or less and you only want something cheap for 1080p gaming, definitely consider it, as it's one of the best cheap graphics cards you're going to find. Otherwise, I'd recommend you wait and see what the RTX 5060 looks like, not to mention the AMD Radeon RX 9060 XT and RX 9060, which are also due out in the next month or two and should be priced similarly.
Original unedited review follows...
Nvidia GeForce RTX 4060: Two-minute review
Nvidia really wants you to know that the Nvidia GeForce RTX 4060 is a card for those who are still running a GTX 1060 or RTX 2060, and it's really Team Green's best marketing strategy for this card.
To be clear, the Nvidia RTX 4060 is respectably better than the Nvidia RTX 3060 it replaces, and comes in at a lower launch MSRP of $299 (about £240/AU$450) than its predecessor. Its 1080p gaming performance is the best you're going to find under $300, and its 1440p performance is pretty solid, especially when you turn on DLSS. If you're playing a game with DLSS 3 and frame generation, even better.
Unfortunately, the card's 4K performance suffers due to the limited video memory it's working with, which is a third less than the initial RTX 3060 run's 12GB VRAM pool (though at least it doesn't go below the 8GB of the later RTX 3060s).
You also get more sophisticated ray tracing and tensor cores than those found in the Ampere generation, and this maturity shows up in the card's much-improved ray tracing and DLSS performance.
There are some added bonuses for streamers too, like AV1 support, but this is going to be a lower-midrange gamer's card, not a streamer's, and for what you're getting at this price, it's a great card.
The real problem for this card though is the Nvidia GeForce RTX 3060 Ti. For more than a year after the RTX 3060 Ti hit the scene, it topped our best graphics card list for its spectacular balance of price and performance, punching well above its weight and even outshining the Nvidia GeForce RTX 3070.
Ever since the crypto bubble popped and Nvidia Lovelace cards started hitting the shelves, the last-gen Nvidia Ampere cards have absolutely plummeted in price, including the RTX 3060 Ti. You can now get the RTX 3060 Ti for well below MSRP, and even though the RTX 4060 outperforms the RTX 3060 by roughly 20%, it still falls short of the RTX 3060 Ti, so if you are able to get an RTX 3060 Ti for near or at the same price as the RTX 4060, it might be a better bet. I haven't seen the RTX 3060 Ti drop that low yet, but it's definitely possible.
The RTX 3060 Ti remains competitive here largely because many of the RTX 4060's best features depend on game developers implementing Nvidia's DLSS 3 technology in their titles. DLSS 3 with Frame Generation is incredible in most games that support it (though there are some latency issues to work out), but the number of games that implement it is still rather small.
Many newer games will have it, but as we've seen with the recent controversy over Starfield partnering with AMD, one of the biggest PC games of the year might not have DLSS implemented at all at launch. It's a hard thing to hold against the RTX 4060 as a solid negative, since when the technology is implemented, it works incredibly well. But it's also unavoidable that Nvidia's biggest selling point of this generation of graphics cards is explicitly tied to the cooperation of third-party game developers.
With something like the Nvidia GeForce RTX 4070, DLSS 3 is a nice feature to have, but it doesn't make or break the card. With the RTX 4060, its appeal is deeply tied to whether or not you have this tech available in your games, and it seriously undercuts the card when it isn't. Its non-DLSS performance is only better than the RTX 3060 by a standard gen-on-gen uplift at 1080p, and without DLSS, 1440p gaming is possible, but will be severely hampered by the limited VRAM. 4K gaming, meanwhile, would be out of the question entirely.
All that said, the Nvidia RTX 4060 is still going to be one hell of an upgrade for anyone coming from a GTX 1060 or RTX 2060, which is really where this card is trying to find its market. RTX 3060 gamers will honestly be better off just saving up some more money for the RTX 4070 than worrying about the RTX 4060 (and you can probably skip the Nvidia RTX 4060 Ti, honestly).
If you're looking for the best cheap graphics card from Nvidia, the Nvidia GeForce RTX 4060 is probably as good as it's going to get for a while, since there have been few - if any - rumblings about an Nvidia RTX 4050 or Nvidia RTX 4050 Ti coming to the budget segment any time soon. Whether it's worth upgrading from an RTX 3060 is debatable, but if money is tight and you're looking for an upgrade from a Pascal- or Turing-era 60-series card, you'll absolutely love this card.
Nvidia GeForce RTX 4060: Price & availability
(Image credit: Future / John Loeffler)
How much is it? MSRP is $299 / £289 / AU$479
When is it out? June 29, 2023
Where can you get it? Available globally
The Nvidia GeForce RTX 4060 is available on June 29, 2023, for an MSRP of $299 / £289 / AU$479, which is about 10% less than the RTX 3060 was when it launched in 2021.
There is a caveat to this pricing in that there is no Nvidia Founders Edition of the RTX 4060, so it is only available from third-party partners like Asus, PNY, and others. These manufacturers can charge whatever they want for the card, so you can expect to see many of the cards priced higher than Nvidia's MSRP, but there will be those like the Asus RTX 4060 Dual that I tested for this review that will sell at MSRP.
While this card is cheaper than most, it's not the cheapest of the current generation. That would be the AMD Radeon RX 7600, with an MSRP of $269.99 (about £215/AU$405), which still offers the best performance-to-price value of any of the current-gen cards. Still, given the actual level of performance you get from the RTX 4060, it definitely offers a compelling value over its rival cards, even if they are cheaper in the end.
Nvidia GeForce RTX 4060: Features and chipset
(Image credit: Future / John Loeffler)
3rd-gen ray tracing and 4th-gen tensor cores
Only 8GB VRAM
DLSS 3 with Frame Generation under $300
In terms of specs, the Nvidia GeForce RTX 4060 is a marked improvement over the Nvidia RTX 3060 thanks to a smaller TSMC 5nm process node compared to the RTX 3060's 8nm Samsung node. It also features much faster clock speeds, with a roughly 39% faster base and boost clock speed.
You also have a faster memory speed, but a smaller VRAM pool and a narrower memory bus, so you end up with roughly 25% less memory bandwidth, which really puts a ceiling on higher-resolution performance.
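That roughly 25% figure checks out against the publicly listed specs (quoting those from memory here rather than from this review's data): the RTX 3060's 192-bit bus at 15 Gbps versus the RTX 4060's 128-bit bus at 17 Gbps.

```python
# Memory bandwidth comparison using publicly listed specs (not this review's data).
def bandwidth_gbs(effective_gbps: float, bus_width_bits: int) -> float:
    return effective_gbps * bus_width_bits / 8

rtx_3060 = bandwidth_gbs(15.0, 192)   # 360.0 GB/s
rtx_4060 = bandwidth_gbs(17.0, 128)   # 272.0 GB/s
print(f"reduction: {(1 - rtx_4060 / rtx_3060) * 100:.0f}%")   # ~24%, i.e. roughly 25% less
```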
Still, with faster clock speeds, more mature ray tracing and tensor cores, and a lower TGP than its predecessor, this is one of the most powerful and energy-efficient graphics cards in its class.
Nvidia GeForce RTX 4060: design
(Image credit: Future / John Loeffler)
There is no reference design for the Nvidia RTX 4060, since there is no Founders Edition, so the design of the card is going to depend entirely on which version you get from which manufacturer.
In my case, I received the Asus GeForce RTX 4060 Dual OC edition, which features a dual fan design and a much smaller footprint befitting a midrange card. Thankfully, the card uses an 8-pin power connector, so there's no need to fuss with any 12VHPWR adapter cables.
(Image credit: Future / John Loeffler)
It comes with the now-standard three DisplayPort 1.4 and one HDMI 2.1 video outputs on this generation of Nvidia cards, so those with one of the best USB-C monitors will once again be out of luck here.
The card is a dual-slot width, so you shouldn't have any issues getting it into a case, and it's light enough that you really should be able to get away without having to use a support bracket.
Nvidia GeForce RTX 4060: Performance
(Image credit: Future / John Loeffler)
Best-in-class 1080p gaming performance
Huge improvement if coming from RTX 2060 or older
Test system specs
This is the system we used to test the Nvidia GeForce RTX 4060:
When it comes to 1080p, the Nvidia RTX 4060 offers the best gaming performance under $300.
The AMD RX 7600 gives it a run for its money in pure rasterization performance, and even manages to beat out the RTX 4060 on occasion, but once you start cranking up ray tracing the RTX 4060 absolutely pulls away from its rivals.
This is especially true when you flip the switch on DLSS, which makes 1440p gaming a very feasible option with this card. While this definitely isn't going to be one of the best 1440p graphics cards, on certain titles with certain settings, you'll be surprised what you can get away with.
Synthetic Benchmarks
When it comes to synthetic benchmarks, you get the typical blow-for-blow between Nvidia and AMD cards that we've seen in the past, with AMD outperforming on pure rasterization tests like 3DMark Time Spy and Firestrike, while Nvidia pulls ahead on ray tracing workloads like Port Royal and Speedway.
The RTX 4060 and RX 7600 are close enough in terms of raw performance that it might as well be a wash on average, but it's worth noting that the RTX 4060 is about 20% better on average than the RTX 3060. I point that out mostly to contrast it with the RTX 4060 Ti, which was only about 10-12% better than the RTX 3060 Ti on average.
A 20% improvement gen-on-gen, on the other hand, is much more respectable and justifies considering the RTX 4060 as an upgrade even with an RTX 3060 in your rig. You might not actually make that jump for an extra 20% performance with this class of GPU, but it's at least worth considering, unlike with the RTX 4060 Ti.
Gaming Benchmarks
Where the RTX 4060 really takes off though is in gaming performance. Compared to the RX 7600, it's more or less even when just playing at 1080p with max settings without ray tracing or upscaling. Notably, the RTX 4060 actually underperforms the RX 7600 by about 9% in Cyberpunk 2077 when you're not using ray tracing or upscaling.
Crank ray tracing up to Psycho in Cyberpunk 2077 though, and the value of the RTX 4060 really starts to show through. The RX 7600 absolutely tanks when RT is maxed, but that's not universal across the board. In other games, the RX 7600 is competitive, but Cyberpunk 2077 really is AMD's Achilles' Heel. Meanwhile, the RTX 3060 holds up fairly well on some titles, while the RTX 4060 pulls ahead by a substantial amount on others.
With upscaling turned on, the RTX 4060 manages to substantially outperform both the RTX 3060 and the RX 7600. With the base DLSS settings and without touching frame generation, the RTX 4060 pulls off a clean win in Cyberpunk 2077; and while it has a slightly lower average framerate than the RTX 3060 in places, its higher minimum framerate makes for a much more stable experience across the board.
Once you turn on frame generation though, things swing dramatically in the RTX 4060's favor. You can even increase the resolution in Cyberpunk 2077 to 1440p with Frame Generation on and you'll get more fps on average and at a minimum than you would with the RTX 3060 at 1080p, while the RX 7600 simply can't keep up at this level.
Unfortunately, a lot of this is dependent on developers implementing Nvidia's new technology. Without DLSS 3 with Frame Generation, you still get respectably better performance than the RTX 3060, but nothing that absolutely blows you away.
Meanwhile, the RX 7600 offers a compelling alternative if you're looking to save some money and don't care about 1440p or ray tracing.
Still, if you can toggle a setting and give yourself an extra 50 fps on a demanding game, there really is no comparison, and on this alone, the RTX 4060 wins out by default.
Should you buy the Nvidia GeForce RTX 4060?
(Image credit: Future / John Loeffler)
Buy it if...
You want the best 1080p gaming under $300 This card is a 1080p champ in its weight class, even if it walks right up to the line of the middle midrange.
You want fantastic ray tracing support Nvidia pioneered real-time ray tracing in games, and it really shows here.
Don't buy it if...
You want the best value While the RTX 4060 is very well-priced, the AMD RX 7600 offers a much better price-to-performance ratio.
You don't care about ray tracing or upscaling Ray tracing is honestly overrated and a lot of games don't offer or even need upscaling, so if you don't care about these features, Nvidia's RTX 4060 might not offer enough for you to spend the extra money.
Nvidia GeForce RTX 4060: Also consider
AMD Radeon RX 7600 Team Red's competing midrange card is a fantastic value while offering compelling 1080p performance (so long as ray tracing and upscaling aren't your biggest concerns).
Nvidia GeForce RTX 3060 Ti With graphics card prices for the Nvidia RTX 3000-series continuing to come down, it's possible that this card might come in close to the RTX 4060's MSRP, and with its better performance, it offers a compelling alternative.
How I tested the Nvidia GeForce RTX 4060
I looked at the card's gaming performance and raw synthetic performance
I used our standard battery of graphics card tests and several current PC games to push the GPU to its limits.
I spent extensive time testing the RTX 4060 over a number of days, using synthetic tests like 3DMark and Passmark, while also running several games on the card at different settings and resolutions.
I also tested its closest rival card as well as the card it is replacing in Nvidia's product stack and compared the performance scores across the cards to assess the card's overall performance.
I did this using the latest Nvidia and AMD drivers on a test bench using all of the same hardware for each card tested so that I could isolate the graphics card's contribution to the overall performance I found in-game or in synthetic benchmarks.
• Original review date: April 2023
• Launch price: MSRP at $599 / £589 / AU$1,109
• Lowest price now: $792.99 / £485.28 / AU$899
Update – April 2025: With the release of the Nvidia GeForce RTX 5070, you'd hope that it would force the price of the RTX 4070, one of the best graphics cards of the last generation, down somewhat, but that doesn't appear to be the case in the US (UK and Australian shoppers can actually find this card for less than RRP right now).
With the lowest price I've found for the RTX 4070 in the US coming in at just under $800, it's a much harder card to recommend in 2025, especially with AMD Radeon RX 7800 XT cards selling for much less and much more readily available online.
Original unedited review follows...
Nvidia GeForce RTX 4070: Two-minute review
The Nvidia GeForce RTX 4070 is here at long last, and for gamers who've been starved for an upgrade, go ahead and pick this one up. It can do just about everything.
It's hard to follow up the RTX 3070, one of the best graphics cards of all time. In our Nvidia GeForce RTX 3070 review, we praised that card for being an outstanding performer at 1080p and 1440p (where the overwhelming majority of PC gamers play) while also being a much more affordable option than the other two launch cards in Nvidia's Ampere lineup. We especially noted how the RTX 3070 offered comparable performance to the RTX 2080 Ti for half the price.
Everything we said about the RTX 3070 applies just as easily to the RTX 4070, only now it doesn't just dabble in 4K; it can competently game at every resolution, making it a graphics card that everybody can fall in love with without spending a fortune.
A lot has changed since the RTX 3070 launched towards the end of 2020, and unfortunately, not everything changed for the better. Things are more expensive pretty much everywhere you look, and the Nvidia RTX 4070 isn't immune. At $599 (about £510 / AU$870), the RTX 4070 is fully 20% more expensive than the RTX 3070 was at launch.
(Image credit: Future / John Loeffler)
I'm not happy about this at all, and you shouldn't be either, but all you have to do is look at the scores the RTX 4070 puts up on the board and you'll be as hard-pressed as I am to dock it any points for this. It consistently puts out RTX 3080-level performance more or less across the board and even manages to bloody the nose of the Nvidia GeForce RTX 3080 Ti, and while the RTX 3080 beats out the RTX 4070 at native 4K, turn on DLSS and the RTX 3080 simply gets blown out.
On the other side of the aisle, the AMD Radeon RX 7900 XT is Team Red's nearest real competition, and it struggles to justify itself in the presence of the RTX 4070. While the RX 7900 XT solidly outperforms the 4070, it's also 50% more expensive, and the benefits of the RX 7900 XT get quickly drowned out by the power of DLSS, especially in titles with DLSS 3.
Moreover, the RTX 4070 makes for a pretty competent creator GPU, offering indie developers and artists who don't have the funding to get themselves an Nvidia GeForce RTX 4090 a handy option for getting some work done within a more limited budget. It's not going to power a major movie studio or anything, but if you're dabbling in 3D modeling or video editing, this card is a great compromise between price and performance.
Finally, wrap this all into a package that feels like a downright normal graphics card from ye olden days, back before you needed to include support brackets and ballast to keep your gaming PC from tipping over, and you end up with a graphics card that can easily power some of the best gaming PCs while actually fitting into your PC case and your budget.
This graphics card has its issues, which is inevitable, but given what's on offer here, it's easy enough to look past its shortcomings and enjoy some truly outstanding performance at a reasonable enough price.
Nvidia GeForce RTX 4070 review: Price & availability
Third-party cards' retail prices will match or exceed Nvidia's MSRP
The Nvidia GeForce RTX 4070 is available starting April 13, 2023, with an MSRP of $599 (about £510 / AU$870). Third-party partners will have their own versions of the RTX 4070 that will vary in price, but they will always have a matching or higher regular retail price than the Nvidia GeForce RTX 4070 Founders Edition.
Notably, the RTX 4070 is getting a 20% price increase over the card it's replacing, the RTX 3070, which had a launch price of $499 in the US (about £425 / AU$725). While we'd have loved to see the price stay the same gen-over-gen, this should come as no surprise to anyone who has been watching GPU price inflation recently.
Meanwhile, we haven't seen AMD's direct RTX 4070 competitor yet, the AMD Radeon RX 7800 XT, but the AMD Radeon RX 7900 XT is the closest AMD has this generation with an $899 / £799 (around AU$1,350) MSRP, putting it 50% more expensive than the RTX 4070.
This card is also the same price as the Nvidia GeForce RTX 3070 Ti, for what it's worth, and considering that the RTX 4070 punches well above the 3070 Ti's performance, you do at least get a better sense of value out of this card than anything from the last generation.
Price score: 4 / 5
Nvidia GeForce RTX 4070 review: Features & chipset
(Image credit: Future / John Loeffler)
DLSS 3 with full Frame Generation
Third-gen Ray Tracing Cores and fourth-gen Tensor Cores
Lower TGP than RTX 3070
The Nvidia RTX 4070 doesn't change too much on paper over its last-gen predecessor, featuring the same number of streaming multiprocessors, therefore the same number of CUDA cores (5,888), ray-tracing cores (46), and tensor cores (184).
It does bump up its memory to the faster GDDR6X and adds 50% more VRAM for a total of 12GB. With a 192-bit bus and a memory clock of 1,313MHz, the RTX 4070 has an effective memory speed of 21 Gbps, equal to that of the Nvidia RTX 4070 Ti, for a memory bandwidth of 504.2 GB/s.
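Working through those figures (and assuming the usual convention that the listed GDDR6X memory clock maps to the effective per-pin rate by a factor of 16, which is consistent with the numbers quoted here), the arithmetic looks like this:

```python
# RTX 4070 memory figures as quoted above, derived from the listed memory clock.
memory_clock_mhz = 1313
effective_gbps = memory_clock_mhz * 16 / 1000         # ~21.0 Gbps effective per pin (GDDR6X)
bus_width_bits = 192
bandwidth_gbs = effective_gbps * bus_width_bits / 8   # ~504.2 GB/s
print(f"{effective_gbps:.1f} Gbps effective, {bandwidth_gbs:.1f} GB/s bandwidth")
```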
It has a lower base and boost frequency than the 4070 Ti, clocking in at 1,920MHz base and 2,475MHz boost (compared to 2,310MHz base and 2,610MHz boost for the 4070 Ti), but this is a substantial bump up from the 1,500MHz base and 1,725MHz boost frequency of the RTX 3070.
This is owing to the 5nm TSMC process used to fab the AD104 GPU, compared to the Samsung 8nm process for the RTX 3070's GA104. Those faster clocks also power next-gen ray tracing and tensor cores, so even though there are the same number of cores in both the RTX 4070 and the RTX 3070, the RTX 4070's are both much faster and more sophisticated.
Also factor in Nvidia Lovelace's DLSS 3 with Frame Generation capability, something that Nvidia Ampere and Turing cards don't have access to, and what look like two very similar cards on paper turn out to be anything but in practice.
Finally, thanks to the 5nm process, Nvidia is able to squeeze more performance out of less power, so the TGP for the RTX 4070 is just 200W, making it a fantastic card for a lower-power, sub-600W build.
Features & chipset: 5 / 5
Nvidia GeForce RTX 4070 review: Design
(Image credit: Future / John Loeffler)
Same size as the RTX 3070
16-pin power connector
Same design as RTX 4090 and RTX 4080
With the RTX 4070 Founders Edition, Nvidia finally delivers a next-gen graphics card that can actually fit in your case without requiring a construction winch to hold it in place.
OK, the previous cards weren't that bad, and even at the reduced form factor and weight, you'll still want to toss a GPU bracket into your case for good measure (there's no harm in protecting your investment, after all).
(Image credit: Future / John Loeffler)
But holding the RTX 4070 in my hand, this is the first card of this generation that doesn't feel like a piece of machinery. Even the more modestly-sized AMD Radeon RX 7900 XTX and RX 7900 XT feel substantial, while the RTX 4070 feels like an old school GeForce graphics card from a couple years back.
The RTX 4070 Founders Edition keeps the same fan design as the RTX 4090 and RTX 4080 that preceded it (a fan on the front and the back), but it shrinks everything down to a dual-slot card about two-thirds the size of those monsters. The RTX 4070 also features the same outputs as previous RTX Lovelace cards (so no USB-C out), and a 16-pin power connector with an included adapter for two 8-pin leads to power the card.
With a TGP of 200W, Nvidia could theoretically have gone with a single 8-pin connector, but Team Green seems absolutely committed to the 12VHPWR cable. I'll never stop complaining about this, but it is what it is. If you have an ATX 3.0 power supply, you won't need to worry about it, but the rest of us will have to deal with additional cable management.
Design score: 4.5 / 5
Nvidia GeForce RTX 4070 review: Performance
(Image credit: Future / John Loeffler)
Phenomenal gaming performance
Can easily push 60 fps in 4K gaming with DLSS
RTX 3080 performance at 60% of the power
Right out the gate, let's just say that the Nvidia RTX 4070 is the best 1440p graphics card on the market right now, and it's likely to remain at the top of that list for a good long while.
Its performance prowess isn't limited to just 1440p, mind you, and when I get into the gaming performance, you'll see that its 4K gaming potential is exciting (with caveats), but for starters, we can dig into its synthetic performance in tests like 3DMark to see how the fundamentals stack up.
General Performance
As you can see, the RTX 4070 outperforms the RTX 3070 by about 21% overall, while underperforming the RTX 3080 by about 1.37%, which is close enough to effectively tie the last-gen 4K powerhouse, and it trails the RTX 3080 Ti by about 6%. Considering that the RTX 3080 Ti's MSRP is nearly twice that of the RTX 4070, this is an astounding result.
The RTX 4070 does lag behind the RTX 4070 Ti and the RX 7900 XT by quite a bit, averaging about 22% worse performance than the RX 7900 XT and about 13.5% worse performance than the RTX 4070 Ti. These current-gen cards also have substantially better hardware, so this isn't unexpected.
Creative Performance
When it comes to creative performance, we have a more limited dataset to work with, since Blender Benchmark 3.5.0 decided it only wanted to run on half the cards I tried it on (including refusing to run on the RTX 4070), so we'll have to come back to that one at a later date once the benchmark is updated.
In the meantime, the tests I was able to run really showcased how well the RTX 4070 can handle creative workloads. On Adobe Premiere and Adobe Photoshop, the RTX 4070 performed noticeably better than the RTX 3080 across both apps and fell in very close behind the RTX 4070 Ti for an overall second place finish.
In lieu of Blender's benchmark, V-Ray 5 is a fairly good stand-in, as well as an excellent benchmark in its own right. Here, the RX 7900 XT wouldn't run, since it doesn't support CUDA or Nvidia's RTX, but we can see the RTX 4070 coming in as a respectable runner-up to the RTX 4070 Ti.
One of my recent favorite workloads, Lumion 12.5, renders an architectural design into either a short movie clip at 1080p or 4K at 60 fps, making it one of the best benchmarks for creatives to see how a graphics card handles production level workloads rather than synthetic tests.
It requires the same kind of hardware as many of the best PC games in order to light a scene, create realistic water effects, and reproduce foliage on trees, and it's the kind of real-world benchmark that tells more about the card than a simple number devoid of context.
Considering that it can take an hour to render a five-second, 60 fps movie clip at production quality, I switched things up a bit: rather than calculating frames per second, as I do with Handbrake's encoding test, I use frames per hour to give a sense of how long a movie clip you could produce if you left it to render overnight (a common practice).
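To make the conversion concrete: a five-second clip at 60 fps is 300 frames, so a card that needs a full hour to finish it is producing 300 frames per hour. A tiny worked example (the render time here is a hypothetical figure, not a measured result):

```python
# Frames-per-hour for a render job: total frames divided by render time in hours.
clip_seconds = 5
clip_fps = 60
frames_in_clip = clip_seconds * clip_fps        # 300 frames

render_time_hours = 1.0                         # hypothetical: one hour to render the clip
frames_per_hour = frames_in_clip / render_time_hours
print(f"{frames_per_hour:.0f} frames per hour") # -> 300
```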
In the case of the RTX 4070, it rendered a five-second movie clip at 60 fps at draft (1-star) quality 13% faster than the RTX 3080, about 7% faster than the RTX 3080 Ti, and nearly 23% faster than the RX 7900 XT.
It lagged behind the RTX 4070 Ti, though, by about 8%, a deficit that grew wider at 1080p production (4-star) quality, where the RTX 4070 rendered the movie 25% slower than the 4070 Ti and 6.78% slower than the RX 7900 XT.
For Handbrake, the RTX 4070 manages to pull out its first clean win on the creative side, though not by a whole lot. Still, 170 frames per second encoding from 4K to 1080p is not bad at all.
Overall then, the RTX 4070 puts in a solid creative performance, besting the RTX 3080, the RX 7900 XT, the RTX 3070 Ti, and the RTX 3070, while barely losing out to the RTX 3080 Ti.
Gaming Performance
(Image credit: Future / John Loeffler)
As good of a creative card as the RTX 4070 is, in its bones, this is a gamers' graphics card, so gaming performance is definitely where I spent most of my time testing the RTX 4070. I want to note that the included figures here are a representative sample of my testing, and that not all test results are shown.
When it comes to gaming performance, the RTX 4070 offers some of the best you're going to get at this price, though there are some stipulations to bring up right out of the gate.
First, broadly speaking, this card can game at 4K on most games not called Cyberpunk 2077 or Metro: Exodus using max settings natively, so long as you keep things within reasonable limits. Or, really, one limit: keep ray tracing turned off.
Overall, the RTX 4070 gets about 58 fps on average at 4K when not ray tracing, with a floor of 45 fps at 4K, which is eminently playable. Turn ray tracing to the max and you get an average fps of 34 with a floor of 25, which is just better than a slideshow.
The RTX 3080 doesn't fare too much better on this metric, managing 40 fps on average with a floor of 29 fps at max settings with ray tracing turned on, while the RTX 3080 Ti averages about 36 fps and a floor of 19 fps. This does put the RTX 4070 just behind the 3080 Ti in terms of average fps and with a higher fps floor than the 3080 Ti.
If you're dead set on ray tracing, the RTX 4070 can certainly deliver, thanks to DLSS, which can bump those numbers back up to 79 fps on average with a floor of 55 fps. Compare that to the RTX 3080's 80 fps average with a 58 fps floor in our tests and the RTX 4070 can definitely go toe to toe with the RTX 3080 when ray tracing on max settings if DLSS is on.
In addition, the RTX 4070 gets about 10% lower fps on average than the RTX 3080 Ti at 4K with ray tracing and DLSS on (79 fps to the 3080 Ti's 88 fps), and a roughly 14% lower fps floor than the RTX 3080 Ti (55 fps to the 3080 Ti's 64 fps).
Overall, the RTX 4070 manages an average 57 fps at 4K, with a floor of 41 fps, across all the settings I tested. This is about 28% lower than the RTX 4070 Ti (79 fps average, overall), about 10% lower than the RTX 3080 (63 fps average, overall), the RX 7900 XT (64 fps average, overall), and the RTX 3080 Ti (64 fps average, overall).
These numbers skew a bit against the RTX 4070, since the RTX 4070 Ti, RX 7900 XT, RTX 3080, and RTX 3080 Ti all handle native 4K gaming much better, but so few people play at native 4K anymore that this is a fairly meaningless advantage.
Meanwhile, the RTX 4070 actually beats the RX 7900 XT by about 20% when using DLSS (versus the RX 7900 XT's FSR) at 4K with max settings and ray tracing; 79 fps on average to 66 fps on average, respectively. It also manages to strike a dead heat with the RTX 3080 (80 fps average) and come just 10% short of the RTX 3080 Ti's average performance at 4K with ray tracing.
It's important to note as well that these numbers don't factor in DLSS 3 Frame Generation, in order to keep the comparison fair.
As for the RTX 3070, the RTX 4070 manages about 39% better average 4K performance, with a 53% higher fps floor (57 fps average with a 43 fps floor for the RTX 4070 compared to the RTX 3070's 41 fps average and 28 fps floor).
When it comes to 1440p gaming, the RTX 4070 is on much more solid footing, even if some of the bigger cards definitely perform better in absolute terms. The RTX 4070 underperforms the RTX 3080 by about 8% in non-ray-traced, non-upscaled 1440p gaming, on average (105 fps to the RTX 3080's 115 fps), though they both have a very similar floor around 80-85 fps.
Meanwhile, the RTX 4070 falls about 12% short of the RTX 3080 Ti's 119 average fps at non-ray-traced, non-DLSS 1440p.
Both the RTX 4070 Ti and RX 7900 XT kinda clobber the RTX 4070 with roughly 25-29% better performance at non-ray-traced, non-upscaled 1440p gaming, and this carries over into gaming with ray tracing settings maxed out, though the RTX 4070 is still getting north of 60 fps on average (67 fps, to be precise), with a relatively decent floor of 51 fps.
The real kicker though is when we turn on DLSS, at which point the RTX 4070 beats out everything but the RTX 4070 Ti and RTX 3080 Ti, including the RX 7900 XT, which it outperforms by about 29% on average (125 fps to 97 fps), with a much higher floor of 88 fps to the RX 7900 XT's 60 fps, a nearly 49% advantage.
The RTX 4070 beats the RTX 3080 here too, with about 5% better performance on average and a 7.5% higher fps floor on average than the RTX 3080. Incredibly, the RTX 4070 is just 3% slower than the RTX 3080 Ti when both are using DLSS at 1440p with max ray tracing.
As for the RTX 3070, the RTX 4070 gets about 35% better performance at 1440p with ray tracing and DLSS 2.0 than the card it replaces (125 fps to 93 fps), with a nearly 53% higher fps floor on average (87 fps to the 3070's 57 fps), meaning that where the RTX 3070 is setting the 1440p standard, the RTX 4070 is blowing well past it into territory the RTX 3070 simply cannot go.
The story is pretty much the same at 1080p, with there being essentially no difference between the RTX 4070, the RTX 3080, the RTX 3080 Ti, and the RX 7900 XT, with the RTX 3070 languishing about 30% behind and the RTX 4070 Ti off on its own out ahead of everyone else.
There has been a lot of talk about the RTX 4070 ahead of its launch as benchmarks have leaked, with people looking at numbers out of context and downplaying the performance of the RTX 4070 based on one or two tests. They've even pointed to the price increase to say that this card is a disappointment.
Granted, I'm not thrilled about the 20% price increase either, but there's no getting around the fact that you're getting a graphics card here with just 200W TGP that's putting up numbers to rival the RTX 3080 Ti. And I haven't even touched on the new features packed into Lovelace that you can't get with the last-gen Nvidia graphics cards.
The numbers are what they are, and the RTX 4070's performance is simply outstanding across every resolution in all the ways that matter.
Performance score: 5 / 5
Should you buy the Nvidia GeForce RTX 4070?
(Image credit: Future / John Loeffler)
Buy it if...
You want next-gen performance for less than $600: The Nvidia RTX 4070 offers performance on par with the RTX 3080 and even the RTX 3080 Ti for a good deal less.
You don't want a massive GPU: Graphics cards are starting to resemble transformers nowadays (both the Autobot and power plant variety), so it's nice to get a graphics card that's just normal-sized.
You want next-gen features like DLSS 3: Nvidia's hardware is often on the bleeding edge of the industry, and things like DLSS 3 and Nvidia Reflex are Nvidia's not-so-secret force multiplier here.
Don't buy it if...
You can get an RTX 3080 cheap: Generally, the RTX 4070 is going to outperform the 3080, but if you don't care about the advanced features and can grab the 3080 in a bargain bin, you could save some money.
You're looking for Nvidia's next budget card: The RTX 4070 is a lot cheaper than the rest of the current-gen graphics card lineups from Nvidia and AMD, but at $600, it's still too expensive to truly be a "budget" GPU.
Nvidia GeForce RTX 4070 review: Also consider
If our Nvidia GeForce RTX 4070 review has you considering other options, here are two more graphics cards to consider...
AMD Radeon RX 7900 XT: While its bigger sibling gets a lot more attention, don't sleep on the RX 7900 XT. It's one of the best graphics cards AMD has ever produced, and while it's a good bit more expensive than the RTX 4070, it's powerful and future-proofed enough for 8K gaming that you'll be able to get a lot of use out of this card in the long term.
Nvidia GeForce RTX 4070 Ti: The Nvidia RTX 4070 Ti doesn't have a Founders Edition, so it's going to be more expensive than its $799 MSRP, but the performance on offer here makes this an excellent alternative to the RTX 4070 if you've got some extra cash to spend.
When I test a graphics card, I start by making sure that all tests are performed on the same test bench setup to isolate GPU performance. I then run it through a series of synthetic benchmarking tools like 3DMark as well as in-game benchmarks in the most recent PC games I can access like Cyberpunk 2077 and F1 2022.
I run everything on the maximum settings possible without upscaling tech, and I run all tests at the resolution a reader is most likely to use a given card at. In the case of the RTX 4070, this meant testing at 1080p, 1440p, and 2160p.
I also make sure to install the latest relevant drivers and rerun tests on any competing graphics card that I might have already reviewed and tested, like the RTX 4070 Ti, RX 7900 XT, and RTX 3080 to make sure that I have the most current scores to account for any driver updates. All of these scores are recorded and compared against the card's predecessor, its most direct rival, and the card directly above and below it in the product stack, if those cards are available.
I then average these scores to come to a final overall score and divide that by the card's MSRP to see how much performance every dollar or pound spent actually gets you, which tells me how much value the card brings to the table.
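As a rough illustration of that value calculation (a minimal sketch; the scores and prices below are placeholders, not results from this review):

```python
# Hypothetical sketch of the score-per-dollar value metric described above.
# Benchmark scores and MSRPs here are placeholders, not review results.

def value_score(benchmark_scores: list[float], msrp: float) -> float:
    """Average the benchmark results, then divide by the card's MSRP."""
    average = sum(benchmark_scores) / len(benchmark_scores)
    return average / msrp

card_a = value_score([95.0, 88.0, 91.0], msrp=599.0)   # e.g. a $599 card
card_b = value_score([99.0, 97.0, 98.0], msrp=1199.0)  # e.g. a $1,199 card

print(f"Card A: {card_a:.4f} points per dollar")
print(f"Card B: {card_b:.4f} points per dollar")
```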
Finally, I actually use the card in my own personal computer for several days, playing games, using apps like Adobe Photoshop or Adobe Illustrator, and watching for any anomalies, crashes, glitches, or visual disruptions that may occur during my time with the card. Having extensively covered and tested many graphics cards over the years, I know what a graphics card should do and how it should perform, and can readily identify when something is not performing up to expectations and when it exceeds them.
• Original review date: October 2022
• Launch price: MSRP at $1,599 / £1,649 / AU$2,959
• Lowest price now: $2,459 / £1,999.99 / AU$2,999
Update – April 2025: The Nvidia RTX 4090 is currently the second most powerful 'consumer' graphics card on the market today, so it's definitely still worth buying after the release of the Nvidia GeForce RTX 5090.
Whether it's gaming or creative work, this is one of the best graphics cards you can get, capable of some of the fastest 4K framerates around with creative chops second only to the RTX 5090.
Its market price is way higher than its launch MSRP (often close to double its launch price, at least in the US), but if you're in the market for an RTX 4090, chances are money isn't as big of a concern as it is further down the premium GPU stack, and this might be an excellent alternative to the RTX 5090, whose price right now is simply offensive. Your biggest problem, though, is going to be finding this card: most retailers selling new cards are completely sold out and aren't expecting restocks, so you might have to look to the RTX 5090 or RTX 5080 instead.
Original unedited review follows...
Nvidia GeForce RTX 4090: two minute review
Well, the Nvidia GeForce RTX 4090 is finally here, and there's no question that it delivers on many of the lofty promises made by Nvidia ahead of its launch, with stunning gen-on-gen performance improvements that are more akin to a revolution than an advance.
That said, you won't find four-times performance increases here, and only in some instances will you see a 2x increase in performance over the Nvidia GeForce RTX 3090, much less the Nvidia GeForce RTX 3090 Ti. A 50% to 70% increase in synthetic and gaming performance should be expected across the board, though, with very rare exceptions where the GPU runs too far ahead of the CPU.
On the creative side of things, this card was made to render, completely lapping the RTX 3090 in Blender Cycles performance, which makes this the best graphics card for creatives on the market, period, hands down.
On the gaming side, this is the first graphics card to deliver fully native 4K ray-traced gaming performance at a very playable framerate, without the need for DLSS, showing the maturity of Nvidia's third-generation ray tracing cores.
Even more incredible, Nvidia's new DLSS 3 shows further promise, delivering substantially faster framerates than the already revolutionary DLSS 2.0. And while we did not test DLSS 3 as extensively as we did the RTX 4090's native hardware (for reasons we'll explain in a bit), from what we've seen, Nvidia's new tech is probably an even more important advance than anything having to do with the hardware.
On the downside, the card does require even more power than its predecessor, and when paired with something like the Intel Core i9-12900K, you're going to be pulling close to 700W of power between these two components alone. Worse still, this additional power draw requires some very strategic cable management, and for a lot of builders, this is going to be a hard card to show off in a case with a bundle of PCIe cables in the way.
The price has also increased over its predecessor, though given its incredible performance and the price of the previous graphics card champ, the RTX 3090 Ti, the RTX 4090 offers far more performance for the price than any other card on the market other than the Nvidia GeForce RTX 3080 and Nvidia GeForce RTX 3080 Ti. So even though the Nvidia RTX 4090 is a very expensive card, what you are getting for the price makes it a very compelling value proposition if you can afford it.
In the end, the Nvidia GeForce RTX 4090 is definitely an enthusiast graphics card in terms of price and performance, since the level of power on offer here is really overkill for the vast majority of people who will even consider buying it. That said, if you are that enthusiast – or if you are a creative or a researcher who can actually demonstrate a need for this much power – there isn't much else to say but to buy this card.
It is more powerful than many of us ever thought it could be, and while I'd definitely argue that the Nvidia GeForce RTX 4070 Ti or AMD Radeon RX 7900 XTX is the better purchase for gamers given their price, the RTX 4090 was always going to be a card for the early adopters out there, as well as creatives who are out to spend the company's money, not their own, and the RTX 4090 will give you everything you could want in an enthusiast graphics card.
Nvidia GeForce RTX 4090: Price & availability
(Image credit: Future)
How much is it? MSRP listed at $1,599 (about £1,359, AU$2,300)
When is it out? It is available October 12, 2022.
Where can you get it? Available in the US, UK, and Australia.
The Nvidia GeForce RTX 4090 goes on sale worldwide on October 12, 2022, with an MSRP of $1,599 in the US (about £1,359/AU$2,300).
This is $100 more than the MSRP of the RTX 3090 when it was released in September 2020, but is also about $400 less than the MSRP of the RTX 3090 Ti, though the latter has come down considerably in price since the RTX 4090 was announced.
And while this is unquestionably expensive, this card is meant more as a creative professional's graphics card than it is for the average consumer, occupying the prosumer gray area between the best gaming PC and a Pixar workstation.
Of course, third-party versions of the RTX 4090 are going to cost even more, and demand for this card is likely to drive up the price quite a bit at launch, but with the crash of the crypto bubble, we don't think we'll see quite the run-up in prices that we saw with the last generation of graphics cards.
Finally, one thing to note is that while this is an expensive graphics card, its performance is so far out ahead of similarly priced cards that it offers a much better price-to-performance value than just about any other card out there, and it is far ahead of its immediate predecessors in this regard. Honestly, we don't really see this kind of price-to-performance ratio outside of the best cheap graphics cards, so this was definitely one of the biggest surprises coming out of our testing.
Value: 4 / 5
Nvidia GeForce RTX 4090: features & chipset
(Image credit: Future)
4nm GPU packs in nearly three times the transistors
Substantial increase in Tensor Cores
Third generation RT Cores
Nvidia GeForce RTX 4090 key specs
GPU: AD102
CUDA cores: 16,384
Tensor cores: 512
Ray tracing cores: 128
Power draw (TGP): 450W
Base clock: 2,235 MHz
Boost clock: 2,520 MHz
VRAM: 24GB GDDR6X
Bandwidth: 1,018 GB/s
Bus interface: PCIe 4.0 x16
Outputs: 1 x HDMI 2.1, 3 x DisplayPort 1.4a
Power connector: 1 x 16-pin
The Nvidia GeForce RTX 4090 features some major generational improvements on the hardware front, courtesy of the new Nvidia Lovelace architecture. For one, the AD102 GPU uses TSMC's 4nm node rather than the Samsung 8nm node used by the Nvidia Ampere GeForce cards.
The die size is 608mm², so a little bit smaller than the 628mm² die in the GA102 GPU in the RTX 3090, and thanks to the TSMC node, Nvidia was able to cram 76.3 billion transistors onto the AD102 die, a 169% increase in transistor count over the GA102's 28.3 billion.
The clock speeds have also seen a substantial jump, with the RTX 4090's base clock running at a speedy 2,235 MHz, compared to the RTX 3090's 1,395 MHz. Its boost clock also gets a commensurate jump up to 2,520 MHz from 1,695 MHz.
Its memory clock is also slightly faster at 1,325 MHz, up from 1,219 MHz, giving the RTX 4090 a faster effective memory speed of 21.2 Gbps versus the RTX 3090's 19.5 Gbps. This lets the RTX 4090 get more out of the same 24GB of GDDR6X VRAM as the RTX 3090.
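If you want to see how those figures hang together, here's a quick back-of-the-envelope check; it assumes the conventional 16x multiplier between a GDDR6X memory clock and its effective per-pin data rate, along with the RTX 4090's 384-bit memory bus:

```python
# Back-of-the-envelope check of the memory figures quoted above.
# Assumes GDDR6X's conventional 16x clock-to-data-rate multiplier and a
# 384-bit memory bus for the RTX 4090.

memory_clock_mhz = 1325
effective_gbps = memory_clock_mhz * 16 / 1000          # ~21.2 Gbps per pin
bus_width_bits = 384
bandwidth_gb_s = effective_gbps * bus_width_bits / 8   # ~1,018 GB/s

print(f"Effective memory speed: {effective_gbps:.1f} Gbps")
print(f"Memory bandwidth: {bandwidth_gb_s:.0f} GB/s")
```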
When it comes to the number of cores, the RTX 4090 packs in 56% more streaming multiprocessors than the RTX 3090, 128 to 82, which translates into nearly 6,000 more CUDA cores than the RTX 3090 (16,384 to 10,496). That also means the RTX 4090 packs in 46 additional ray tracing cores and an additional 184 Tensor cores, and next-gen cores at that, so they are even better at ray tracing and vectorized computations than their predecessors.
This is immediately apparent when cranking up ray tracing to the max on games like Cyberpunk 2077, and especially when running DLSS 3, which makes the jump to full-frame rendering rather than just the pixel rendering done by earlier iterations of DLSS.
Features & Chipset: 5 / 5
Nvidia GeForce RTX 4090: design
(Image credit: Future)
Yeah, that 16-pin connector is a pain to work with
A little bit thicker, but a little shorter, than the RTX 3090
The Nvidia GeForce RTX 4090 Founders Edition looks very much like its predecessor, though there are some subtle and not-so-subtle differences. First off, this is a heavier card for sure, so don't be surprised that we need to start adding support brackets to our PC builds. It might have been optional in the last generation, but it is absolutely a necessity with the Nvidia RTX 4090.
The Founders Edition does not come with one, but third-party cards will likely include them, and manufacturers are already starting to sell them separately, so we would definitely suggest you pick one up.
Otherwise, the dimensions of the RTX 4090 aren't that much different from those of the RTX 3090. It's a bit thicker than the RTX 3090, but it's a bit shorter as well, so if your case can fit an RTX 3090 FE, it will most likely fit an RTX 4090 FE.
The fans on either side of the card help pull air through the heatsink to cool off the GPU and these work reasonably well, considering the additional power being pulled into the GPU.
Speaking of power, the RTX 4090 introduces us to a new 16-pin connector that requires four 8-pin connectors plugged into an adapter to power the card. Considering the card's 450W TGP, this shouldn't be surprising, but actually trying to work with this kind of adapter in your case is probably going to be a nightmare. We definitely suggest that you look into the new PSUs coming onto the market that support this new connector without needing to resort to an adapter. If you're spending this much money on a new graphics card, you might as well go whole hog and make your life – and cable management – a bit easier.
Design: 4 / 5
Nvidia GeForce RTX 4090: performance
(Image credit: Future)
Unassisted native 4K ray-traced gaming is finally here
Creatives will love this card
So here we are, the section that really matters in this review. In the lead up to the Nvidia GeForce RTX 4090 announcement, we heard rumors of 2x performance increases, and those rumors were either not too far off or were actually on the mark, depending on the workload in question.
(Image credit: Future / InfoGram)
Across our synthetic benchmark tests, the Nvidia RTX 4090 produced eyebrow-raising results from the jump, especially on more modern and advanced benchmarks like 3DMark Port Royal and Time Spy Extreme, occasionally fully lapping the RTX 3090 and running well ahead of the RTX 3090 Ti pretty much across the board.
(Image credit: Future / InfoGram)
This trend continues on to the GPU-heavy creative benchmarks, with the Nvidia RTX 4090's Blender performance being especially notable for more than doubling the RTX 3090 Ti's performance on two out of three tests, and blowing out any other competing 4K graphics card in Cycles rendering.
On Premiere Pro, the RTX 4090 scores noticeably higher than the RTX 3090 Ti, but the difference isn't nearly as dramatic, since PugetBench for Premiere Pro measures full system performance rather than just isolating the GPU. Adobe Photoshop, meanwhile, is a heavily rasterized workload, an area where AMD has had an advantage over the past couple of generations, and that's something we see pretty clearly in our tests.
(Image credit: Future / InfoGram)
Gaming is obviously going to see some of the biggest jumps in performance with the RTX 4090, and our tests bear that out. Most gaming benchmarks show roughly 90% to 100% improved framerates with the RTX 4090 over the RTX 3090, and roughly 55% to 75% better performance than the Nvidia RTX 3090 Ti.
(Image credit: Future / InfoGram)
These numbers are likely to jump even higher when you factor in DLSS 3. DLSS 3 isn't available in any commercially available games yet, but we were able to test it on a couple of special builds of games that will be available shortly after the release of the RTX 4090. A few of these games had in-game benchmarks that we could use to test the performance of DLSS 3 using Nvidia's FrameView tool, and the results showed two to three times better performance on some games than we got using current builds on Steam with DLSS 2.0.
Since we were using special builds and Nvidia-provided tools, we can't necessarily declare these results representative until we are able to test them out on independent benchmarks, but just eyeballing the benchmark demos themselves, we see an obvious improvement to the framerates of DLSS 3 over DLSS 2.0.
Whether the two to three times better performance will hold up after its official release remains to be seen, but as much as DLSS 2.0 revolutionized the performance of the best PC games, DLSS 3 looks to be just as game-changing once it gets picked up by developers across the PC gaming scene. Needless to say, AMD needs to step up its upscaling game if it ever hopes to compete on the high-end 4K scene.
Now, there is a real question about whether most gamers will ever need anything coming close to this kind of performance, and there is such a thing as diminishing returns. Some might find that the native 4K ray tracing is neat, but kind of redundant since DLSS can get you roughly the same experience with an RTX 3090 or even an RTX 3080 Ti, but that's a judgment that individual consumers are going to have to make.
Personally, I think this card is at least approaching the point of overkill, but there's no doubt that it overkills those frame rates like no other.
Performance: 5 / 5
Should you buy an Nvidia GeForce RTX 4090?
(Image credit: Future)
Buy the Nvidia GeForce RTX 4090 if…
You want the best graphics card on the market: There really is no competition here. This is the best there is, plain and simple.
You want native 4K ray-traced gaming: DLSS and other upscaling tech is fantastic, but if you want native 4K ray-traced gaming, this is literally the only card that can consistently do it.
You are a 3D graphics professional: If you work with major 3D rendering tools like Maya, Blender, and the like, then this graphics card will dramatically speed up your workflows.
Don’t buy the Nvidia GeForce RTX 4090 if…
You're not looking to do native, max-4K gaming: Unless you're looking to game on the bleeding edge of graphical performance, you probably don't need this card.
You're on a budget: This is a very premium graphics card by any measure.
Also consider
Nvidia GeForce RTX 3090: The RTX 3090 isn't nearly as powerful as the RTX 4090, but it is still an amazing gaming and creative professional's graphics card and is likely to be very cheap right now.
Nvidia GeForce RTX 3080: The RTX 3080 is a far cry from the RTX 4090, no doubt, but the RTX 3080 currently has the best price-to-performance proposition of any 4K card on the market. If you're looking for the best value, the RTX 3080 is the clear winner here.
AMD Radeon RX 6950 XT: In another universe, AMD would have led the Big Navi launch with the RX 6950 XT. It is a compelling gaming graphics card, offering excellent 4K gaming performance on par with the RTX 3090 and generally coming in at the same price as the RTX 3080 Ti.
The current state of Cobian Backup and Cobian Reflector
What Cobian Backup and Cobian Reflector lack in appealing aesthetics and instant allure, they more than make up for in terms of powerful options.
That’s no real surprise when you realize that these apps come from a sole developer, Luis Cobian, rather than a software house with the resources to make an app both powerful and pretty.
It’s also no shock when you realize that Cobian Backup and its successor, Cobian Reflector, are freeware, too, so these apps very much prize function over form.
Cobian Backup began life in 2000, and it’s still available – although the last update was released in July 2024. Cobian Reflector, its successor, was first released in 2021 and also last updated in July 2024, but the future of the app is uncertain as the developer, Luis Cobian, is unsure about whether to continue the project.
For now, though, Cobian Backup and Reflector are still available and extremely viable options for home and SMB backups.
Plans & pricing
Cobian Backup and Cobian Reflector are both freeware, so you don’t have to pay a penny.
That said, it’s possible to donate to the developer on Cobian’s website. If you’re a power user or a big fan and choose to donate, who knows – it may help convince Cobian to keep working on this app.
Features
Cobian Backup and Reflector both have similar feature sets. They’re dedicated backup apps without the extra features you’ll find elsewhere, like options to create bootable media, wipe drives or manage your PC.
Cobian Backup allows users to back up files, folders, directories or entire disks to any local or FTP destination, and it’s packed with scheduling options – you can choose to back up on particular days of the week or days of the month, or when your PC turns on.
There are compression options, up to 256-bit encryption available to protect backups, and you can filter to include or exclude files and specify pre- or post-backup events for extra customization. That’s ideal if you want to script or customize backups, or run them in conjunction with other tools.
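Those pre- and post-backup hooks can point at any script or executable, so, as a purely hypothetical example, a post-backup event could call a small script like the one below to prune archives older than 30 days from the destination folder (the path, file pattern, and retention window are all invented for illustration):

```python
# Hypothetical post-backup cleanup script: remove backup archives older
# than 30 days. The destination path and retention period are examples only.
import time
from pathlib import Path

DESTINATION = Path(r"D:\Backups\Documents")  # example destination folder
RETENTION_DAYS = 30

cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60

for archive in DESTINATION.glob("*.zip"):
    if archive.stat().st_mtime < cutoff:
        print(f"Removing old backup: {archive.name}")
        archive.unlink()
```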
Cobian Backup also has modules to decrypt and decompress your backups, wipe files securely, and manage backups remotely.
(Image credit: Cobian)
Switch over to Cobian Reflector and you’ll find a similar suite of features with improvements throughout.
In Reflector you can choose from full, incremental and differential backups, and more scheduling options help users take full advantage of incremental backups. Encryption has been improved, with compression and password options available, and there are more filtering options.
Reflector also now includes tools for database optimization and file repair alongside all of the modules that were available in Cobian Backup, and its decompression and decryption options remain a boon for restoring preserved files.
Business users will be pleased to see options to run either app as a service or using System or standard user accounts, and you can back up to your network – handy for those using NAS devices.
For home and SMB backups, it has a solid range of features, especially for a freeware app. But open up a paid-for rival, like tools from EaseUS or Paragon, and you’ll find options to back up entire operating systems, data from specific apps, or smartphones. It’s also common to find tools to clone drives, create bootable media, and mount or unmount images.
Still, Cobian Backup and Reflector are both free apps, so it’s worth tempering your expectations and remembering that both offer an excellent range of core functionality.
Interface & use
(Image credit: Cobian)
Cobian Backup and Reflector both have straightforward, unfussy interfaces. At the top of each app is a large row of icons for starting, pausing, creating and cancelling backup jobs, and a big icon for the options menu.
On the left is a pane with your list of tasks, and if you click those tasks, a pane on the right shows its properties. If you run a backup, its progress is shown in that pane on the right-hand side, and a graphical representation of its progress appears at the bottom of the window.
Opt to create a new backup, and a wizard-style window will guide you through the process, from files and destinations to archival, filtering, and encryption settings.
It’s straightforward, and Cobian Reflector uses the same system – albeit with a slightly cleaner and updated visual style.
We tested our latest slate of backup apps with a 42GB document folder, a 2.5GB spreadsheet folder, a 162GB folder of media and an 82GB file that mixes all of those file types. We backed them up to three different SSDs to weed out any inconsistency.
These two apps may be free, but they’re not fast. Cobian Backup averaged just over 90 minutes to back up that document folder, which was the slowest result across the nearly 20 apps in our latest testing slate. Its spreadsheet and mixed-file results of 51 minutes 26 seconds and 73 minutes 46 seconds aren’t much better.
Strangely, Cobian Backup’s media average of 11 minutes and 12 seconds was actually one of the best in our tests, which indicates that this app performs better when working with a smaller number of large files than with many tiny files.
Cobian Reflector performed similarly in the media test, with an average of 13 minutes and 1 second. Its Document performance also improved with a mid-table average of 19 minutes 28 seconds. But its Excel and mixed folder performance didn’t improve when compared to the original Cobian Backup.
Both of Cobian’s apps are among the slowest in our tests, with media performance the only highlight. That bodes well for media backups, and it’s understandable given that these are free tools from a one-person software house.
Support
Perhaps understandably for free software, there are no extensive support options available for Cobian Backup or Reflector.
Emailing the developer directly is possible, but it’s unclear if that will bear much fruit since the project appears to be in limbo. If you’re having trouble with either of Cobian’s apps, your best bet is to head to the forum and ask the experts, even if the forum doesn’t seem to be particularly active.
Competition
Cobian Backup and Reflector are not the only free backup software options if you want to preserve your files: EaseUS ToDo Backup, FBackup and Paragon Backup & Recovery all have free versions too.
Those apps tend to have a similar set of features to Cobian’s tools, but they excel in other areas, with slicker user interfaces and more accessible support options – even if you sometimes have to see adverts when you use the apps.
They’re often faster, too. FBackup is faster with documents, spreadsheets and mixed files, but slower with media. EaseUS and Paragon’s apps are far quicker in every category.
Verdict
Cobian Backup and Reflector are straightforward, effective backup tools with every key option that home users and small businesses need, and their interfaces ensure that they are easy to use. The fact that they’re free can’t be discounted, either.
But if you’re after free backup tools, there are faster options elsewhere – Cobian’s tools only offer effective speed if you’re preserving media files. Those alternatives also have more accessible support options.
Cobian Backup and Reflector are effective free backup tools, especially if you’re more comfortable navigating old-school software designs that hark back to the earlier days of Windows. But other free rivals are faster, and Cobian’s uncertain future means it may not be the best long-term solution.