Velocity Micro's ProMagix HD80 is a powerhouse desktop computer aimed at professional users rather than gamers. That distinction matters, because gamers notoriously demand the best graphics available.
Unlike many of the best workstations I've seen, this computer does not boast the newest GPU; however, it does have a top-of-the-line CPU, ample RAM, solid storage, powerful ports, and effective cooling, making it an excellent machine for productivity and business work that may not be as graphics-intensive.
The GPU is no slouch, but it's not an RTX 5090, so don't expect to game at maximum settings. This is a work machine, and sometimes that distinction still matters, even in 2025, when the lines between work and home are more blurred than ever.
(Image credit: Collin Probst // Future )
Velocity Micro ProMagix HD80: Pricing and Availability
The Velocity Micro ProMagix HD80 is a pre-built computer from Velocity Micro, a company that specializes in building computers to specifications best suited for particular groups of users; in effect, a middleman connecting great computer hardware with consumers.
This model is meant for business productivity and is spec'd out to cost nearly $7,000. If you are interested in something like this, reach out to Velocity Micro or check out their website for their builds.
(Image credit: Collin Probst // Future )
Velocity Micro ProMagix HD80: Unboxing & first impressions
The ProMagix HD80 packaging is very well done. It's packed in the box tightly to prevent damage in transit. The box is massive, and since I knew what was inside, I made sure to buddy-lift this box, rather than try to muscle it up myself and risk dropping it. Once I had everything moved to where I needed it, I opened the box and set up the items.
The case build looks fantastic, even at first glance. If you're not familiar with Velocity Micro, the short version is that they specialize in building custom PCs for gaming and workstations. They also offer builds like this one, where they pre-select a great combination of gear to create a dream setup with a specific outcome in mind. That outcome, of course, is business productivity. Velocity Micro is known for its build quality, and looking over this PC, I can agree: the company has gone above and beyond in assembling this computer with precision, care, and intentionality.
The whole build is simple, but not in a cheap way. Velocity Micro intentionally omitted RGB lighting everywhere except the CPU cooler, as RGB is a pretty clear gaming aesthetic. Furthering the minimalist, understated design, a subtle Velocity Micro badge is visible on the glass panel, paying homage to the company that built this beast.
The ProMagix HD80 features a solid steel construction with mesh front and top panels, creating overall fantastic airflow for this machine. The internals of this PC feature clean routing and sleek, modular components that are easy to access, service, and upgrade in the future when needed.
The materials chosen are solid and intentionally understated. Velocity Micro understands that this machine isn't intended for gamers; it's built for professionals.
(Image credit: Collin Probst // Future )
Velocity Micro ProMagix HD80: In use
The ProMagix HD80 from Velocity Micro is designed for creative professionals, specifically those who use the Adobe suite, CAD tools such as Revit, Blender, and similar applications. The Intel Core Ultra 9 effortlessly handles any workload I throw at it, helped along by 64GB of RAM. For large files, the 6TB of storage works great, making it easy to keep even massive projects on this machine.
Another thing this machine is excellent for is coding. I was able to run large databases, huge Postman files, and large codebases without any issue. The only spot where I began to notice issues was with GPU-heavy tasks or programs. The RTX 4500 is not the most cutting-edge GPU, but considering the focus of this machine, I'm not too upset. Plus, if you wanted one of the brand-new RTX 5090s, you could pop one in here with hardly any work.
| Attributes | Notes | Rating |
| --- | --- | --- |
| Design | Sleek and minimal | ⭐⭐⭐⭐⭐ |
| Ease of use | Very easy to use | ⭐⭐⭐⭐⭐ |
| Practicality | Practical for some | ⭐⭐⭐⭐ |
| Price | Highly priced | ⭐⭐⭐⭐ |
(Image credit: Collin Probst // Future )
Velocity Micro ProMagix HD80: Final verdict
Velocity Micro's ProMagix HD80 is a beautifully crafted custom build, ideal for professionals who need a machine they know will perform even with intensive apps and programs. It's worth keeping in mind, though, that just because it's a powerful machine doesn't mean it's a gaming rig.
Alienware has a reputation among system-building enthusiasts for being, well, a little odd, a little out of the ordinary. As premium brands go, there's a certain air about it, a je ne sais quoi, so to speak. Iconic? Perhaps, but it also doesn't know quite what it wants to be. Whether it's a company dedicated to the teenage gamer from yesteryear or the modern-day millennial professional is still up for debate, and its products show that. None more so than the Alienware Area-51 (2025) I've been testing over the last few weeks.
Built for a gamer who's not interested in the finer details, yet equipped with enough hardware to simulate the sun, it has a professional workstation price tag but a physical appearance that'd be more at home in 2009 than 2025. It's got a top-tier spec sheet, yet lacks some of the fundamentals that would make it a more pro-grade workstation. In short, who it's for is kind of a mystery.
The unit I tested comes with an Intel Core Ultra 9 285K processor, Nvidia RTX 5090 GPU, 64GB of DDR5, and a 2TB PCIe 5.0 SSD, so this is not going to be a budget gaming PC, that's for sure. The configuration I tested comes in at $5,700 in the US (although that's with a 2TB PCIe 4.0 SSD, not the 5.0 SSD in review), £5,469 in the UK, and an obscene AU$12,320 in Australia. That is a staggering cost, particularly when you consider similar-spec'd DIY machines can be built for a lot less.
That, of course, comes with some fairly major caveats. If you do want to build your own machine, you have to know what you're doing, put in the research, and be comfortable buying all those individual parts and putting it all together. There is some justification for skipping all of that and picking one of these up. Particularly if you're dead set on the hardware and have the budget for it.
(Image credit: Future / Zak Storey)
What you get is otherworldly performance, as you'd imagine. Computational tasks absolutely crumble before the Area-51, and gaming, particularly at 4K, is outstanding with even the most intense titles out there landing north of 100 fps on average without AI upscaling or any of the more modern frame generation shenanigans. Slap those settings on top of that stock performance, and that RTX 5090 just runs away with it, hitting frame rates well into the 200s.
The Area-51 keeps its components nice and cool too, thanks to twin 180mm intake fans in the front, two 120mm fans in the PSU floor, plus a 360mm AIO in the roof, exhausting upwards. That's all backed up with a rather curious 1500W platinum-rated PSU delivering power to the lot.
Aside from the premium pricing, problems also arise when you begin to dig under the surface. The rear I/O on that custom Alienware motherboard is sub-tier at best, with an overreliance on USB-C and very little USB-A at all, and the rest of the I/O is equally lackluster, with minimal Ethernet support and little in the way of integrated HDMI/DisplayPort or other features all too commonplace on even the cheapest of modern-day Z890 boards.
Then there's the case itself. It's big, bulky, and far too heavy. The dimensions are massive, and on delivery, the entire thing weighs 88 lbs (or 40 kg), requiring either one strong PC gamer or a two-person team to lift it and chuck it on your desk. That's surprising given the exterior of the chassis feels particularly dull, mostly composed of an unemotive satin plastic, rather than the thick, girthy steel you'd likely expect, given the heft.
Yes, there are those signature Alienware curves and lines and enough RGB lighting dotted around to keep that 15-year-old kid in you happy, but it just lacks the finesse that the best gaming PCs of this price and caliber should come with, and that's a problem.
Alienware Area-51 (2025): Price & Availability
How much is it? Starting at $3,749.99 / £3,799 / AU$7,271
When is it available? You can pick one up today
Where can you get it? Directly from Dell's webstore
Let's be fair, we all knew the price tag for this thing was going to be ridiculous; after all, it's without a doubt Alienware's signature party trick and is loaded down with top-tier specs from the best graphics cards, best processors, best RAM, and best SSDs you can find on the market right now.
The Area 51 starts at $3,749.99 / £3,799 / AU$7,271 in the US, UK, and Australia, respectively, which is still pretty premium as far as gaming PCs go. That's especially true considering you're getting a Core Ultra 7 system with an RTX 5080, 32GB DDR5 (or even 16GB DDR5 in Australia), and fairly modest 1TB or 2TB storage. The RTX 5090 configuration I tested (with Core Ultra 9 285K, 64GB RAM, and 2TB PCIe SSD) comes in at $5,699.99 in the US, £5,469 in the UK, and a frightful AU$12,320 Down Under.
If I'm honest, we've not tested much like this PC at TechRadar to date, largely because of the RTX 5090 at its heart. And while it's an unabashed monster that delivers exceptional performance, compared to last gen's RTX 4090, it's seen a significant price increase—and that was before Nvidia's low availability and stock issues that it's suffered since its release.
Put this against the best price possible on a DIY rig, though, with the same component tier as my review unit, and the price difference is substantial. According to PC Part Picker, a DIY build would set you back just $4,842.91 in the US, £4,267.64 in the UK, or AU$9,914.60 in Australia. It's up to you whether you want to pay a roughly 18-28% premium to have a prebuilt system like this, but you can likely get the same performance for cheaper.
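That premium range is easy to verify from the figures above. The snippet below is a quick sanity check using the review's quoted prices (the prebuilt and PC Part Picker totals cited in this section, not live pricing):

```python
# Prebuilt vs. DIY totals quoted in the review (review-time figures, not live data).
prebuilt = {"US": 5699.99, "UK": 5469.00, "AU": 12320.00}
diy = {"US": 4842.91, "UK": 4267.64, "AU": 9914.60}

def premium_pct(prebuilt_price: float, diy_price: float) -> float:
    """Percentage extra paid for the prebuilt over the equivalent DIY total."""
    return (prebuilt_price - diy_price) / diy_price * 100

for region in prebuilt:
    print(f"{region}: {premium_pct(prebuilt[region], diy[region]):.1f}% premium")
# US comes out around 17.7%, the UK around 28.2%, and Australia around 24.3%,
# which matches the roughly 18-28% spread mentioned above.
```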
All configurations come with a custom 02JGX1 E-ATX Z890 motherboard and vary from there based on region. Additionally, all models feature a bespoke PSU, with the US and Australia starting with an 80+ Gold 850W power supply, while the UK only has an option for a 1500W 80+ Platinum PSU.
The US and Australia start with 1TB PCIe 4.0 SSD storage, while the UK starts with a larger 2TB PCIe 4.0 SSD. The US and Australia also start with a smaller 240mm AIO cooler, while the UK only has a 360mm AIO option.
There are also a total of eight fans included: two 180mm intakes, two 120mm intakes in the PSU floor, and three 120mm exhausts hidden above the topmost radiator.
Starting memory options come in the form of a dual-channel kit of Kingston Fury DDR5, ranging from 16GB all the way up to 64GB capacity, depending on your region.
When it comes to max spec configurations, there's not much difference between regions, other than the US maxing out at just one 4TB PCIe 4.0 SSD, while the UK and Australia come with two 4TB PCIe 4.0 SSDs for a total of 8TB of storage.
For the top-tier configurations, you get an Intel Core Ultra 9 285K CPU, an Nvidia RTX 5090 GPU, 64GB DDR5-6400 memory with XMP overclocking, a 1500W Platinum-rated PSU, and a 360mm AIO cooler.
| Component | Spec |
| --- | --- |
| Storage | 2TB PCIe NVMe SSD (PCIe 4.0 in the US, PCIe 5.0 in UK and Australia) |
| Cooling | 360mm AIO |
| PSU | 1500W 80+ Platinum |
The configuration I'm reviewing here is towards the higher end, featuring a Core Ultra 9 285K, RTX 5090, 64GB DDR5 RAM, and a 2TB SSD, though the closest US config to my review unit has a PCIe 4.0 SSD, rather than a PCIe 5.0. It also has a 360mm AIO cooler and the beefier 1500W PSU.
Specs: 4 / 5
Alienware Area-51 (2025): Design
(Image credit: Future / Zak Storey)
Oversized case leaves much to be desired
Internal industrial styling is intense
External shell a bit dull in the modern era
The Alienware Area-51 desktop is big. Seriously big. Its monstrous size will likely keep it off most desks. Even on my own test bench, which is three meters long and 60cm deep, it could easily hang off the edge if I had situated it the way I do my normal machine.
It's heavy too; that nearly 90 lbs/40kg weight is nothing to sneeze at. It's the kind of heft I'd expect from a custom liquid-cooled machine, not a pre-built system like this that's mostly composed of plastic and a single AIO cooler.
(Image credit: Future / Zak Storey)
The overall design is alright. It's got that Alienware chic, with the curves and the alien head logo on the front. Fonts are tidy, and cooling is for the most part well implemented across the board. The internal layout is massive, and there are QR codes littered everywhere for you to scan if you ever need a handy guide on how to update the graphics card or install new memory in the future. It still looks better suited to a launch a decade ago by modern styling standards.
(Image credit: Future / Zak Storey)
What's less impressive is Alienware cutting some corners to bring this machine to market. Cable management internally is less than stellar. There are no braided cables here, and although the rear of the case is tidy, neat, and well-managed, there's a lot of extra cable around the front jutting out that's quite unsightly, or it's bound together awkwardly, pushed into headers on that bespoke motherboard.
(Image credit: Future / Zak Storey)
There's even a massive chunk of metal strapped to the right-hand side of the GPU, solely to help cable-manage the 12VHPWR cable going into the RTX 5090, which not only feels massively overkill but also kind of doesn't work, as the cable is still draped along the top of it anyway. Yes, technically it also acts as an anti-sag bracket, and the RTX 5090 isn't the lightest of cards, but there are far better, more elegant solutions these days than a large block of rectangular metal. It's a real shame.
(Image credit: Future / Zak Storey)
Then there's that custom Z890 motherboard, the adorably named—hang on, let me check my notes here—ah yes, the 02JGX1. A bizarre-looking thing, complete with two DIMM slots for your RAM, three M.2 ports, and, well, that's about it.
It does look like Alienware has attempted to lean into that industrial aesthetic here, but there's so much exposed PCB, wireless cards, and ports just littered everywhere, combined with that poor cabling, it's seriously distracting.
(Image credit: Future / Zak Storey)
Power phase setup is fairly tame too (which directly translates to CPU performance, which I'll speak to momentarily), with a 14-phase design, and the rear I/O is practically nonexistent, with only a smattering of USB Type-A and far too many USB Type-C.
You get one Ethernet port and WiFi support (weirdly running as a passthrough all the way at the bottom of the chassis), and that's kind of it. No Clear CMOS, BIOS flashback, HDMI or DisplayPort out, and no 5.1 audio either.
I bring this up very specifically because this is a $5,700 machine, and motherboards at $235 deliver far more for far less.
Design: 3 / 5
Alienware Area-51 (2025): Performance
(Image credit: Future / Zak Storey)
Incredible 4K gaming framerates
PCIe 5.0 SSD rips
CPU relatively sluggish
Alienware Area-51 (2025) Benchmarks
Here's how the Alienware Area-51 (2025) performed in our suite of synthetic and gaming benchmarks:
When it comes to performance, the Area-51 delivers, although you'd expect it to with top-line specs like these.
Cinebench R24 performance was well into the two thousands, with an average multi-core score of 2,186 and an average single-core score of 136. That's not too shabby at all. In fact, the multi-core number works out to around 91 points per thread, making it wildly efficient. Similarly, Geekbench 6 also had a pretty good outing, with 21,786 points scored in multi-core and 3,148 in single-core performance.
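That per-thread figure is simple arithmetic, shown below as a quick back-of-envelope check. The thread count is the 285K's published spec (8 P-cores plus 16 E-cores, no Hyper-Threading), not something measured in this review:

```python
# Per-thread efficiency from the Cinebench R24 multi-core score quoted above.
multi_core_score = 2186
threads = 24  # Core Ultra 9 285K: 8 P-cores + 16 E-cores, one thread each

points_per_thread = multi_core_score / threads
print(f"{points_per_thread:.0f} points per thread")  # prints "91 points per thread"
```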
That SK Hynix PCIe 5.0 drive in my system, though, is the real winner, and although its sequentials were relatively low for a modern-day PCIe 5.0 drive (averaging just 12.3 GB/s on both read and write), the random 4K performance absolutely decimates pretty much every drive I've seen this year. Its random 4K read performance hit a relatively speedy 113 MB/s, with random writes coming in at 350 MB/s. You're going to see some seriously quick load times with this drive.
Temperatures and power draw were well within parameters, too, with the CPU topping out at around 92°C and the GPU at 75.8°C. Nothing out of the ordinary there. And while it is power-thirsty, the max power draw I saw during testing hit 840.8 W from the wall at peak, which isn't great, but isn't the worst either.
On the gaming side, in my 1080p testing, pretty much every title was well into the 120+ fps range or higher without the help of DLSS or Frame Gen, and at 4K, I saw upwards of 150 fps in Total War: Warhammer III's Battle benchmark, while Cyberpunk 2077 averaged 57 fps at 4K with ray tracing and no DLSS support at all.
The only mild problem I have with this setup is the slight discrepancy against an equally kitted-out rig I built earlier this year. Complete with an RTX 5090 plus Intel Core Ultra 9 285K, my own DIY rig beat out the Area-51 in practically every graphical and computational test.
At a guess, this is likely down to that CPU performance being heavily limited by the 14-phase VRM design, as it just couldn't produce enough juice to keep that Ultra 9 running at full speed for as long as its DIY counterpart.
The odd element about that, though, is that the DIY machine featured only a $220 Gigabyte motherboard, yet its chip temps sat at a stable 100°C and its performance, both computational and in-game, was around 5-10% higher, depending on the test. For a system that's $1,500 cheaper, that's not a good look.
Performance: 4 / 5
Should You Buy The Alienware Area-51 (2025)?
Alienware Area-51 (2025) Scorecard
| Category | Verdict | Score |
| --- | --- | --- |
| Value | This isn't a cheap gaming PC, not one bit. Unless you can justify the cost, or it saves you time in some manner, you'd be far better off building your own. | 3 / 5 |
| Specs | With the right config, you can easily get the best of the best hardware on the market right now, and you can upgrade it later, if you've got any budget left over. | 4 / 5 |
| Design | A design straight out of the 2010s; there are numerous fumbles here that let down an otherwise stellar spec sheet. | 3 / 5 |
| Performance | Unsurprisingly, with that top-tier hardware, it absolutely dominates 4K gaming and any task you can throw at it. | 4 / 5 |
| Total | Big, bold, and a bit brash. It delivers on the performance front, but with mediocre styling, average build quality, and an insane price, it doesn't quite hit the mark. | 3.5 / 5 |
Buy the Alienware Area-51 If…
You need to save time If you're not interested in building your own machine but want the best hardware, there's no denying this is a good pick, and it's easily upgradable long-term.
You have the desk space for it It's massive; the case is seriously long, and it's heavy enough that you'll need help just getting it on your desk.
Don't buy it if...
You want the best value A similarly kitted-out gaming PC, built yourself, can save you a lot of money.
You're looking for something a little more stylish Alienware has a style you'll either love or hate, but if you're after something with sharp lines and modern flair, then aside from the interior, you might want to look elsewhere.
Alienware has a reputation among system-building enthusiasts for being, well, a little odd, a little out of the ordinary. As premium brands go, there's a certain air about it, a je ne sais quoi, so to speak. Iconic? Perhaps, but it also doesn't know quite what it wants to be. Whether it's a company that's dedicated to the teenage gamer from yesteryear or the modern-day millennial professional is still up for debate, and its products show that. None more so than the Alienware Area-51 (2025), I've been testing over the last few weeks.
Built for a gamer who's not interested in the finer details, yet equipped with enough hardware to simulate the sun, it has a professional workstation price tag but a physical appearance that'd be more at home in 2009 than 2025. It's got a top-tier spec sheet, yet lacks some of the fundamentals that would make it a more pro-grade workstation. In short, who it's for is kind of a mystery.
The unit I tested comes with an Intel Core Ultra 9 285K processor, Nvidia RTX 5090 GPU, 64GB of DDR5, and a 2TB PCIe 5.0 SSD, so this is not going to be a budget gaming PC, that's for sure. The configuration I tested comes in at $5,700 in the US (although that's with a 2TB PCIe 4.0 SSD, not the 5.0 SSD in review), £5,469 in the UK, and an obscene AU$12,320 in Australia. That is a staggering cost, particularly when you consider similar-spec'd DIY machines can be built for a lot less.
That, of course, comes with some fairly major caveats. If you do want to build your own machine, you have to know what you're doing, put in the research, and be comfortable buying all those individual parts and putting it all together. There is some justification for skipping all of that and picking one of these up. Particularly if you're dead set on the hardware and have the budget for it.
(Image credit: Future / Zak Storey)
What you get is otherworldly performance, as you'd imagine. Computational tasks absolutely crumble before the Area-51, and gaming, particularly at 4K, is outstanding with even the most intense titles out there landing north of 100 fps on average without AI upscaling or any of the more modern frame generation shenanigans. Slap those settings on top of that stock performance, and that RTX 5090 just runs away with it, hitting frame rates well into the 200s.
The Area-51 keeps its components nice and cool too, thanks to twin 180mm intake fans in the front, two 120mm fans in the PSU floor, plus a 360mm AIO in the roof, exhausting upwards. That's all backed up with a rather curious 1500W platinum-rated PSU delivering power to the lot.
Aside from the premium pricing, problems also arise when you begin to dig under the surface. The rear I/O on that custom Alienware motherboard is sub-tier at best, with an overreliance on USB-C and very little USB-A at all, and the rest of the I/O is equally as lackluster, with minimal ethernet support and little in the way of integrated HDMI/DisplayPort or other features all too commonplace on even the cheapest of modern-day Z890 boards.
Then there's the case itself. It's big, bulky, and far too heavy. The dimensions are massive, and on delivery, the entire thing weighs 88 lbs (or 40 kg), requiring either one strong PC gamer or a two-person team to lift it and chuck it on your desk. That's surprising given the exterior of the chassis feels particularly dull, mostly composed of an unemotive satin plastic, rather than the thick, girthy steel you'd likely expect, given the heft.
Yes, there are those signature Alienware curves and lines and enough RGB lighting dotted around to keep that 15-year-old kid in you happy, but it just lacks the finesse that the best gaming PCs of this price and caliber should come with, and that's a problem.
Alienware Area-51 (2025): Price & Availability
How much is it? Starting at $3,749.99 / £3,799 / AU$7,271
When is it available? You can pick one up today
Where can you get it? Directly from Dell's webstore
Let's be fair, we all knew the price tag for this thing was going to be ridiculous; after all, it's without a doubt Alienware's signature party trick and is loaded down with top-tier specs from the best graphics cards, best processors, best RAM, and best SSDs you can find on the market right now.
The Area 51 starts at $3,749.99 / £3,799 / AU$7,271 in the US, UK, and Australia, respectively, which is still pretty premium as far as gaming PCs go. That's especially true considering you're getting a Core Ultra 7 system with an RTX 5080, 32GB DDR5 (or even 16GB DDR5 in Australia), and fairly modest 1TB or 2TB storage. The RTX 5090 configuration I tested (with Core Ultra 9 285K, 64GB RAM, and 2TB PCIe SSD) comes in at $5,699.99 in the US, £5,469 in the UK, and a frightful AU$12,320 Down Under.
If I'm honest, we've not tested much like this PC at TechRadar to date, largely because of the RTX 5090 at its heart. And while it's an unabashed monster that delivers exceptional performance, compared to last gen's RTX 4090, it's seen a significant price increase—and that was before Nvidia's low availability and stock issues that it's suffered since its release.
Put this against the best price possible on a DIY rig, though, with the same component tier as my review unit, and the price difference is substantial. According to PC Part Picker, a DIY build would set you back just $4,842.91 in the US, £4,267.64 in the UK, or AU$9,914.60 in Australia. It's up to you whether you want to pay a roughly 18-28% premium to have a prebuilt system like this, but you can likely get the same performance for cheaper.
All configurations come with a custom 02JGX1 E-ATX Z890 motherboard and vary from there based on region. Additionally, all models feature a bespoke PSU, with the US and Australia starting with an 80+ Gold 850W power supply, while the UK only has an option for a 1500W 80+ Platinum PSU.
The US and Australia start with 1TB PCIe 4.0 SSD storage, while the UK starts with a larger 2TB PCIe 4.0 SSD. The US and Australia also start with a smaller 240mm AIO cooler, while the UK only has a 360mm AIO option.
There are also a total of eight fans included: two 180mm intakes, two 120mm intakes in the PSU floor, and three 120mm exhausts hidden above the topmost radiator.
Starting memory options come in the form of a dual-channel kit of Kingston Fury DDR5, ranging from 16GB all the way up to 64GB capacity, depending on your region.
When it comes to max spec configurations, there's not much difference between regions, other than the US maxing out at just one 4TB PCIe 4.0 SSD, while the UK and Australia come with two 4TB PCIe 4.0 SSDs for a total of 8TB of storage.
For the top-tier configurations, you get an Intel Core Ultra 9 285K CPU, an Nvidia RTX 5090 GPU, 64GB DDR5-6400 memory with XMP overclocking, a 1500W Platinum-rated PSU, and a 360mm AIO cooler.
2TB PCIe NVMe SSD (PCIe 4.0 in the US, PCIe 5.0 in UK and Australia)
Cooling
360mm AIO
PSU
1500W 80+ Platinum
The configuration I'm reviewing here is towards the higher end, featuring a Core Ultra 9 285K, RTX 5090, 64GB DDR5 RAM, and a 2TB SSD, though the closest US config to my review unit has a PCIe 4.0 SSD, rather than a PCIe 5.0. It also has a 360mm AIO cooler and the beefier 1500W PSU.
Specs: 4 / 5
Alienware Area-51 (2025): Design
(Image credit: Future / Zak Storey)
Oversized case leaves much to be desired
Internal industrial styling is intense
External shell a bit dull in the modern era
The Alienware Area-51 desktop is big. Seriously big. Its monstrous size will likely keep it off most desks. Even on my own test bench, at three meters long and 60cm deep, it could easily hang off the edge if I had situated it like I do my normal machine.
It's heavy too; that nearly 90 lbs/40kg weight is nothing to snort at—it's the kind of heft I'd expect from a custom liquid-cooled machine, not a pre-built system like this that's mostly comprised of plastic and a single AIO cooler.
(Image credit: Future / Zak Storey)
The overall design is alright. It's got that Alienware chic, with the curves and the alien head logo on the front. Fonts are tidy, and cooling is for the most part well implemented across the board. The internal layout is massive, and there are QR codes littered everywhere for you to scan if you ever need a handy guide on how to update the graphics card or install new memory in the future. It still looks better suited to a launch a decade ago by modern styling standards.
(Image credit: Future / Zak Storey)
What's less impressive is Alienware cutting some corners to bring this machine to market. Cable management internally is less than stellar. There are no braided cables here, and although the rear of the case is tidy, neat, and well-managed, there's a lot of extra cable around the front jutting out that's quite unsightly, or it's bound together awkwardly, pushed into headers on that bespoke motherboard.
(Image credit: Future / Zak Storey)
There's even a massive chunk of metal strapped to the right-hand side of the GPU, solely to help cable-manage that 12VHPWR cable going into the RTX 5090, which not only feels massively overkill but also kind of doesn't work, as it's still draped along the top of it anyway. I mean, yes, technically it is acting as an anti-sag bracket as well here. The RTX 5090 isn't the lightest of cards out there, but there are so many better, more elegant solutions out there these days than just a large block of rectangular metal. It's a real shame.
(Image credit: Future / Zak Storey)
Then there's that custom Z890 motherboard, the adorably named—hang on, let me check my notes here—ah yes, the 02JGX1. A bizarre-looking thing, complete with two DIMM slots for your RAM, three M.2 ports, and, well, that's about it.
It does look like Alienware has attempted to lean into that industrial aesthetic here, but there's so much exposed PCB, wireless cards, and ports just littered everywhere, combined with that poor cabling, it's seriously distracting.
(Image credit: Future / Zak Storey)
Power phase setup is fairly tame too (which directly translates to CPU performance, which I'll speak to momentarily), with a 14-phase design, and the rear I/O is practically nonexistent, with only a smattering of USB Type-A and far too many USB Type-C.
You get one Ethernet port and WiFi support (weirdly running as a passthrough all the way at the bottom of the chassis), and that's kind of it. No Clear CMOS, BIOS flashback, HDMI or DisplayPort out, and no 5.1 audio either.
I bring this up very specifically because this is a $5,700 machine, and motherboards at $235 deliver far more for far less.
Design: 3 / 5
Alienware Area-51 (2025): Performance
(Image credit: Future / Zak Storey)
Incredible 4K gaming framerates
PCIe 5.0 SSD rips
CPU relatively sluggish
Alienware Area-51 (2025) Benchmarks
Here's how the Alienware Area-51 (2025) performed in our suite of synthetic and gaming benchmarks:
When it comes to performance, the Area-51 delivers, although you probably expect it to with top-line specs like it has.
Cinebench R24 performance was well into the two thousands, with a multi-core count of 2,186, on average, and an average single-core score of 136. That's not too shabby at all. In fact, the multi-score number is around 91 points per thread, making it wildly efficient. Similarly, Geekbench 6 also had a pretty good outing, with 21,786 points scored on the multi-core and 3,148 in single-core performance.
That SK Hynix PCIe 5.0 drive in my system, though, is the real winner, and although its sequentials were relatively low for a modern-day PCIe 5.0 drive (averaging just 12.3 GB/s on both read and write), the random 4K performance absolutely decimates pretty much every drive I've seen this year. Its random 4K read performance hit a relatively speedy 113 MB/s, with random writes coming in at 350 MB/s. You're going to see some seriously quick load times with this drive.
Temperatures and power draw were well within parameters, too, with the CPU topping out at around 92°C and the GPU at 75.8°C. Nothing out of the ordinary there. And while it is power-thirsty, the max power draw I saw during testing hit 840.8 W from the wall at peak, which isn't great, but isn't the worst either.
On the gaming side, in my 1080p testing, pretty much every title was well into the 120+ fps range or higher without the help of DLSS or Frame Gen, and a 4K, I saw upwards of 150 fps in Total War: Warhammer III's Battle benchmark, while Cyberpunk 2077 averaged 57 fps at 4K with ray tracing and no DLSS support at all.
The only mild problem I have with this setup is the slight discrepancy against an equally kitted-out rig I built earlier this year. Complete with an RTX 5090 plus Intel Core Ultra 9 285K, my own DIY rig beat out the Area-51 in practically every graphical and computational test.
At a guess, this is likely down to that CPU performance being heavily limited by the 14-phase VRM design, as it just couldn't produce enough juice to keep that Ultra 9 running at full speed for as long as its DIY counterpart.
The odd element about that, though, is that the DIY machine only featured a $220 Gigabyte motherboard, yet temps were at 100°C for the chip (and stable), while performance, both computational and in-game, was around 5-10% higher, depending on the test. For a system that's $1,500 cheaper, that's not a good look.
Performance: 4 / 5
Should You Buy The Alienware Area-51 (2025)?
Alienware Area-51 (2025) Scorecard
Category
Verdict
Score
Value
This isn't a cheap gaming PC, not one bit. Unless you can justify the cost, or it saves you time in some manner, you'd be far better off building your own.
3 / 5
Specs
With the right config, you can easily get the best of the best hardware on the market right now, and you can upgrade it later—if you've got any budget left over.
4 / 5
Design
A design straight out of the 2010s; there are numerous fumbles here that let down an otherwise stellar spec sheet.
3 / 5
Performance
Unsurprisingly, with that top-tier hardware, it absolutely dominates 4K gaming and any task you can throw at it.
4 / 5
Total
Big, bold, and a bit brash. It delivers on the performance front, but with mediocre styling, average build quality, and an insane price, it doesn't quite hit the mark.
3.5 / 5
Buy the Alienware Area-51 If…
You need to save time
If you're not interested in building your own machine but want the best hardware, there's no denying this is a good pick, and easily upgradable long-term.
You have the desk space for it
It's massive; the case is seriously long, and it's heavy enough that you'll need help just getting it on your desk.
Don't buy it if...
You want the best value
A similarly kitted-out gaming PC, built yourself, can save you a lot of money.
You're looking for something a little more stylish
Alienware has a style you'll either love or hate, but if you're after something with sharp lines and modern flair, then aside from the interior, you might want to look elsewhere.
This review first appeared in issue 340 of PC Pro.
HP’s acquisition of Poly in August 2022 gives it a strong presence in the hybrid working market with access to a fine range of VC products. Poly retains its name for now, and the Studio R30 on review aims to offer SMBs an affordable all-in-one solution for small conference spaces.
At first glance, the R30 looks very similar to Poly’s P15 video bar, but its chassis is slightly larger, the central 4K UHD camera has a much wider diagonal 120° field of view (FoV) and its digital zoom steps up from 4x to 5x. Internally, things remain the same: it has an 8W internal speaker, while a triple-microphone beamforming array provides speaker tracking and automatic framing.
The Studio R30 employs Poly’s NoiseBlockAI and Acoustic Fence technologies to identify and remove annoying background noises such as keyboard heavy hitters. It goes a step further as Poly’s new DirectorAI feature ensures no meeting participant feels left out by providing automated group, people and speaker framing, along with presenter tracking.
The central 4K UHD camera has a wide 120° field of view (Image credit: Future)
Rear-mounted ports include an external PSU connector and a USB Type-C port for host connection, with the kit including a generous five-meter cable. Two USB Type-A ports allow the R30 to function as a USB hub but, unlike the P15’s mechanical privacy shutter on its lens, the R30 gets only a cheap rubber cap.
Installation is swift. We connected the R30 to a Windows 10 PC and watched it load the camera and audio device drivers in a few seconds. You’ll want to add Poly’s free Lens Desktop app as this provides a firmware upgrade tool along with local access for manually adjusting audio and video functions and selecting a framing mode.
Using the app to link up with our Lens cloud account, we could remotely manage the R30, change its settings from the portal and use the inventory service to see its physical location. Use the app to connect the R30 to a wireless network and it will link up with a remote Poly provisioning server for pushing custom settings to it.
Rear ports include a USB-C, two USB-A and an external PSU connector (Image credit: Future)
During meetings, we found Poly’s tracking and framing features worked very well, with the camera snapping to the current speaker, zooming back out when they stopped talking and moving effortlessly to other speakers. In presenter mode, the R30 easily kept track of us as we moved around our meeting room while we spoke, with shift delays of around two to three seconds.
The speaker delivers a clean soundscape, and in our 24m² room we found a volume level of 75% was sufficient to cover all areas. The integral mics also impressed, with remote meeting participants saying they could hear us clearly at distances of up to three meters.
In a direct comparison with the lab’s Studio P15, we found the R30’s wider FoV clearly provides greater horizontal coverage. It didn’t suffer from the P15’s slightly soft focus and presented a sharper, cleaner picture with a more natural color balance, while its backlight compensation coped better with bright sunlight.
Video quality is good and the Poly Lens service enables remote management (Image credit: Future)
The camera’s digital pan, tilt and zoom (PTZ) functions can be manually controlled from the Lens app but only when auto-tracking is disabled. Although currently in a preview testing phase, you can also select a conversation mode to display two speakers in a split screen, while people framing shows all participants using up to six split screens.
Poly’s Studio R30 offers SMBs an affordable and easy-to-use 4K videoconferencing solution. Video and audio quality are very good and its clever people-tracking and framing features add that all-important professional touch to your meetings.
XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold: Two-minute review
The XPG Core Reactor II 1200W ATX 3.0 80 Plus Gold PSU is the company's latest compelling mid-range component for builders, and it follows on from the success of the more premium-tier Platinum-certified Cybercore II series of power supplies.
The Core Reactor II series, then, which covers the spectrum from 650W units up to this, its highest-wattage 1200W PSU, leads this venture, showcasing XPG's ability to strike a crucial balance between performance, quality, and cost. The series is designed for users who want reliable performance without splurging, but who also need enough wattage to drive the best gaming PCs you can build.
High-end motherboards, processors, and graphics cards demand serious power, and so the Core Reactor II 1200W, an 80 Plus Gold certified unit, stands out for its practical design and consistent performance. It represents XPG's commitment to affordable quality and aims to meet the diverse needs of mid-range computing environments.
In terms of packaging, the Core Reactor II 1200W PSU comes in a robust, visually appealing box, complete with essential accessories like mounting screws, an AC power cable, and decorative stickers. The PSU itself is a blend of aesthetics and functionality, featuring a sleek matte black finish with embossed geometric patterns and a geometric fan cutout. Its 160mm length slightly exceeds conventional ATX size, but it is short enough to ensure compatibility with the best PC cases with ATX compliance.
(Image credit: Future / John Loeffler)
The front of the PSU is minimalist, housing only the on/off switch and AC receptacle, while the rear is thoughtfully designed for easy and accurate cable connections. The unit's modular cable system includes an array of uniformly black cables, with most being neatly sleeved.
Performance-wise, the Core Reactor II 1200W PSU aligns with its 80 Plus Gold certification, demonstrating commendable efficiency and thermal management. The fan operates optimally, maintaining reasonable internal temperatures, even at significant power output and under various testing conditions.
After running the XPG Core Reactor II in my main workstation at the office under some pretty heavy loads, the fan stayed mostly quiet and the temperature stayed well below its rated operating temperature without issue. This PSU is rated for operation at an ambient temperature of 50°C, a testament to its robustness and reliability, especially in demanding environments.
(Image credit: Future / John Loeffler)
Electrical performance is a highlight, with the primary 12V rail showing impressive regulation and effective voltage filtering. The PSU also passes tests for primary protections like Over Current, Over Voltage, Over Power, and Short Circuit, ensuring reliable performance.
In terms of internals, the Core Reactor II combines a robust build quality with a unique design, incorporating high-grade 105°C Japanese capacitors for enhanced reliability and durability. The PSU excels in power quality, achieving good energy conversion efficiency and maintaining steady efficiency across most load ranges. Its thermal management is effective, with the fan adjusting speed according to the load, ensuring efficient cooling while keeping noise levels minimal.
That said, if you're pushing this unit hard, such as with overclocking or loading up on add-in cards, it can get a bit loud as the load nears 100%, though never so much as to be bothersome.
It's a PSU that anyone shopping the midrange should add to their component shortlists.
(Image credit: Future / John Loeffler)
XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold: Price & availability
How much does it cost? $204.99 (about £165 / AU$290)
When is it available? Available now
Where can you get it? Available in the US. UK and Australia availability is spotty
The XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold is currently priced at $204.99 in the US and backed by a 10-year warranty, making the Core Reactor II 1200W PSU a good value for its performance. As far as midrange PSUs go, this one is positioned as an appealing option for those seeking a balance between cost, efficiency, and reliability.
While it might not have a platinum rating, its performance is more than enough for most builders out there who need to run some high-powered components like the best processors and best graphics cards for gaming or content creation, without worrying about running hot at all hours under heavy industrial-grade workloads.
XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold: Specs
Should you buy the XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold?
Buy the XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold if...
You want a high-powered PSU for a decent price
For the price you're paying, this is one of the highest-wattage 80 Plus Gold-rated ATX 3.0 power supplies going.
You want a modular PSU
As a modular PSU, cable management is much easier when you only use what you need.
Don't buy it if...
You need something more heavy-duty
While 80 Plus Gold-rated is fantastic, if you need something more robust for a heavy-duty workstation, you might want to check out the Cybercore II Platinum-rated PSUs from XPG.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Intel Core i5-14600K is not the kind of processor you're really going to want to upgrade to, despite technically offering the best value of any processor I've tested.
First, the good. This is one of the best processor values you're going to find on the market, no matter what happens with the price of its predecessor. Currently, it has the best performance for its $319 price tag (about £255/AU$465), and AMD's competing Ryzen 5 7600X isn't all that close. If you're looking to get the most bang for your buck today, then the Intel Core i5-14600K is it.
In terms of performance, this isn't a bad chip at all; I'd even say it's a great one if you take its predecessor out of the running, which will inevitably happen as its last remaining stock gets bought up. It doesn't have the performance of the Intel Core i7-14700K, but that's a workhorse chip, not the kind that's meant to power the best computers for the home or the best budget gaming PCs as these chips start making their way into prebuilt systems in the next couple of months.
For a family computer or one that's just meant for general, everyday use, this chip is more than capable of handling whatever you'll need it for. It can even handle gaming fairly well thanks to its strong single-core performance. So, on paper at least, the Core i5-14600K is the best Intel processor for the mainstream user as far as performance goes.
The real problem with the i5-14600K is that its performance is tragically close to the Core i5-13600K's. And even though the MSRP of the Intel Core i5-13600K is technically higher than that of the Core i5-14600K, it's not going to remain that way for very long at all.
The real problem with the i5-14600K, and one that effectively sinks any reason to buy it, is that its performance is tragically close to the Core i5-13600K's.
As long as the i5-13600K is on sale, it will be the better value, and you really won't even notice a difference between the two chips in terms of day-to-day performance.
That's because there's no difference between the specs of the 14600K vs 13600K, other than a slightly faster turbo clock speed for the 14600K's six performance cores.
While this does translate into some increased performance, it comes at the cost of higher power draw and temperature. During testing, this chip hit a maximum temperature of 101°C, which is frankly astounding for an i5. And I was using one of the best CPU coolers around, the MSI MAG Coreliquid E360 AIO, which should be more than enough to keep the temperature in check to prevent throttling.
(Image credit: Future / Infogram)
Looking at the chip's actual performance, the Core i5-14600K beats the AMD Ryzen 5 7600X and the Intel Core i5-13600K in single-core performance, multi-core performance, and productivity workloads, on average. Other than its roughly 44% better average multi-core performance against the Ryzen 5 7600X, the Core i5-14600K is within 3% to 4% of its competing chips.
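For anyone curious how these "X% better" figures are derived, they're simple ratios of averaged benchmark scores. A minimal sketch, using hypothetical score values rather than the raw test data:

```python
def pct_faster(score_a, score_b):
    """How much faster chip A is than chip B, in percent (higher-is-better scores)."""
    return (score_a / score_b - 1.0) * 100.0

# Hypothetical averaged multi-core scores, for illustration only:
i5_14600k, ryzen5_7600x = 1730, 1200
print(f"{pct_faster(i5_14600k, ryzen5_7600x):.0f}% better")  # → 44% better
```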
(Image credit: Future / Infogram)
In creative workloads, the Core i5-14600K again manages to outperform the Ryzen 5 7600X by about 31% on average, but it's just 2.4% better than its predecessor, and none of these chips are especially great at creative content work. If you're messing around with family albums or cutting up TikTok videos, any one of these chips could do that fairly easily. For heavier-duty workloads like video encoding and 3D rendering, the Intel chips hold up better than the mainstream Ryzen 5, but these chips really aren't practical for that purpose.
(Image credit: Future / Infogram)
On the gaming front, it's more of the same, though now at least the Ryzen 5 7600X is back in the mix. Overall, the Core i5-14600K beats its 13th-gen predecessor and AMD's rival chip by about 2.1% and 3.2% respectively.
(Image credit: Future / Infogram)
All of this comes at the cost of higher power draw and hotter CPU temperatures, though, which isn't good, especially for getting so little in return. What you really have here is an overclocked i5-13600K, and you can do that yourself and save some money by buying the 13600K when it goes on sale, which it will.
(Image credit: Future / John Loeffler)
Intel Core i5-14600K: Price & availability
How much does it cost? US MSRP $319 (about £255/AU$465)
When is it out? October 17, 2023
Where can you get it? You can get it in the US, UK, and Australia
The Intel Core i5-14600K is available in the US, UK, and Australia as of October 17, 2023, for an MSRP of $319 (about £255/AU$465).
This is a slight $10 price drop from its predecessor, which is always a good thing, and it comes in about $20 (about £15/AU$30) more than the AMD Ryzen 5 7600X, so it's fairly middle of the pack price-wise.
In terms of actual value, as it goes to market, this chip has the highest performance for its price of any chip in any product tier, but only by a thin margin, and one that is sure to fall very quickly once the price on the 13600K drops by even a modest amount.
Intel Core i5-14600K: Specs
Intel Core i5-14600K: Verdict
Best performance for the price of any chip tested...
...but any price drop in the Core i5-13600K will put the 14600K in second place
Not really worth upgrading to with the Core i7-14700K costing just $90 more
(Image credit: Future / Infogram)
Ultimately, the market served by this chip specifically is incredibly narrow, and like the rest of the Raptor Lake Refresh line-up, this is the last hurrah for the Intel LGA 1700 socket.
That means if you go out and buy a motherboard and CPU cooler just for the 14th-gen, it's a one-time thing, since another generation on this platform isn't coming. It doesn't make sense to do that, so if you're upgrading from anything earlier than the 12th-gen, it just makes much more sense to wait for Meteor Lake to land in several months' time and possibly get something really innovative.
If you're on a 12th-gen chip and you can't wait for Meteor Lake next year, the smartest move is to buy the i7-14700K instead, which at least gives you i9-13900K-levels of performance for just $90 more than the i5-14600K.
Ultimately, this chip is best reserved for prebuilt systems like the best all-in-one computers at retailers like Best Buy, where you will use the computer for a reasonable amount of time, and then when it becomes obsolete, you'll go out and buy another computer rather than attempt to upgrade the one you've got.
In that case, buying a prebuilt PC with an Intel Core i5-14600K makes sense, and for that purpose, this will be a great processor. But if you're looking to swap out another Intel LGA 1700 chip for this one, there are much better options out there.
Should you buy the Intel Core i5-14600K?
Buy the Intel Core i5-14600K if...
Don't buy it if...
Also Consider
If my Intel Core i5-14600K review has you considering other options, here are two processors to consider...
How I tested the Intel Core i5-14600K
I spent nearly two weeks testing the Intel Core i5-14600K
I ran comparable benchmarks between this chip and rival midrange processors
I gamed with this chip extensively
Test System Specs
These are the specs for the test system used for this review:
I spent about two weeks testing the Intel Core i5-14600K and its competition, primarily for productivity work, gaming, and content creation.
I used a standard battery of synthetic benchmarks that tested out the chip's single-core, multi-core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops.
I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.
The Intel Core i7-14700K is the workhorse CPU in Intel's 14th-generation launch line-up, and like any good workhorse, it's going to be the one to do the heavy lifting for this generation of processors. Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.
Of all the chips launched on October 17, 2023, the Core i7-14700K is the only one to get a substantive spec upgrade over its predecessor as well as a slight cut in price to just $409 (about £325/AU$595), which is $10 less than the Intel Core i7-13700K it replaces.
So what do you get for $10 less? Gen-on-gen, you don't get a whole lot of improvement (about 6% better performance overall compared to the 13700K), but that figure can be deceiving, since the Core i7-13700K was at the top of our best processor list for a reason.
With the 13700K's performance being within striking distance of the Intel Core i9-13900K, that 6% improvement for the 14700K effectively closes the gap, putting the 14700K within just 3% of the 13900K overall, and even allowing it to pull ahead in average gaming performance, losing out to only the AMD Ryzen 7 7800X3D.
Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.
Given its excellent mix of performance and price, the Intel Core i7-14700K could very well be the last Intel chip of the LGA 1700 epoch that anyone should consider buying, especially if you're coming from a 12th-gen chip.
With the Core i9-13900K outperforming the Intel Core i9-12900K by as much as 25% in some workloads, someone coming off an i9-12900K or lower will find it hard to believe that an i7 could perform this well, but that's where we're at. And with the i7-14700K coming in about 30% cheaper than the Intel Core i9-14900K, while still managing to come remarkably close in terms of its performance, the Intel Core i7-14700K is the Raptor Lake Refresh chip to buy if you're going to buy one at all.
(Image credit: Future / John Loeffler)
Intel Core i7-14700K: Price & availability
How much does it cost? US MSRP $409 (about £325/AU$595)
When is it out? October 17, 2023
Where can you get it? You can get it in the US, UK, and Australia
The Intel Core i7-14700K is available on October 17, 2023, with a US MSRP of $409 (about £325/AU$595), which is a slight decrease from its predecessor's MSRP of $419 (about £335/AU$610), and about 31% lower than the Intel Core i9-14900K and 32% lower than the AMD Ryzen 9 7950X.
It's also cheaper than the AMD Ryzen 7 7800X3D, and just $10 more expensive than the AMD Ryzen 7 7700X, making it very competitively priced against processors in its class.
The comparisons against the Core i9 and Ryzen 9 are far more relevant, however, since these are the chips that the Core i7-14700K is competing against in terms of performance, and in that regard, the Intel Core i7-14700K is arguably the best value among consumer processors currently on the market.
Price score: 4 / 5
Intel Core i7-14700K: Specs & features
Four additional E-Cores
Slightly faster clock speeds
Increased Cache
Discrete Wi-Fi 7 and Thunderbolt 5 support
The Intel Core i7-14700K is the only processor from Intel's Raptor Lake Refresh launch line-up to get a meaningful spec upgrade.
Rather than the eight performance and eight efficiency cores like the i7-13700K, the i7-14700K comes with eight performance cores and 12 efficiency cores, all running with a slightly higher turbo boost clock for extra performance. The i7-14700K also has something called Turbo Boost Max Technology 3.0, which is a mouthful but also gives the best performing P-core an extra bump up to 5.6GHz so long as the processor is within power and thermal limits.
The increased core count also adds 7MB of additional L2 cache for the efficiency cores to use, further improving their performance over the 13700K's, as well as four additional processing threads for improved multitasking.
It has the same TDP of 125W and same Max Turbo Power rating of 253W as the 13700K, with the latter being the upper power limit of sustained (greater than one second) power draw for the processor. This ceiling can be breached, however, and processing cores can draw much more power in bursts as long as 10ms when necessary.
There is also support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as well as discrete Thunderbolt 5 wired connections, so there is a decent bit of future proofing in its specs.
Chipset & features score: 4 / 5
(Image credit: Future / John Loeffler)
Intel Core i7-14700K: Performance
Outstanding performance on par with the i9-13900K
Best gaming performance of any Intel processor
More power-hungry than its predecessor, so it also runs hotter
The Intel Core i7-14700K is arguably the best performing midrange processor on the market, coming within striking distance of the Core i9-13900K and Ryzen 9 7950X across most workloads, including very strong multi core performance thanks to the addition of four extra efficiency cores.
(Image credit: Future / Infogram)
The strongest synthetic benchmarks for the 14700K are single-core workloads, where it sits effectively level with the Core i9-13900K and often beats the Ryzen 9 7950X and 7950X3D chips handily.
This translates into better dedicated performance, rather than multitasking, but even there the Core i7-14700K does an admirable job keeping pace with chips with much higher core counts.
(Image credit: Future / Infogram)
In creative workloads, the 14700K also performs exceptionally well, beating out the 13900K on everything except 3D model rendering, a task rarely given to a CPU anyway, since even the best cheap graphics cards can process Blender or V-Ray 5 workloads many times faster than even the best CPU can.
(Image credit: Future / Infogram)
In gaming performance, the Core i7-14700K scores a bit of an upset over its launch sibling, the i9-14900K, besting it overall, though it has to be said that it got some help from a ridiculously high average fps in Total War: Warhammer III's battle benchmark. In most cases, the i7-14700K came up short of the 13900K and 14900K, but not by much.
And while it might be tempting to write off Total War: Warhammer III as an outlier, one of the biggest issues with the Core i9s post-Alder Lake is that they are energy hogs and throttle under load quickly, pretty much by design.
In games like Total War: Warhammer III where there are a lot of tiny moving parts to keep track of, higher clock speeds don't necessarily help. When turbo clocks kick into high gear and cause throttling, the back-and-forth between throttled and not-throttled can be worse over the course of the benchmark than the cooler but consistent Core i7s, which don't have to constantly ramp up and ramp down.
So the 14700K isn't as much of an outlier as it looks, especially since the 13700K also excels at Total War: Warhammer III, and it too beats the two Core i9s. Total War: Warhammer III isn't the only game like this, and so there are going to be many instances where the cooler-headed 14700K steadily gets the work done while the hot-headed i9-13900K and 14900K sprint repeatedly, only to effectively tire themselves out for a bit before kicking back up to high gear.
(Image credit: Future / Infogram)
The additional efficiency cores might not draw as much power as the performance cores, but the extra power draw is still noticeable: the 14700K pulls nearly 30W more than the 13700K, though that's still a far cry from the Core i9-13900K's power usage.
This additional power also means that the Core i7-14700K runs much hotter than its predecessor, maxing out at 100°C, triggering the CPU to throttle on occasion. This is something that the i7-13700K didn't experience during my testing at all, so you'll need to make sure your cooling solution is up to the task here.
Performance: 4.5 / 5
(Image credit: Future / John Loeffler)
Intel Core i7-14700K: Verdict
Fantastic single-core performance
Intel's best gaming processor, and second overall behind the Ryzen 7 7800X3D
Best value of any midrange processor
(Image credit: Future / Infogram)
Ultimately, the Intel Core i7-14700K is the best processor in the Raptor Lake Refresh line-up, offering very competitive performance for a better price than its predecessor and far better one than comparable chips one tier higher in the stack.
It's not without fault, though. It's not that much better than the i7-13700K, so everything I'm saying about the i7-14700K might reasonably apply to its predecessor as well. And honestly, the i7-14700K doesn't have too high a bar to clear to stand out from its launch siblings, so its performance might only look as good as it does in comparison to the i9 and i5 standing behind it.
But, the numbers don't lie, and the Intel Core i7-14700K displays flashes of brilliance that set it apart from its predecessor and vault it into competition with the top-tier of CPUs, and that's quite an achievement independent of how the rest of Raptor Lake Refresh fares.
(Image credit: Future / John Loeffler)
Should you buy the Intel Core i7-14700K?
Buy the Intel Core i7-14700K if...
Don't buy it if...
Also Consider
If my Intel Core i7-14700K review has you considering other options, here are two processors to consider...
How I tested the Intel Core i7-14700K
I spent nearly two weeks testing the Intel Core i7-14700K
I ran comparable benchmarks between this chip and rival midrange processors
I gamed with this chip extensively
Test System Specs
These are the specs for the test system used for this review:
I spent about two weeks testing the Intel Core i7-14700K and its competition, primarily for productivity work, gaming, and content creation.
I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops.
I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch line-up and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Intel Core i9-14900K is a hard chip to justify, which is a weird thing to say about a processor that is arguably the best Intel has ever put out.
With very little fanfare to herald its arrival following the announcement of Intel Meteor Lake at Intel Innovation in September 2023 (and confirmation that Intel Meteor Lake is coming to desktop in 2024), Intel's 14th-generation flagship processor cannot help but draw parallels to the 11th-gen Rocket Lake chips that immediately preceded Intel Alder Lake.
The Core i9-11900K was something of a placeholder in the market until Intel could launch Alder Lake at the end of 2021. Those processors featured a new hybrid architecture and a more advanced 10nm process that helped propel Intel back to the top of our best processor list, despite strong competition from AMD.
With Intel Raptor Lake Refresh, we're back in placeholder territory, unfortunately. The performance gains here are all but non-existent, so we are essentially waiting on Meteor Lake while the i9-14900K absolutely guzzles electricity and runs hot enough to boil water under just about any serious workload with very little extra performance over the Intel Core i9-13900K to justify the upgrade.
The problem for the Core i9-14900K is that you can still get the i9-13900K.
It's not that the Core i9-14900K isn't a great processor; again, it's unquestionably the best Intel processor for the consumer market in terms of performance. It beats every other chip I tested in most categories with the exception of some multitasking workflows and average gaming performance, both of which it comes in as a very close runner-up. On top of that, at $589, it's the same price as the current Intel flagship, the Intel Core i9-13900K (assuming the i9-14900K matches the i9-13900K's £699 / AU$929 sale price in the UK and Australia).
The problem for the Core i9-14900K is two-fold: you can still get the i9-13900K and will be able to for a long while yet at a lower price, and the Intel Core i7-14700K offers performance so close to the 14th-gen flagship at a much lower price that the 14900K looks largely unnecessary by comparison. Essentially, if you've got an i7-13700K or i9-13900K, there is simply nothing for you here.
If you're on an 11th-gen chip or older, or you've got an AMD Ryzen processor and you're looking to switch, this chip will be the last one to use the LGA 1700 socket, so when Meteor Lake-S comes out in 2024 (or even Lunar Lake-S, due out at the end of 2024 or early 2025), you won't be able to upgrade to that processor with an LGA 1700 motherboard. In other words, upgrading to an LGA 1700 motherboard for this chip is strictly a one-shot deal.
The only people who might find this chip worth upgrading to are those currently using a 12th-gen processor who skipped the 13th-gen entirely, or someone using a 13th-gen Core i5 who wants that extra bit of performance and doesn't mind dropping $589 on a chip they might be upgrading from again in a year's time, which isn't going to be a whole lot of people.
Unfortunately, at this price, it'll be better to save your money and wait for Meteor Lake or even Lunar Lake to drop next year and put the $589 you'd spend on this chip towards the new motherboard and CPU cooler you'll need once those chips are launched.
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Price & availability
How much does it cost? US MSRP $589 (about £470/AU$855)
When is it out? October 17, 2023
Where can you get it? You can get it in the US, UK, and Australia
The Intel Core i9-14900K is available as of October 17, 2023, for a US MSRP of $589 (about £470/AU$855), which is the same as the Intel Core i9-13900K it is replacing. We don't have confirmation on UK and Australia pricing yet, though I've asked Intel for clarification and will update this review if and when I hear back from the company. If the 14900K keeps the same UK and Australia pricing as the Core i9-13900K, however, it'll sell for £699/AU$929 in the UK and Australia respectively.
This does make the Core i9-14900K the better value against AMD's competing Ryzen 9 chips, especially given the level of performance on offer, but it's ultimately too close to the 13900K performance-wise to make this price meaningful, as a cheaper 13900K will offer an even better value against AMD's Ryzen 9 lineup.
Price score: 3 / 5
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Specs & features
Faster clock speeds than i9-13900K
Some additional AI-related features
The Intel Core i9-14900K is the final flagship using Intel's current architecture, so it makes sense that there is very little in the way of innovation over the Intel Core i9-13900K.
Using the same 10nm Intel 7 process node as its predecessor and with the same number of processor cores (8 P-cores/16 E-cores), threads (32), and cache (32MB total L2 cache plus additional 36MB L3 cache), the only real improvement with the 14900K in terms of specs are its faster clock speeds.
All cores get a 0.2GHz increase to their base frequencies, while the P-core turbo boost clock increases to 5.6GHz and the E-core turbo clock bumps up to 4.4GHz from the 13900K's 5.4GHz P-Core turbo clock and 4.3GHz E-core turbo clock.
While those clock speeds are the official max turbo clocks for the two types of cores, the Core i9-14900K and Intel Core i7-14700K have something called Turbo Boost Max Technology 3.0, which increases the frequency of the best-performing core in the chip and gives it even more power within the power and thermal limits. That gets the Core i9-14900K up to 5.8GHz turbo clock on specific P-cores while active.
Additionally, an exclusive feature of the Core i9 is an additional Ludicrous-Speed-style boost called Intel Thermal Velocity Boost. This activates if there is still power and thermal headroom on a P-core that is already being boosted by the Turbo Boost Max Technology 3.0, and this can push the core as high as 6.0GHz, though these aren't typical operating conditions.
Both of these technologies are present in the 13900K as well, but the 14900K bumps up the maximum clock speeds of these modes slightly, and according to Intel, that 6.0GHz clock speed makes this the world's fastest processor. While that might technically be true, that 6.0GHz is so narrowly used that, in practical terms, the P-core boost clock is what you're going to see almost exclusively under load.
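The layering of these boost tiers can be sketched as a simple decision chain. The clock values below come from the specs discussed above, but the headroom checks are illustrative assumptions, since Intel doesn't publish the exact heuristics its silicon uses:

```python
def p_core_clock_ghz(is_favored_core: bool,
                     thermal_headroom_c: float,
                     power_headroom_w: float) -> float:
    """Pick the highest boost tier a Core i9-14900K P-core qualifies for.

    Hypothetical model for illustration; real silicon weighs many more inputs.
    """
    if is_favored_core and thermal_headroom_c > 0 and power_headroom_w > 0:
        return 6.0  # Thermal Velocity Boost, stacked on Turbo Boost Max 3.0
    if is_favored_core:
        return 5.8  # Turbo Boost Max 3.0 favored-core clock
    return 5.6      # standard P-core max turbo
```

In this model, 6.0GHz only appears when a favored core also has thermal and power headroom to spare, which squares with how rarely that peak shows up under sustained load.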
The Core i9-14900K has the same 125W TDP as the 13900K and the same 253W maximum turbo power as well, though power draw in bursts of less than 10ms can go far higher.
If this reads like a Redditor posting about their successful overclocking setup, then you pretty much get what this chip is about. If you're looking for something innovative about this chip, I'll say it again, you're going to have to wait for Meteor Lake.
The Core i9-14900K also has support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as does the rest of the 14th-gen lineup, as well as support for discrete Thunderbolt 5, both of which are still a long way down the road.
The only other thing to note is that there have been some AI-related inclusions that are going to be very specific to AI workloads that almost no one outside of industry and academia is going to be running. If you're hoping for AI-driven innovations for everyday consumers, let's say it once more, with feeling: You're going to have to wait for—
Chipset & features score: 3.5 / 5
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Performance
Best-in-class performance, but only by a hair
Gets beat by AMD Ryzen 7 7800X3D and i7-14700K in gaming performance
Runs even hotter than the i9-13900K
Take any elite athlete who's used to setting records in their sport: sometimes they break their previous record by a lot, and sometimes it's by milliseconds or fractions of an inch. It's less sexy, but it still counts, and that's really what we get here with the Intel Core i9-14900K.
On pretty much every test I ran on it, the Core i9-14900K edged out its predecessor by single digits, percentage-wise, which is a small enough difference that a background application can fart and cause just enough of a dip in performance that the 14900K ends up losing to the 13900K.
I ran these tests more times than I can count because I had to be sure that something wasn't secretly messing up my results, and they are what they are. The Core i9-14900K does indeed come out on top, but it really is a game of inches at this point.
(Image credit: Future / Infogram)
Across all synthetic performance and productivity benchmarks, the Core i9-14900K comes out on top, with the notable exception of Geekbench 6.1's multi-core performance test, where the AMD Ryzen 9 7950X scores substantially higher, and the Passmark Performance Test's overall CPU score, which puts the AMD Ryzen 9 7950X and Ryzen 9 7950X3D significantly higher. Given that all 16 cores of the 7950X and 7950X3D are full-throttle performance cores, this result isn't surprising.
Other than that though, it's the 14900K all the way, with a 5.6% higher geometric average on single-core performance than the 13900K. For multi-core performance, the 14900K scores a 3.1% better geometric average, and in productivity workloads, it scores a 5.3% better geometric average than its predecessor.
Against the AMD Ryzen 9 7950X, the Core i9-14900K scores about 13% higher in single-core performance, about 1% lower in multi-core performance, and 5% better in productivity performance.
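The aggregate figures above are geometric averages rather than simple means. A minimal sketch of the calculation, using made-up benchmark scores rather than the review's actual data:

```python
from math import prod

def geometric_mean(scores: list[float]) -> float:
    """Geometric mean damps the pull of any single outlier benchmark."""
    return prod(scores) ** (1 / len(scores))

# Hypothetical single-core scores for two chips across three benchmarks
chip_a = [3100, 2280, 4470]
chip_b = [2950, 2140, 4250]

gain = geometric_mean(chip_a) / geometric_mean(chip_b) - 1
print(f"chip A is {gain:.1%} faster on geometric average")
```

The geometric mean is the usual choice when combining benchmarks with very different score scales, since no single test can dominate the average the way it would in an arithmetic mean.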
(Image credit: Future / Infogram)
Creative benchmarks reveal something of a mixed bag for the Core i9-14900K. In all cases, it beats its predecessor, by anywhere from 2.6% to as much as 10.9%. Against the AMD Ryzen 9 7950X and 7950X3D, the Core i9-14900K consistently loses out when it comes to rendering workloads like Blender and V-Ray 5, but beats the two best AMD processors by just as much in photo and video editing. And since 3D rendering almost always leans heavily on the GPU rather than the CPU, AMD's advantage here is somewhat muted in practice.
(Image credit: Future / Infogram)
Gaming is another area where Intel had traditionally done well thanks to its strong single-core performance over AMD, but all that flipped with the introduction of AMD's 3D V-Cache.
While the Intel Core i9-14900K barely moves the needle from its predecessor's performance, it really doesn't matter, since the AMD Ryzen 7 7800X3D manages to ultimately score an overall victory and it's not very close. The Core i9-14900K actually manages a tie for fourth place with the Intel Core i7-13700K, with the Core i7-14700K edging it out by about 4 fps on average.
(Image credit: Future / Infogram)
Of course, all this performance requires power, and lots of it. The Core i9-14900K pretty much matched the maximum recorded power draw of the Core i9-13900K, with less than a watt's difference between the two: 351.097W versus 351.933W, respectively.
The Core i9-14900K still managed to find a way to run hotter than its predecessor, however; something I didn't really think was possible. But there it is, the 14900K maxing out at 105ºC, three degrees hotter than the 13900K's max. It's the hottest I've ever seen a CPU run, and I'm genuinely shocked it was allowed to run so far past its official thermal limit without any overclocking on my part.
Performance: 3.5 / 5
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Verdict
The best chip for dedicated performance like video editing and productivity
There are better gaming processors out there for cheaper
The Intel Core i7-14700K offers a far better value
(Image credit: Future / Infogram)
In the final assessment then, the Core i9-14900K does manage to win the day, topping the leaderboard by enough of a margin to be a clear winner, but close enough that it isn't the cleanest of wins.
Overall, its single-core and productivity performance are its best categories, slightly faltering in creative workloads, and coming up short enough on gaming that it's not the chip I would recommend as a gaming CPU.
Like all Core i9s before it, the 14900K is the worst value of Intel's 14th-gen launch lineup, but it's better than its predecessor for the time being (though that advantage won't last very long at all), and it does manage to be a better value proposition than the Ryzen 9 7950X and Ryzen 9 7950X3D, while matching the Ryzen 7 7800X3D, so all in all, not too bad for an enthusiast chip.
Still, the Intel Core i7-14700K is right there, and its superior balance of price and performance makes the Intel Core i9-14900K a harder chip to recommend than it should be.
Should you buy the Intel Core i9-14900K?
Buy the Intel Core i9-14900K if...
Don't buy it if...
Also Consider
If my Intel Core i9-14900K review has you considering other options, here are two processors to consider...
How I tested the Intel Core i9-14900K
I spent nearly two weeks testing the Intel Core i9-14900K
I ran comparable benchmarks between this chip and rival flagship processors
I gamed with this chip extensively
Test System Specs
These are the specs for the test system used for this review:
I spent about two weeks testing the Intel Core i9-14900K and its competition, using it mostly for productivity and content creation, with some gaming thrown in as well.
I used the standard battery of synthetic benchmarks I use for processor testing, and ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
To say I've been looking forward to the AMD Radeon RX 7800 XT for over a year is an understatement, and if I were to judge this card on its merits, I have to say that this is easily one of the best graphics card releases we've gotten out of this generation. My heart, though, knows that it should have been even better, so I can't help but feel slightly disappointed.
Released right on the heels of Labor Day here in the US, getting this card properly tested was obviously going to be a heavy lift, so when my preliminary benchmark numbers showed it edging out the Nvidia GeForce RTX 4070 by about 2% overall (while not getting as badly crushed by Nvidia's midrange rival in ray-tracing performance as during the previous generation), I figured this card was going to be an easy one to review.
Coming in at $499.99 (about £380/AU$725) compared to the RTX 4070's MSRP of $599.99 (about £460/AU$870), that roughly 17% price difference in AMD's favor is going to make a world of difference for a lot of gamers out there looking to upgrade to a current-gen midrange card.
In addition to fantastic 1440p gaming performance and even very respectable 4K gaming performance (thanks in no small part to the 16GB VRAM and 256-bit memory bus), ray tracing performance has gotten better as AMD's ray accelerators have improved and a host of new anti-latency and upscaling features make this pretty much the best 1440p graphics card on the market, hands down.
(Image credit: Future / John Loeffler)
So why does my heart ache having done a very intense week's worth of testing on this card?
Well, the single biggest negative in this card's column is that there is very little gen-on-gen improvement in terms of its rasterization performance over the AMD Radeon RX 6800 XT.
Now the RX 7800 XT does have things that the RX 6800 XT doesn't have, namely AI accelerator cores that can power more advanced AI workloads like upscaling and other generative AI processes, and the 7800 XT does feature much better ray tracing performance than its predecessor, so calling these cards essentially the same would be factually and substantively wrong.
But rasterization is AMD Radeon's bread-and-butter, and by that metric, you only really get about 12% and 5% better gaming performance at 1080p and 1440p, respectively, and there's essentially no difference at 4K. If you don't care about ray tracing or running Stable Diffusion-like AI models (which you're likely to use Nvidia hardware for anyway), then this card is going to feel much more like a refresh of the RX 6800 XT, or even the RX 6850 XT that we didn't get a year ago.
And for that, the RX 7800 XT leaves me somewhat disappointed. If you aren't upgrading from an RX 6800 XT (which you shouldn't be doing even if this card was a true gen-on-gen successor like the fantastic AMD Radeon RX 7700 XT is to the AMD Radeon RX 6700 XT), then none of this is really going to matter to you.
I'd still tell you to buy the RX 7800 XT over the RX 6800 XT and even the RTX 4070, without question, but there's no getting around the fact that the AMD Radeon RX 7800 XT misses its shot at being truly magnificent.
AMD Radeon RX 7800 XT: Price & availability
(Image credit: Future / John Loeffler)
How much does it cost? $499.99 (about £380/AU$725)
When is it available? Available September 6, 2023
Where can you get it? Available in the US, UK, and Australia
The AMD Radeon RX 7800 XT is available on September 6, 2023, starting at $499.99 (about £380/AU$725), which puts it about 23% cheaper than the RX 6800 XT was when it launched in 2020, and $100 cheaper than direct competitor the Nvidia RTX 4070.
It's also just $50 more expensive than the RX 7700 XT that it launched alongside, so anyone looking at the RX 7700 XT might be better served by buying the RX 7800 XT instead since you'll get better performance and extra VRAM without spending a whole lot more money.
AMD Radeon RX 7800 XT: Specs
(Image credit: Future / John Loeffler)
AMD Radeon RX 7800 XT: Design
Unlike the RX 7700 XT, the AMD Radeon RX 7800 XT does have a reference card, and it'll look familiar to anyone who's been looking at AMD cards this generation. Opting for a two-fan cooling solution, this dual-slot card looks a lot like the AMD Radeon RX 7600 would if you stretched the card lengthwise.
It's not a long card either, measuring 267mm, or about 10.5 inches, so you shouldn't have any issues getting this card to fit inside a mid-tower case or larger. You might even be able to squeeze it into some tighter-fitting cases as well, but that'll depend on the case itself and what version of the RX 7800 XT you end up getting (third-party versions will vary in size and will likely be longer).
(Image credit: Future / John Loeffler)
The reference model of the card features three DisplayPort 2.1 outputs along with an HDMI 2.1 port, so it'll be more than capable of powering the best 4K monitors with ease, along with the various sizes and resolutions of the best gaming monitors on the market.
What it doesn't have, however, is a USB-C output, so if you have one of the best USB-C monitors (common in creative industries), you'll likely need to pick up an adapter if you plan on slotting this card into a workstation.
(Image credit: Future / John Loeffler)
You'll also only need two free 8-pin power connectors, so there's no 12VHPWR cable like on Nvidia's competing cards. The card is fairly solid with a decent amount of weight, so you'll definitely need a support bracket if you're slotting this directly into a motherboard's PCIe slot.
Overall, the appearance is the same no-fuss, no-bling aesthetic we've gotten from AMD's RDNA 3 reference cards this generation, so if you want that RGB look, you're better off with a third-party card, but otherwise it's a lovely card to look at and won't be the shame of anyone's PC case.
(Image credit: Future / John Loeffler)
AMD Radeon RX 7800 XT: Chipset & features
The Navi 32 GPU in the AMD Radeon RX 7800 XT is the full version of the chip compared to the slightly trimmed-down GPU powering the RX 7700 XT, with an additional 6 compute units over the RX 7700 XT's 54, giving the RX 7800 XT an additional 384 shaders, 6 ray accelerators, and 12 AI accelerators.
The RX 7800 XT has a fairly low base clock of 1,295 MHz, compared to the RX 7700 XT's 1,700 MHz, but the RX 7800 XT's boost clock runs as high as 2,430 MHz (compared to the RX 7700 XT's 2,544 MHz).
This means that even though the RX 7800 XT has slightly more compute units, everything is running slightly slower, which goes a long way to explaining the relatively close levels of performance between the two GPUs.
The RX 7800 XT does feature 16GB of VRAM on a large 256-bit memory bus, with a memory clock of 2,425 MHz for an effective 19.4 Gbps. This is slower than the RTX 4070's 21 Gbps effective memory speed, but the wider bus and larger frame buffer offered by the RX 7800 XT's additional 4GB of VRAM really highlight where Nvidia went wrong with lower VRAM and tighter buses this generation, compared to AMD, which generally got the memory question on its cards right.
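Those memory figures follow from simple arithmetic: GDDR6 moves eight bits per pin per memory clock, and total bandwidth is the per-pin rate times the bus width. A quick sketch using the RX 7800 XT's numbers from above:

```python
# RX 7800 XT memory figures from the spec discussion above
memory_clock_mhz = 2425
bus_width_bits = 256

# GDDR6 transfers 8 bits per pin per memory clock cycle
effective_gbps = memory_clock_mhz * 8 / 1000   # per-pin data rate
bandwidth_gbs = effective_gbps * bus_width_bits / 8

print(f"{effective_gbps:.1f} Gbps effective, {bandwidth_gbs:.1f} GB/s total")
```

That works out to the 19.4 Gbps quoted above and roughly 621 GB/s of total bandwidth, which is why the wider bus lets the RX 7800 XT out-deliver the RTX 4070 despite its slower per-pin rate.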
Finally, the TGP on the RX 7800 XT is a rather high 263W, compared to the 200W RTX 4070, but this is still less than the RX 6800 XT's 300W TGP, so there's progress at least.
(Image credit: Future / John Loeffler)
AMD Radeon RX 7800 XT: Performance
And here is where the AMD Radeon RX 7800 XT impresses the most, even as it breaks my heart: performance.
I'll start with the good news for AMD here, which is that it by and large scores even with the RTX 4070 in terms of synthetic tests and gameplay performance while faltering rather badly against the RTX 4070 in creative workloads, which is pretty much expected given the Nvidia CUDA instruction set's dominance in all things creative.
(Image credit: Future / Infogram)
On the synthetic side, the AMD Radeon RX 7800 XT outperforms the RTX 4070 by about 2% overall, with rasterization workloads being its breakout strength, while Nvidia's ray tracing capabilities continue to outperform AMD's. Though it's worth noting that the RX 7800 XT does a lot to close the gap here, so Nvidia's advantage is only about 15% at best in 3DMark Speed Way and just 6% in Port Royal.
Meanwhile, the RX 7800 XT manages to score 25% better in 3DMark Firestrike Ultra, showing it to be a much better 4K card than the RTX 4070 thanks to the additional VRAM, a level of performance that is replicated in our gaming tests.
(Image credit: Future / Infogram)
When not using any upscaling tech, on average, the RX 7800 XT performs 15% better without ray tracing than the RTX 4070 (and just 4% worse with ray tracing at max settings) at 1080p, 6% better on average at 1440p (16% worse when ray tracing on max settings), and 17% better at 4K (though about 25% worse at 4K when ray tracing).
FSR 2 can't hold a candle to DLSS 3 when ray tracing, but in non-RT gameplay, FSR 2 and the RX 7800 XT actually come out way ahead across all resolutions when FSR 2 and DLSS 3 are set to balanced, with the RX 7800 XT getting 53%, 21%, and 19% better performance at 1080p, 1440p, and 4K, respectively.
Turning on ray tracing pretty much reverses the case, with the RTX 4070 getting as much as 47%, 16%, and 12% better performance at 1080p, 1440p, and 4K, respectively.
In short, if you're planning on gaming without ray tracing, there is no question that between the RX 7800 XT and RTX 4070, the RX 7800 XT is the card you'll want to buy.
Here, as well, the RX 7800 XT manages to perform better than the RX 6800 XT, by about 15%, which isn't awful, but gamers hoping for a much larger improvement over the RX 6800 XT (such as myself) will be disappointed. Getting 15% better fps on average is one thing for a card like the RX 7600; for a nearly $500 graphics card, I'd have liked to see 25% to 33%, if I'm being honest, and that's where this card ultimately should have landed in a perfect world.
But ours is a fallen land, and we're not comparing this card against a Platonic ideal projected onto a cave wall; we're comparing it to the cards on the shelf that you have to pick between for your next upgrade.
If you can find the RX 6800 XT for more than 15% less than the RX 7800 XT, that might make the last-gen card the better buy. If that's not an option though, and you're like most gamers looking at the RTX 4070 vs. RX 7800 XT, the vast majority are going to get a better experience from the RX 7800 XT, especially when they have an extra $100 to buy themselves something else that's nice, as a treat.
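That rule of thumb boils down to a price-per-frame comparison. A minimal sketch, with hypothetical street prices and frame rates standing in for real listings and the review's measurements:

```python
def price_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame per second; lower is better."""
    return price_usd / avg_fps

# Hypothetical figures: RX 7800 XT at MSRP vs. a discounted RX 6800 XT
# that is ~18% cheaper while trailing by ~15% in average fps.
new_card = price_per_frame(499.99, 115.0)
old_card = price_per_frame(409.99, 100.0)

better = "last-gen" if old_card < new_card else "current-gen"
print(f"The {better} card is the better value per frame")
```

In this illustration the discount exceeds the performance gap, so the last-gen card wins on value; shrink the discount below ~15% and the comparison tips back toward the RX 7800 XT.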
(Image credit: Future / John Loeffler)
Should you buy the AMD Radeon RX 7800 XT?
Buy it if...
You want to play at 4K This card has serious 4K gaming chops thanks to its 16GB VRAM and wide memory bus.
You don't want to completely sacrifice ray tracing
AMD is finally getting to the point where you can have both great rasterization and decent ray tracing performance.
Don't buy it if...
You want the best ray tracing and upscaling possible If ray tracing and upscaling are your bag, then the RTX 4070 is going to be the better buy here.
AMD Radeon RX 7800 XT: Also consider
If my AMD Radeon RX 7800 XT review has you considering other options, here are two more graphics cards to consider.
How I tested the AMD Radeon RX 7800 XT
I spent about a week with the RX 7800 XT
I focused mostly on gaming, since that is what AMD Radeon graphics cards are primarily used for
I used our standard battery of benchmark tests and personal gameplay experience
Test System Specs
These are the specs for the test system used for this review:
I spent about a week extensively testing the RX 7800 XT, both in a test bench and as my personal gaming card at home.
I ran our standard battery of performance benchmarks, including 3DMark tests and various in-game gaming benchmarks, on the RX 7800 XT and various competing graphics cards from AMD and Nvidia to get a slate of comparable figures.
In addition to my extensive computer science education and years as a tech product reviewer, I've been a PC gamer my whole life, so I know what to look for and what to expect from a graphics card at this price point.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The AMD Radeon RX 7700 XT is a very solid 1440p graphics card with a lot going for it, but its price isn't one of them, and that ultimately holds it back from scoring a major upset for Team Red.
That isn't to say the RX 7700 XT isn't worth buying, but it does come with one big caveat: its bigger sibling, the AMD Radeon RX 7800 XT, which launches alongside it on September 6, 2023. At $449.99 (about £350/AU$660), the RX 7700 XT is exactly $50 cheaper than the RX 7800 XT. There are caveats around that as well, since the RX 7800 XT performs better, though not by so wide a margin that it's obviously the better card to buy of the two.
Better ray tracing performance, a significant performance gain over its predecessor, and a cheaper launch price to boot make the AMD Radeon RX 7700 XT an easy card to recommend for anyone looking for the best 1440p graphics card without breaking the bank.
(Image credit: AMD)
Given the end of the summer holiday (in the Northern Hemisphere, at least), I'm still in the process of wrapping up testing on the RX 7800 XT, but it looks to be about 10% to 15% faster than the RX 7700 XT. The price gap roughly tracks the performance gap, then, but that does leave you with two cards occupying a very similar niche.
In terms of the RX 7700 XT, its ostensible competition is the Nvidia GeForce RTX 4060 Ti, and broadly speaking it wipes the floor with the RTX 4060 Ti 8GB variant when you're not using ray tracing at 1440p and 1080p.
There are times when the RX 7700 XT even gives the Nvidia RTX 4070 some competition, even though that card is supposed to be squaring up against the RX 7800 XT, and it's this showing against our current top pick for the best graphics card overall that really pushes the RX 7700 XT past its caveats and makes it a card worth buying.
AMD Radeon RX 7700 XT: Price & availability
(Image credit: Future / John Loeffler)
How much does it cost? $449.99 (about £350/AU$660)
When is it available? Available September 6, 2023
Where can you get it? Available in the US, UK, and Australia
The AMD Radeon RX 7700 XT is available on September 6, 2023 starting at $449.99 (about £350/AU$660).
This puts the RX 7700 XT about $30 cheaper than the AMD Radeon RX 6700 XT it replaces (which launched at $479.99), which is great, and it puts it dead center between the two RTX 4060 Ti variants in terms of price, so we'll go ahead and call it a wash on that front.
If there's an issue here, it's really that the AMD Radeon RX 7800 XT is also just $50 more ($499.99, about £385/AU$725). As mentioned before, I'm still wrapping up our AMD Radeon RX 7800 XT review, but that card looks to give you about 10% to 15% better performance for an 11% price premium, so it might be the better value for those who want a midrange card with marginally better performance.
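To see how that 11% premium stacks up, here's a quick back-of-the-envelope sketch (not part of our benchmark process, just arithmetic on the launch prices and the rough 10% to 15% performance gap cited above):

```python
# Does the RX 7800 XT's ~11% price premium buy more than 11% extra
# performance? Prices are the US launch MSRPs; the performance figures
# are the rough 10-15% gap estimated in this review.
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance divided by price, scaled for readability."""
    return relative_perf / price * 100

rx_7700_xt = perf_per_dollar(1.00, 449.99)       # baseline card
rx_7800_xt_low = perf_per_dollar(1.10, 499.99)   # if only 10% faster
rx_7800_xt_high = perf_per_dollar(1.15, 499.99)  # if 15% faster

print(f"RX 7700 XT: {rx_7700_xt:.4f} perf/$")
print(f"RX 7800 XT: {rx_7800_xt_low:.4f} to {rx_7800_xt_high:.4f} perf/$")
```

At the low end of that range the RX 7700 XT edges out its sibling on value; at the high end the RX 7800 XT comes out ahead, which is why the final benchmark numbers matter here.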
AMD Radeon RX 7700 XT: Specs
(Image credit: Future / John Loeffler)
AMD Radeon RX 7700 XT: Design
In terms of design, there's not much to say about the AMD RX 7700 XT other than to check with the third-party manufacturers of the card you want to buy, since there is no reference card for the RX 7700 XT.
One thing to note, though, is that no matter which card you go with, to my knowledge no AMD RX 7700 XT card will require a 12VHPWR cable to power it, so two 8-pin PCIe power connectors should be all you need.
(Image credit: Future / John Loeffler)
AMD Radeon RX 7700 XT: Chipset & features
The AMD Radeon RX 7700 XT, along with the RX 7800 XT, finally brings AMD's Navi 32 GPU to the desktop market after nearly a year of waiting, and overall I can say it's been worth the wait.
The RX 7700 XT features a slightly cut-down version of the GPU used in the RX 7800 XT, so there's not too much difference between the two. The biggest difference, though, lies off the GPU die itself, namely the amount of VRAM.
The RX 7700 XT features 12GB of VRAM, the bare minimum for effective 1440p gameplay. Unlike with the largely disappointing RTX 4060 Ti, AMD at least made sure to include enough VRAM, and a wide enough memory bus, to give the card the texture bandwidth necessary to play at this level without needing to rely on the assistance of upscaling tech.
Speaking of upscaling tech, alongside the announcement of the RX 7800 XT and RX 7700 XT, AMD unveiled FSR 3, which should definitely help AMD level the playing field with Nvidia. Since that's a software-driven tool rather than something tied to the driver and the RDNA 3 hardware itself, I'll save a deep dive for another time, but know that you should be able to start leveraging that tech soon as well.
If there's a knock on the RX 7700 XT here, it's its power consumption. At 245W, its rated power draw is high for a 1440p card, especially when Nvidia is able to make do with 200W for the RTX 4070 and 160W for the RTX 4060 Ti.
(Image credit: Future / John Loeffler)
AMD Radeon RX 7700 XT: Performance
In terms of performance, the AMD Radeon RX 7700 XT delivers pretty much the gen-on-gen performance increase we want to see from a midrange card.
Starting with synthetic benchmarks, the RX 7700 XT scored about 28% better in 1080p performance than the RX 6700 XT, about 35% better in 1440p performance, and about 29% better in terms of 4K performance, or about 30% better than the RX 6700 XT overall.
Against the Nvidia RTX 4070, meanwhile, the RX 7700 XT only scored about 7.25% lower overall in synthetic performance while costing about 25% less, making it a very compelling challenger to Nvidia's best midrange offering.
The same goes for gaming performance, which is really what everyone is interested in here. The story is much as you'd expect: AMD performs as well or better in rasterization performance while falling behind when it comes to ray tracing and upscaling performance against competing RTX 4060 Ti and RTX 4070 cards.
Most interesting, perhaps, is the RX 7700 XT's performance vis-à-vis the RTX 4070, where the RX 7700 XT averaged 103 fps at 1080p, 77 fps at 1440p, and 48 fps at 4K, compared to the RTX 4070's 117 fps, 88 fps, and 52 fps, respectively. These are close enough that, depending on their rigs, many gamers will see a practical tie, so if you're looking for the best cheap graphics card for 1440p and 4K gaming, the RX 7700 XT is definitely one to consider.
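To put those averages in context, here's a quick sketch working out the RTX 4070's percentage lead at each resolution, using the frame rates measured above:

```python
# Percentage lead of the RTX 4070 over the RX 7700 XT at each
# resolution, computed from the average fps figures in this review.
fps = {
    "1080p": (103, 117),  # (RX 7700 XT avg fps, RTX 4070 avg fps)
    "1440p": (77, 88),
    "4K": (48, 52),
}

for res, (amd, nvidia) in fps.items():
    lead = (nvidia / amd - 1) * 100
    print(f"{res}: RTX 4070 leads by {lead:.1f}%")
```

The gap lands in the 8% to 15% range depending on resolution, smaller than the price difference between the two cards, which is what makes the RX 7700 XT's value case.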
(Image credit: Future / John Loeffler)
Should you buy the AMD Radeon RX 7700 XT?
Buy it if...
You want a great 1440p graphics card The RX 7700 XT is a fantastic card for 1440p gaming, especially for the price.
You don't care about ray tracing
As with any AMD graphics card, if you don't really care about ray tracing, you can pretty much skip Nvidia's premium offerings.
Don't buy it if...
You have a bit more room in your budget If you've got some extra money to spend, the RTX 4070 is still likely to be the best option for 1440p gaming.
AMD Radeon RX 7700 XT: Also consider
If my AMD Radeon RX 7700 XT review has you considering other options, here are two more graphics cards to consider.
How I tested the AMD Radeon RX 7700 XT
I spent about a week with the RX 7700 XT
I focused mostly on gaming, since that is what AMD Radeon graphics cards are primarily used for
I used our standard battery of benchmark tests and personal gameplay experience
Test System Specs
These are the specs for the test system used for this review:
I spent about a week with the AMD Radeon RX 7700 XT running benchmark tests and playing Baldur's Gate 3 like everyone else is doing right now.
AMD Radeon cards are overwhelmingly used for gaming purposes, so I focused my efforts on determining how good of a gaming graphics card it is.
I've been a PC gamer my whole life and I've spent the past few years extensively benchmarking gaming hardware for a living, so I know how a graphics card at this level is supposed to perform given its price as well as the manufacturer's past product launches.