Hyte Y70 Touch review: leveling up the premium PC case
6:00 pm | November 13, 2023


Hyte Y70 Touch: Two-minute review

To say the Hyte Y70 Touch is the culmination of a major PC building trend we've seen over the past few years feels like it doesn't do this case justice.

Back in ye olden dæges, even the best PC case was just an aluminum box with a panel that would open up to let you dig into the guts of your computer. But like every other kind of electronic device, it was purposefully designed to hide everything on the inside. After all, who wants to look at printed circuit boards, soldered ICs, and maybe a dusty fan mounted onto the CPU that 95% of people never bothered to clean?

We've come a long way since then, and as PC enthusiasts and gamers invest serious money into their builds, the impulse to show it all off became too great to ignore. Cases have since gained plastic side panels, then tempered glass, and in the last five or six years, we've started to see designs that resemble the kind of glass housing once reserved for museums, so builders can show off their hard work.

What case makers haven't really done before, however, is offer an off-the-shelf case that integrates functionality into this cutaway design, since glass is glass, and it's only really good for looking through and providing a small measure of physical protection.

Hyte, on the other hand, decided to swap out the corner panel of its already excellent Y60 PC case for a 4K interactive touchscreen that has literally stopped several of my coworkers in their tracks this past week so they could gawk at the video wallpaper, clock, and Twitch chat window embedded into the touchscreen along the corner edge of the Y70.

To be fair, they weren't around to watch me stumble through the process of setting up the touchscreen, since it's really just a second Windows display like any one of the best monitors you'll find, though its 1100 x 3840 resolution makes it a meh-level second display without Hyte's Nexus software.

Once you install and run the software, it will do all the hard work of actually configuring the display to embed widgets, system information like CPU temperature, and even app shortcuts to put your favorite software a quick touch away.

Of course, to get the case's display to work, it needs to be connected to your graphics card via a DisplayPort connection, so you'll need a graphics card capable of multimonitor support as well as a free DisplayPort output. Nearly all of the best graphics cards, and even most of the best cheap graphics cards, will come with at least two DisplayPort connections, but if you're already running a multimonitor setup, you will likely need to do some cable juggling to make sure your graphics card can run the touch panel.

The Hyte Y70 Touch on a desk displaying its touchscreen (gallery of six images)

(Image credit: Future / John Loeffler)

As for what you can do with the touch panel, there's actually quite a bit, including adding widgets for quicker access or displaying system information, or possibly taking a break from playing the best PC games to play a Tetris-style brick dropper instead.

And while this might feel a bit gimmicky, for someone like me who has dozens of windows open on my desktop at any given time, the program shortcuts are an absolute lifesaver.

Finding the shortcut to launch Photoshop on my desktop comes in varying degrees of difficulty depending on how much junk I've dropped onto my desktop over the past few weeks. Being able to turn slightly and touch the Hyte Y70 Touch's display to bring up Photoshop is the kind of small thing that adds up to minutes and hours of reclaimed time over weeks and months of use.

The Hyte Y70 Touch on a desk displaying its touchscreen

(Image credit: Future / John Loeffler)

The Nexus software is more or less fine, and it comes with a number of presets that you can run as live backgrounds on the display. One thing to note is that you need to use this software to set up the touch display properly, so once you've built the PC and have successfully booted it up, download and install the software before you do anything else and work through the setup in Hyte Nexus.

It's through this software as well that you can build up pages of widgets, turning the Hyte Y70's front-corner panel into something with a smartphone-like interface, making it about as intuitive as it gets.

The Hyte Y70 Touch on a desk displaying its touchscreen

(Image credit: Future / John Loeffler)

When it comes to actual hardware, the case doesn't come with any fans, but it has room to install up to 10 chassis fans, as well as a 360mm long x 125mm thick radiator on the side and another 360mm long x 68mm thick radiator along the top. You have a lot of options for cooling the rig, but definitely focus on pulling air in from the bottom and back of the case; with the front and side walls being glass, you'll have to be conscientious about proper airflow.

Fortunately, as a dual-chamber case, many of the hottest components are separated to allow for easier cooling and air circulation. The biggest and most obvious way this is done is through the vertically mounted GPU thanks to an included riser. It doesn't hurt that it also shows off your GPU.

In terms of rear capacity, there are two internal drive bays that can fit a pair of 3.5-inch hard drives, or up to four 2.5-inch SATA SSDs. The PSU bay is roomy as well, making cable management easier than with a tighter mid- or full-tower case, which can jam even the best PSUs.

There's more than enough room in the front chamber, so whichever of the best graphics cards you install, the Y70's 16.6-inch GPU clearance is more than enough. It is also able to vertically mount up to a four-slot card, so if that Nvidia Titan RTX refresh ever comes along, you just might be able to fit it in here. 

This extra capacity also makes cable management a much easier problem to solve, and even though this is a very premium PC case, it's incredibly user- and newbie-friendly. You'll still have to know where and how you should install various case components like fans and lighting, but the easy-open case gives you all the room you need to work, even if you barely know what you're doing.

And while I am going to rave about this case from here to CES and beyond, it's not all lovely touchscreens and roomy interiors. If there's one complaint I have with this case, it's the price. Coming in at $359.99 / £349.99 (about AU$560), it's significantly more expensive than the Hyte Y60 or the Lian-Li O11 Vision, the latter of which comes in at just $139.

Of course, neither of those cases has a 4K touchscreen interface built in, so the price isn't unreasonable for what you're getting, but this is a premium case nonetheless, and those on a tighter budget might want to shop around for something more affordable.

Hyte Y70 Touch: Price & availability

  • How much does it cost? $359.99 / £349.99 (about AU$560)
  • When and where can you get it? Available in the US right now, with UK and Australia availability coming in December

The Hyte Y70 Touch is available in the US right now for $359.99, with a December launch planned for the UK and Australia. The UK retail price will be £349.99, and should sell for about AU$560 in Australia.

This is a roughly 80% price increase over the Hyte Y60, though that case does not include a touch display. 
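That 80% figure checks out with a quick back-of-the-envelope calculation; here's a minimal sketch in Python, assuming a Y60 launch price of $199.99 (an assumption for illustration, not a figure stated in this article):

```python
# Back-of-the-envelope check of the "roughly 80%" price increase.
# Assumption: the Hyte Y60 launched at $199.99 (not stated in this article).
y60_price = 199.99
y70_touch_price = 359.99

increase = (y70_touch_price - y60_price) / y60_price
print(f"Price increase: {increase:.0%}")
```

Under that assumed base price, the $160 premium works out to almost exactly 80%.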

Hyte Y70 Touch: Specs

The Hyte Y70 Touch on a desk displaying its touchscreen

(Image credit: Future / John Loeffler)

Should you buy the Hyte Y70 Touch?

Buy the Hyte Y70 Touch if...

You want an absolute showpiece
The Hyte Y70 Touch will draw a crowd if you let it. I know from personal experience.

You want to have a host of functions and apps at your fingertips
The 4K touchscreen on the Y70 makes it easy to pull up apps, track Twitch chat, and monitor system conditions with ease.

Don't buy it if...

You're on a budget
This is a very premium PC case, so if money's tight, forget about the touchscreen and opt for the Hyte Y60 instead.

You have limited desk space
This is a honking big PC case. If your desk looks like the aftermath of Verdun, you might want to go for something with a smaller footprint.

Hyte Y70 Touch: Also consider

If my Hyte Y70 Touch review has you looking for other options, here are two more PC cases to consider...

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed November 2023

Seagate FireCuda 540 review: showcasing the power of PCIe 5.0
5:00 pm | November 12, 2023


Seagate FireCuda 540: 90-second review

The Seagate FireCuda 540 is one of the first high-profile Gen 5 NVMe SSDs to hit the market for those who are running one of the best processors for gaming and best motherboards that are PCIe 5.0 compatible. 

It’s more than just a successor to the widely popular Seagate FireCuda 530 from a couple of years ago, though; it’s a drive that shows you exactly what the next generation is capable of, but at a cost.

The greatest cost will be to your wallet, because this flagship Gen 5 NVMe SSD carries a particularly high MSRP regardless of where you are in the world. That’s partly because current motherboard support for Gen 5 NVMe SSDs is limited to the most recent AMD CPU generation with socket AM5 and the current crop of Intel LGA 1700 options. Ultimately, you’re paying a pretty penny to be an early adopter, even before factoring in the cost of a compatible platform.

Speaking of cost, the Seagate FireCuda 540 currently retails for $189.99 (around £150 / AU$300) for 1TB and $319.99 (approximately £260 / AU$500) for 2TB, which is far from cheap.

For context, you can currently find Gen 4 alternatives such as the aforementioned FireCuda 530 and the excellent Kingston Fury Renegade SSD at a fraction of the price for around a 30% performance reduction. That’s long been the case with early adoption, however: you’re paying a premium to be on the bleeding edge, and the results do speak for themselves.

That’s because the Seagate FireCuda 540 absolutely lives up to its claims of 10,000 MB/s sequential performance with both its reads and writes when plugged into a Gen 5 compatible NVMe M.2 port. The question remains whether you need this level of sequential performance right now, or if you just want to be ready for when some of the best PC games will take advantage.

Considering just how slow the adoption of Gen 4 SSDs was in the mainstream, having launched in 2019 before being adopted in 2020 and 2021 for PC and PS5, we could be waiting a couple of years to really see software push this hardware in any meaningful way. Still, the Seagate FireCuda 540 is undoubtedly a top contender for best SSD of 2023, even if it still feels a bit ahead of its time.

A Seagate FireCuda 540 SSD slotted into a motherboard

(Image credit: Seagate)

Seagate FireCuda 540: Price & availability

  • How much does it cost? Starting at $189.99 (around £150 / AU$300)
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The Seagate FireCuda 540 is currently available in the US, the UK, and Australia with a starting price of $189.99 (around £150 / AU$300) for 1TB and $319.99 (approximately £260 / AU$500) for 2TB.

For context, that’s similar to what you can expect to pay for rival Gen 5 offerings such as the Corsair MP700, which carries an MSRP of $179.99 (about £147.99 / AU$280) for 1TB and $284.99 (around £230 / AU$430) for 2TB. Simply put, these Gen 5 drives are by no means cheap, and you are paying a premium for getting in on the ground floor.

Seagate FireCuda 540: Specs

Seagate FireCuda 540: Design & features

Things are kept simple in terms of the physical and visual design of the Seagate FireCuda 540, and that’s for the best considering it will live under a motherboard heatsink from the second it’s installed into your PC. A simple sticker covers the controller and the DRAM with the company’s logo and the name of the drive itself on one side and that’s your lot. 

More interesting is the choice of controller itself. The Seagate FireCuda 540 is running the Phison E26 controller, which is significantly faster than the already excellent Phison E18 controller featured in many of the best M.2 SSDs to hit the market over the last four years. That’s only one side of the story, though, because further adding to the lightning-fast performance here is the 232-Layer Micron TLC flash memory on board, coupled with an LPDDR4 DRAM cache for short-term reads, as many top-end drives include.

While this is a Gen 5x4 drive through and through, it’s also backwards compatible with older Gen 4x4 NVMe slots, where performance caps out at around 7,300 MB/s read and write respectively. However, if you don’t own a PCIe 5.0-ready motherboard, you’re burning money buying one of these. The rated write endurance is also solid and scales blow-for-blow with capacity: our 2TB review unit is rated at 2,000TB written, while the 1TB variant is good for 1,000TB, which is pretty decent overall.
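The blow-for-blow scaling is easy to see if you divide each endurance rating by its capacity; a quick sketch using the figures above:

```python
# TBW rating divided by capacity gives the number of full-drive writes
# each FireCuda 540 model is rated for (figures from the review above).
models = {"1TB": (1, 1000), "2TB": (2, 2000)}  # (capacity in TB, rated TBW)

for name, (capacity_tb, tbw) in models.items():
    print(f"{name} model: {tbw // capacity_tb} full-drive writes")
```

Both models work out to 1,000 full-drive writes, which is why the endurance tracks the capacity exactly.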

A Seagate FireCuda 540 SSD slotted into a motherboard

(Image credit: Seagate)

Seagate FireCuda 540: Performance

It may cost a pretty penny, but there’s no denying that the Seagate FireCuda 540 absolutely excels at the top end of what NVMe drives are physically capable of in 2023. In our industry-standard tests, such as CrystalDiskMark 8, the drive was absolutely able to hit the quoted performance caps, delivering 10,092.67 MB/s reads and 10,144.55 MB/s writes respectively, which absolutely blows even the leading Gen 4.0 models out of the water.

This is further compounded by the random 4K reads and writes, where the Seagate FireCuda 540 offered up 625.68 and 476 respectively, which is among the best I’ve personally seen from an NVMe SSD in all my years of testing. AnvilPro further highlighted the finesse of this drive with a total overall score of 30,163.68, the highest I’ve personally seen in all my reviews of NVMe hardware. Generally speaking, a top-end Gen 4 drive would output around 25,000, so that’s a good 20% increase straight out of the gate.
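That 20% figure is straightforward to verify from the scores quoted above; a quick sketch:

```python
# AnvilPro uplift: FireCuda 540's overall score vs. a ballpark
# top-end Gen 4 result of ~25,000 (both figures from the review text).
gen5_score = 30_163.68
gen4_score = 25_000

uplift = gen5_score / gen4_score - 1
print(f"Uplift over a top-end Gen 4 drive: {uplift:.1%}")
```

The ratio comes out to roughly 20.7%, so "a good 20%" is, if anything, slightly conservative.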

Furthermore, in our in-house 25GB file transfer test, the Seagate FireCuda 540 shined with a time of just 16 seconds across a total of 3,716 files including documents, videos, photos, and programs. That’s seriously fast, and goes to show the power of the Phison E26 controller in tandem with the 232-Layer Micron TLC flash memory.

Real-world file transfer times from a Gen 4 NVMe drive over to the Seagate FireCuda 540 are equally impressive. This can be evidenced with Assassin’s Creed Valhalla’s mammoth 158.78GB install transferring to the Gen 5 model in just 58 seconds. Similarly, smaller titles such as Deathloop with its 30.98GB file size made the jump in only 11 seconds, which works out to around 3GB/sec. That’s blisteringly fast, and further cements the sequential prowess of this drive in action.
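Those transfer times translate to sustained throughput as follows; a quick sketch using the sizes and times above:

```python
# Effective throughput for the two game transfers described above.
transfers = {
    "Assassin's Creed Valhalla": (158.78, 58),  # (size in GB, seconds)
    "Deathloop": (30.98, 11),
}

for game, (size_gb, seconds) in transfers.items():
    print(f"{game}: {size_gb / seconds:.2f} GB/s sustained")
```

Both transfers land in the 2.7-2.8 GB/s range, consistent with the "around 3GB/sec" figure quoted above.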

There is a caveat, though, and that’s the fact that no PC games are really optimized for the blisteringly quick sequential performance on offer right now. That isn’t to say that future releases won’t be able to take advantage, but barring a small list of exceptions, games don’t even need a Gen 4 drive to run optimally. What you’re ultimately buying is futureproofing: insurance that bigger games that benefit from a fast SSD will have the extra overhead to run flawlessly, rather than an essential purchase right now.

A Seagate FireCuda 540 SSD slotted into a PS5

(Image credit: Seagate)

Should you buy the Seagate FireCuda 540?

Buy the Seagate FireCuda 540 if...

You want a futureproofed machine
By purchasing a Seagate FireCuda 540 with a PCIe 5.0 motherboard, you’re guaranteeing your system will be able to run future titles that may need a faster drive than what’s available to most users.

You want leading sequential performance from your NVMe SSD
The Seagate FireCuda 540 is unrivaled in its sequential performance, reaching up to 10,000 MB/s read and write respectively, which few NVMe drives can boast right now.

Don’t buy the Seagate FireCuda 540 if...

You want good value for money
While incredibly impressive, the Seagate FireCuda 540 isn’t exactly an essential purchase for anyone on PC right now apart from enthusiasts. No software really needs or benefits from 10,000 MB/s right now, but that could change in the near future.

Seagate FireCuda 540: Also consider

First reviewed November 2023

Gigabyte Radeon RX 7700 XT Gaming OC review: great performance for the price
11:58 pm | October 31, 2023


Gigabyte Radeon RX 7700 XT Gaming OC: Two-minute review

The Gigabyte Radeon RX 7700 XT Gaming OC is the best version of a card that's otherwise difficult to recommend, and it goes a good way towards ameliorating the biggest issue I had with the AMD Radeon RX 7700 XT: its price.

The Gigabyte Radeon RX 7700 XT Gaming OC is available for $439.99 (about £360/AU$695), which is only $10 less than AMD's official MSRP for the RX 7700 XT, so it's not the biggest savings here, but it does make this card at least somewhat more competitively priced with the Nvidia GeForce RTX 4060 Ti, which comes in at $399.99 (about £320/AU$630).

However, it's not just a price cut off the reference MSRP from AMD that makes the Gigabyte RX 7700 XT card a good bargain. You also get some extra perks over AMD's reference specs, making it one of the best graphics card options for midrange gamers on a tighter budget.

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)

Starting with the design, you get a triple-fan design that definitely helps thermal performance, which isn't egregious on the RX 7700 XT to begin with. There is no reference card for the RX 7700 XT, mind you, but given that the AMD Radeon RX 7800 XT does have a reference card that sports a dual-fan design, you do get something over the higher-tier AMD card.

That's not nothing, and the card itself isn't so long that it can't fit inside a typical midtower PC case. The RX 7700 XT does require a good bit more power than the RTX 4060 Ti (245W to the 4060 Ti's 160W), so it needs two 8-pin power connectors to run it. On the other hand, it doesn't require a 16-pin power cable like the rest of Nvidia's reference RTX 4000-series cards.

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)

The Gigabyte card also lacks any real RGB lighting beyond the Gigabyte logo along the top edge of the card, which is either a good thing or a bad thing, depending on your perspective, but it's good to have options regardless. Non-RGB fans will appreciate the more subdued aesthetics of this GPU for sure.

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)

In terms of ports, you get the standard 2 x HDMI 2.1 and 2 x DisplayPort 2.1 outputs found on most AMD RX 7000-series cards, so you can hook it up to several of the best gaming monitors of your choosing.

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)

Performance-wise, you can read more about the individual benchmarks in my RX 7700 XT review, and for the most part, the Gigabyte RX 7700 XT Gaming OC card performs a few percentage points better than the XFX Speedster QICK319 RX 7700 XT Black card given that it has about 100MHz higher game clock and a roughly 55MHz faster boost clock.

The difference is only going to be a few fps depending on the game you're playing, but given the Gigabyte card is cheaper, you're really getting extra fps for less money, which is a fantastic deal no matter how you look at it.

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)

In the end, then, the Gigabyte Radeon RX 7700 XT Gaming OC makes a strong case for the RX 7700 XT, especially if spending north of $400 is really stretching your budget to the max. My original criticism that the RX 7700 XT is just too close in price to the AMD RX 7800 XT to make it the best 1440p graphics card to buy still applies to this card, but Gigabyte at least offers more performance than a non-OC card at a better price, making it a much more palatable purchase if you can't go for the RX 7800 XT.

Gigabyte Radeon RX 7700 XT Gaming OC: Price & availability

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)
  • How much does it cost? $439.99 (about £360/AU$695)
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The Gigabyte Radeon RX 7700 XT Gaming OC is available now for $439.99 (about £360/AU$695). This is cheaper even than the AMD reference spec's MSRP of $449.99, and offers a better value by giving you some extra performance thanks to its factory overclocking.

It also brings the price closer to the Nvidia RTX 4060 Ti's while generally outperforming it. All in all, this is still too expensive to be the best cheap graphics card on the market, but it's definitely the best cheap midrange graphics card you're going to find.

Gigabyte Radeon RX 7700 XT Gaming OC: Specs

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)

Should you buy the Gigabyte Radeon RX 7700 XT Gaming OC?

A Gigabyte Radeon RX 7700 XT Gaming OC on a desk

(Image credit: Future / John Loeffler)

Buy it if...

You want great 1440p performance on a tighter budget
This card offers great 1440p performance for the price, especially if you can't stretch your budget to the RX 7800 XT.

You want some extra overclocked performance for free
Normally, OC cards cost more than the reference card, but this one actually costs less than AMD's official MSRP.

Don't buy it if...

You can stretch your budget to get the AMD RX 7800 XT
With the AMD RX 7800 XT offering such incredible performance, if you can stretch your budget to get that card (especially the Gigabyte Radeon RX 7800 XT Gaming OC), you absolutely should.

You want better content creation performance
If you're a content creator working with 3D rendering or other GPU intensive creative workloads, chances are an Nvidia card is going to offer much better performance than anything AMD can offer.

Gigabyte Radeon RX 7700 XT Gaming OC: Also consider

How I tested the Gigabyte Radeon RX 7700 XT Gaming OC

  • I spent about three weeks with the Gigabyte Radeon RX 7700 XT Gaming OC
  • I used it to play games, produce and edit creative content, and more
  • I used our standard battery of benchmarking tools to test it

I spent about three weeks with the Gigabyte Radeon RX 7700 XT Gaming OC, running my standard suite of benchmarks as well as assessing its general performance in real-world use cases.

I paid special attention to its gaming performance, since this is specifically targeting gamers, and paid less attention to its content creation performance since non-Radeon Pro cards are generally not marketed for those purposes.

I've been a computer hardware reviewer for years now and have tested all the latest graphics cards of the past several generations as well as having nearly a decade of computer science education, so I know my way around this kind of hardware. What's more, as a lifelong gamer, I know what to expect from a graphics card at this price point in terms of gaming performance.



Read more about how we test

First reviewed October 2023

Intel Core i9-14900K review: more of a Raptor Lake overclock than a refresh
4:00 pm | October 17, 2023


Intel Core i9-14900K: Two-minute review

The Intel Core i9-14900K is a hard chip to justify, which is a weird thing to say about a processor that is arguably the best Intel has ever put out.

With very little fanfare to herald its arrival following the announcement of Intel Meteor Lake at Intel Innovation in September 2023 (and confirmation that Intel Meteor Lake is coming to desktop in 2024), Intel's 14th-generation flagship processor cannot help but draw parallels to the 11th-gen Rocket Lake chips that immediately preceded Intel Alder Lake.

The Core i9-11900K was something of a placeholder in the market until Intel could launch Alder Lake at the end of 2021. Those processors featured a new hybrid architecture and a more advanced 10nm process that helped propel Intel back to the top of our best processor list, despite strong competition from AMD.

With Intel Raptor Lake Refresh, we're back in placeholder territory, unfortunately. The performance gains here are all but non-existent, so we are essentially waiting on Meteor Lake while the i9-14900K absolutely guzzles electricity and runs hot enough to boil water under just about any serious workload with very little extra performance over the Intel Core i9-13900K to justify the upgrade.


It's not that the Core i9-14900K isn't a great processor; again, it's unquestionably the best Intel processor for the consumer market in terms of performance. It beats every other chip I tested in most categories with the exception of some multitasking workflows and average gaming performance, both of which it comes in as a very close runner-up. On top of that, at $589, it's the same price as the current Intel flagship, the Intel Core i9-13900K (assuming the i9-14900K matches the i9-13900K's £699 / AU$929 sale price in the UK and Australia).

The problem for the Core i9-14900K is two-fold: you can still get the i9-13900K, and will be able to for a long while yet at a lower price, and the Intel Core i7-14700K offers performance so close to the 14th-gen flagship at a much lower price that the 14900K looks largely unnecessary by comparison. Essentially, if you've got an i7-13700K or i9-13900K, there's simply nothing for you here.

If you're on an 11th-gen chip or older, or you've got an AMD Ryzen processor and you're looking to switch, this chip will be the last one to use the LGA 1700 socket, so when Meteor Lake-S comes out in 2024 (or even Lunar Lake-S, due out at the end of 2024 or early 2025), you won't be able to upgrade to that processor on an LGA 1700 motherboard. In other words, upgrading to LGA 1700 for this chip is strictly a one-shot deal.

The only people who might find this chip worth upgrading to are those currently using a 12th-gen processor who skipped the 13th gen entirely, or someone using a 13th-gen Core i5 who wants that extra bit of performance and doesn't mind dropping $589 on a chip they might be upgrading from again in a year's time, which isn't going to be a whole lot of people.

Unfortunately, at this price, it'll be better to save your money and wait for Meteor Lake or even Lunar Lake to drop next year and put the $589 you'd spend on this chip towards the new motherboard and CPU cooler you'll need once those chips are launched.

An Intel Core i9-14900K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Price & availability

  • How much does it cost? US MSRP $589 (about £470/AU$855)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i9-14900K is available as of October 17, 2023, for a US MSRP of $589 (about £470/AU$855), which is the same as the Intel Core i9-13900K it is replacing. We don't have confirmation on UK and Australia pricing yet, though I've asked Intel for clarification and will update this review if and when I hear back from the company. If the 14900K keeps the same UK and Australia pricing as the Core i9-13900K, however, it'll sell for £699/AU$929 in the UK and Australia respectively.

Meanwhile, this is still cheaper than most of AMD's rival chips in this tier, the AMD Ryzen 9 7950X3D, AMD Ryzen 9 7950X, and AMD Ryzen 9 7900X3D, with only the AMD Ryzen 9 7900X coming in cheaper than the i9-14900K. 

This does make the Core i9-14900K the better value against these chips, especially given the level of performance on offer, but it's ultimately too close to the 13900K performance-wise to make this price meaningful, as a cheaper 13900K will offer an even better value against AMD's Ryzen 9 lineup.

  • Price score: 3 / 5

A masculine hand holding an Intel Core i9-14900K

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Specs & features

  • Faster clock speeds than i9-13900K
  • Some additional AI-related features

The Intel Core i9-14900K is the final flagship using Intel's current architecture, so it makes sense that there is very little in the way of innovation over the Intel Core i9-13900K.

Using the same 10nm Intel 7 process node as its predecessor and with the same number of processor cores (8 P-cores/16 E-cores), threads (32), and cache (32MB total L2 cache plus an additional 36MB of L3 cache), the only real improvement with the 14900K in terms of specs is its faster clock speeds.

All cores get a 0.2GHz increase to their base frequencies, while the P-core turbo boost clock increases to 5.6GHz and the E-core turbo clock bumps up to 4.4GHz from the 13900K's 5.4GHz P-Core turbo clock and 4.3GHz E-core turbo clock.

While those clock speeds are the official max turbo clocks for the two types of cores, the Core i9-14900K and Intel Core i7-14700K have something called Turbo Boost Max Technology 3.0, which increases the frequency of the best-performing core in the chip and gives it even more power within the power and thermal limits. That gets the Core i9-14900K up to 5.8GHz turbo clock on specific P-cores while active.

Additionally, an exclusive feature of the Core i9 is an additional Ludicrous-Speed-style boost called Intel Thermal Velocity Boost. This activates if there is still power and thermal headroom on a P-core that is already being boosted by the Turbo Boost Max Technology 3.0, and this can push the core as high as 6.0GHz, though these aren't typical operating conditions.

Both of these technologies are present in the 13900K as well, but the 14900K bumps up the maximum clock speeds of these modes slightly, and according to Intel, that 6.0GHz clock speed makes this the world's fastest processor. While that might technically be true, that 6.0GHz boost is very narrowly applied, so in practical terms, the P-core boost clock is what you're going to see almost exclusively under load.

The Core i9-14900K has the same 125W TDP as the 13900K and the same 253W maximum turbo power as well, though power draw in bursts of less than 10ms can go far higher.

If this reads like a Redditor posting about their successful overclocking setup, then you pretty much get what this chip is about. If you're looking for something innovative about this chip, I'll say it again, you're going to have to wait for Meteor Lake.

The Core i9-14900K also has support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as does the rest of the 14th-gen lineup, as well as support for discrete Thunderbolt 5, both of which are still a long way down the road.

The only other thing to note is that there have been some AI-related inclusions that are going to be very specific to AI workloads that almost no one outside of industry and academia is going to be running. If you're hoping for AI-driven innovations for everyday consumers, let's say it once more, with feeling: You're going to have to wait for—

  • Chipset & features score: 3.5 / 5

An Intel Core i9-14900K slotted into a motherboard

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Performance

  • Best-in-class performance, but only by a hair
  • Gets beat by AMD Ryzen 7 7800X3D and i7-14700K in gaming performance
  • Runs even hotter than the i9-13900K

Take any elite athlete who's used to setting records in their sport: sometimes they break their previous record by a lot, and sometimes it's by milliseconds or fractions of an inch. It's less sexy, but it still counts, and that's really what we get here with the Intel Core i9-14900K.

On pretty much every test I ran on it, the Core i9-14900K edged out its predecessor by single digits, percentage-wise, which is a small enough difference that a background application can fart and cause just enough of a dip in performance that the 14900K ends up losing to the 13900K. 

I ran these tests more times than I can count because I had to be sure that something wasn't secretly messing up my results, and they are what they are. The Core i9-14900K does indeed come out on top, but it really is a game of inches at this point.

Synthetic benchmark results for the Intel Core i9-14900K (gallery of 13 images)

(Image credit: Future / Infogram)

Across all synthetic performance and productivity benchmarks, the Core i9-14900K comes out on top, with the notable exception of Geekbench 6.1's multi-core performance test, where the AMD Ryzen 9 7950X scores substantially higher, and the Passmark Performance Test's overall CPU score, which puts the AMD Ryzen 9 7950X and Ryzen 9 7950X3D significantly higher. Given that all 16 cores of the 7950X and 7950X3D are full-throttle performance cores, this result isn't surprising.

Other than that though, it's the 14900K all the way, with a 5.6% higher geometric average on single-core performance than the 13900K. For multi-core performance, the 14900K scores a 3.1% better geometric average, and in productivity workloads, it scores a 5.3% better geometric average than its predecessor.

Against the AMD Ryzen 9 7950X, the Core i9-14900K scores about 13% higher in single-core performance, about 1% lower in multi-core performance, and 5% better in productivity performance.
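For transparency, figures like these are geometric means of per-benchmark ratios rather than simple averages, so no single outlier test can dominate the result. A minimal sketch of the computation (the scores below are made-up placeholders, not this review's data):

```python
import math

def geomean_uplift(new_scores, old_scores):
    """Geometric mean of per-benchmark ratios, as a percent uplift.

    The geometric mean is used so one outlier benchmark can't
    skew the average the way an arithmetic mean would.
    """
    ratios = [n / o for n, o in zip(new_scores, old_scores)]
    gm = math.prod(ratios) ** (1 / len(ratios))
    return (gm - 1) * 100

# Hypothetical single-core scores for three benchmarks:
print(round(geomean_uplift([3200, 2210, 4450], [3050, 2100, 4200]), 1))  # prints 5.4
```

A chip that wins one test by 40% but ties everywhere else ends up with a far smaller average uplift than the 40% headline would suggest, which is exactly the behavior you want when summarizing a benchmark suite.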

Creative benchmark results for the Intel Core i9-14900K (gallery of 7 images)

(Image credit: Future / Infogram)

Creative benchmarks reveal something of a mixed bag for the Core i9-14900K. In all cases, it beats its predecessor by anywhere from 2.6% to as much as 10.9%. Against the AMD Ryzen 9 7950X and 7950X3D, the Core i9-14900K consistently loses out when it comes to rendering workloads like Blender and V-Ray 5, but beats the two best AMD processors by just as much in photo and video editing. And since 3D rendering almost always leans heavily on the GPU rather than the CPU, AMD's advantage here is somewhat muted in practice.

Gaming benchmarks for Intel 14th gen processors (gallery of 6 images)

(Image credit: Future / Infogram)

Gaming is another area where Intel has traditionally done well thanks to its strong single-core performance over AMD, but all that flipped with the introduction of AMD's 3D V-Cache.

While the Intel Core i9-14900K barely moves the needle from its predecessor's performance, it really doesn't matter, since the AMD Ryzen 7 7800X3D manages to ultimately score an overall victory and it's not very close. The Core i9-14900K actually manages a tie for fourth place with the Intel Core i7-13700K, with the Core i7-14700K edging it out by about 4 fps on average.

Final benchmark results for the Intel Core i9-14900K (gallery of 2 images)

(Image credit: Future / Infogram)

Of course, all this performance requires power, and lots of it. The Core i9-14900K pretty much matched the maximum recorded power draw of the Core i9-13900K, with less than a watt's difference between the two: 351.097W versus 351.933W, respectively.

The Core i9-14900K still managed to find a way to run hotter than its predecessor, however; something I didn't really think was possible. But there it is, the 14900K maxing out at 105ºC, three degrees hotter than the 13900K's max. It's the hottest I've ever seen a CPU run, and I'm genuinely shocked it was allowed to run so far past its official thermal limit without any overclocking on my part.

  • Performance: 3.5 / 5

A masculine hand holding an Intel Core i9-14900K

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Verdict

  • The best chip for dedicated performance like video editing and productivity
  • There are better gaming processors out there for cheaper
  • The Intel Core i7-14700K offers a far better value
Final benchmark results for the Intel Core i9-14900K (gallery of 7 images)

(Image credit: Future / Infogram)

In the final assessment then, the Core i9-14900K does manage to win the day, topping the leaderboard by enough of a margin to be a clear winner, but close enough that it isn't the cleanest of wins. 

Overall, its single-core and productivity performance are its best categories, slightly faltering in creative workloads, and coming up short enough on gaming that it's not the chip I would recommend as a gaming CPU.

Like all Core i9s before it, the 14900K is the worst value of Intel's 14th-gen launch lineup, but it's better than its predecessor for the time being (though that advantage won't last very long at all). It also manages to be a better value proposition than the Ryzen 9 7950X and Ryzen 9 7950X3D, while matching the Ryzen 7 7800X3D, so all in all, not too bad for an enthusiast chip.

Still, the Intel Core i7-14700K is right there, and its superior balance of price and performance makes the Intel Core i9-14900K a harder chip to recommend than it should be.

Should you buy the Intel Core i9-14900K?

Buy the Intel Core i9-14900K if...

Don't buy it if...

Also Consider

If my Intel Core i9-14900K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i9-14900K

  • I spent nearly two weeks testing the Intel Core i9-14900K
  • I ran comparable benchmarks between this chip and rival flagship processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: ASRock X670E Steel Legend
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i9-14900K and its competition, using it mostly for productivity and content creation, with some gaming thrown in as well.

I used the standard battery of synthetic benchmarks I use for processor testing, and ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained; regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Core i7-14700K review: salvaging Raptor Lake Refresh with i9-13900K performance
4:00 pm |


Intel Core i7-14700K: One-minute review

The Intel Core i7-14700K is the workhorse CPU of Intel's 14th-generation launch lineup, and like any good workhorse, it's going to be the one doing the heavy lifting for this generation of processors. Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.

Of all the chips launched on October 17, 2023, the Core i7-14700K is the only one to get a substantive spec upgrade over its predecessor as well as a slight cut in price to just $409 (about £325/AU$595), which is $10 less than the Intel Core i7-13700K it replaces.

So what do you get for $10 less? Gen-on-gen, you don't get a whole lot of improvement (about 6% better performance overall compared to the 13700K), but that figure can be deceiving, since the Core i7-13700K was at the top of our best processor list for a reason. 

With the 13700K's performance being within striking distance of the Intel Core i9-13900K, that 6% improvement for the 14700K effectively closes the gap, putting the 14700K within just 3% of the 13900K overall, and even allowing it to pull ahead in average gaming performance, losing out to only the AMD Ryzen 7 7800X3D.

Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.

In terms of productivity and general performance, the Core i7-14700K shines as well, going toe to toe with the best AMD processors like the AMD Ryzen 9 7950X and AMD Ryzen 9 7950X3D, giving it a very strong claim to being the best Intel processor for most people.

Given its excellent mix of performance and price, the Intel Core i7-14700K could very well be the last Intel chip of the LGA 1700 epoch that anyone should consider buying, especially if you're coming from a 12th-gen chip. 

With the Core i9-13900K outperforming the Intel Core i9-12900K by as much as 25% in some workloads, someone coming off an i9-12900K or lower will find it hard to believe that an i7 could perform this well, but that's where we're at. And with the i7-14700K coming in about 30% cheaper than the Intel Core i9-14900K, while still managing to come remarkably close in terms of its performance, the Intel Core i7-14700K is the Raptor Lake Refresh chip to buy if you're going to buy one at all.

An Intel Core i7-14700K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Price & availability

  • How much does it cost? US MSRP $409 (about £325/AU$595)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i7-14700K is available on October 17, 2023, with a US MSRP of $409 (about £325/AU$595), which is a slight decrease from its predecessor's MSRP of $419 (about £335/AU$610), and about 31% lower than the Intel Core i9-14900K and 32% lower than the AMD Ryzen 9 7950X.

It's also cheaper than the AMD Ryzen 7 7800X3D, and just $10 more expensive than the AMD Ryzen 7 7700X, making it very competitively priced against processors in its class.

The comparisons against the Core i9 and Ryzen 9 are far more relevant, however, since these are the chips that the Core i7-14700K is competing against in terms of performance, and in that regard, the Intel Core i7-14700K is arguably the best value among consumer processors currently on the market.

  • Price score: 4 / 5

Intel Core i7-14700K: Specs & features

  • Four additional E-Cores
  • Slightly faster clock speeds
  • Increased Cache
  • Discrete Wi-Fi 7 and Thunderbolt 5 support

The Intel Core i7-14700K is the only processor from Intel's Raptor Lake Refresh launch line-up to get a meaningful spec upgrade.

Rather than the eight performance and eight efficiency cores of the i7-13700K, the i7-14700K comes with eight performance cores and 12 efficiency cores, all running with a slightly higher turbo boost clock for extra performance. The i7-14700K also features Turbo Boost Max Technology 3.0, which is a mouthful, but it gives the best-performing P-core an extra bump up to 5.6GHz so long as the processor is within power and thermal limits.

The increased core count also adds 7MB of additional L2 cache for the efficiency cores to use, further improving their performance over the 13700K's, as well as four additional processing threads for improved multitasking.

It has the same TDP of 125W and same Max Turbo Power rating of 253W as the 13700K, with the latter being the upper power limit of sustained (greater than one second) power draw for the processor. This ceiling can be breached, however, and processing cores can draw much more power in bursts as long as 10ms when necessary.
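That two-tier behavior can be sketched as a toy limiter: sustained draw is capped at the 253W Maximum Turbo Power, while sub-10ms spikes pass through. This is purely an illustration of the published limits, not Intel's actual power-management algorithm:

```python
def allowed_power_w(requested_w: float, duration_ms: float) -> float:
    """Toy model of the 14700K's published power limits.

    The 253W Maximum Turbo Power caps sustained (>1s) draw, while
    bursts shorter than roughly 10ms may exceed it. The 125W TDP is
    the base rating, not a hard cap here. Purely illustrative.
    """
    MAX_TURBO_POWER_W = 253.0
    if duration_ms < 10:
        return requested_w  # sub-10ms spikes can exceed the cap
    return min(requested_w, MAX_TURBO_POWER_W)
```

So a 320W transient lasting a few milliseconds sails through, but the same demand held for seconds gets clamped back to 253W, which is the behavior you see as turbo throttling under sustained load.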

There is also support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as well as discrete Thunderbolt 5 wired connections, so there is a decent bit of future proofing in its specs.

  • Chipset & features score: 4 / 5

An Intel Core i7-14700K slotted into a motherboard

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Performance

  • Outstanding performance on par with the i9-13900K
  • Best gaming performance of any Intel processor
  • More power hungry than its predecessor, so it also runs hotter

The Intel Core i7-14700K is arguably the best-performing midrange processor on the market, coming within striking distance of the Core i9-13900K and Ryzen 9 7950X across most workloads, including very strong multi-core performance thanks to the addition of four extra efficiency cores.

Synthetic benchmark results for the Intel Core i7-14700K (gallery of 13 images)

(Image credit: Future / Infogram)

The strongest synthetic benchmarks for the 14700K are single-core workloads, where it is effectively level with the Core i9-13900K and often beats the Ryzen 9 7950X and 7950X3D chips handily.

This translates into better dedicated performance rather than multitasking, but even there the Core i7-14700K does an admirable job keeping pace with chips with much higher core counts.

Creative benchmarks for the Intel Core i7-14700K (gallery of 7 images)

(Image credit: Future / Infogram)

In creative workloads, the 14700K also performs exceptionally well, beating out the 13900K on everything except 3D model rendering, a task rarely handed to a CPU anyway, since even the best cheap graphics cards can process Blender or V-Ray 5 workloads many times faster than even the best CPU can.

Gaming benchmarks for Intel 14th gen processors (gallery of 6 images)

(Image credit: Future / Infogram)

In gaming performance, the Core i7-14700K scores a bit of an upset over its launch sibling, the i9-14900K, besting it in gaming performance overall, though it has to be said that it got some help from a ridiculously high average fps in Total War: Warhammer III's battle benchmark. In most cases, the i7-14700K came up short of the 13900K and 14900K, but not by much.

And while it might be tempting to write off Total War: Warhammer III as an outlier, one of the biggest issues with the Core i9s post-Alder Lake is that they are energy hogs and throttle under load quickly, pretty much by design.

In games like Total War: Warhammer III where there are a lot of tiny moving parts to keep track of, higher clock speeds don't necessarily help. When turbo clocks kick into high gear and cause throttling, the back-and-forth between throttled and not-throttled can be worse over the course of the benchmark than the cooler but consistent Core i7s, which don't have to constantly ramp up and ramp down. 

So the 14700K isn't as much of an outlier as it looks, especially since the 13700K also excels at Total War: Warhammer III, and it too beats the two Core i9s. Total War: Warhammer III isn't the only game like this, and so there are going to be many instances where the cooler-headed 14700K steadily gets the work done while the hot-headed i9-13900K and 14900K sprint repeatedly, only to effectively tire themselves out for a bit before kicking back up to high gear.

Final benchmark results for the Intel Core i9-14900K (gallery of 2 images)

(Image credit: Future / Infogram)

The additional efficiency cores might not draw as much power as the performance cores, but the additional power draw is still noticeable. The 14700K pulls down nearly 30W more than the 13700K, though it is still a far cry from the Core i9-13900K's power usage.

This additional power also means that the Core i7-14700K runs much hotter than its predecessor, maxing out at 100ºC, triggering the CPU to throttle on occasion. This is something that the i7-13700K didn't experience during my testing at all, so you'll need to make sure your cooling solution is up to the task here.

  • Performance: 4.5 / 5

An Intel Core i7-14700K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Verdict

  • Fantastic single-core performance
  • Intel's best gaming processor, and second overall behind the Ryzen 7 7800X3D
  • Best value of any midrange processor
Final benchmark results for the Intel Core i7-14700K (gallery of 7 images)

(Image credit: Future / Infogram)

Ultimately, the Intel Core i7-14700K is the best processor in the Raptor Lake Refresh line-up, offering very competitive performance for a better price than its predecessor and far better one than comparable chips one tier higher in the stack.

It's not without fault, though. It's not that much better than the i7-13700K, so everything I'm saying about the i7-14700K might reasonably apply to its predecessor as well. And honestly, the i7-14700K doesn't have too high a bar to clear to stand out from its launch siblings, so its performance might only look as good as it does in comparison to the i9 and i5 standing behind it.

But the numbers don't lie, and the Intel Core i7-14700K displays flashes of brilliance that set it apart from its predecessor and vault it into competition with the top tier of CPUs, and that's quite an achievement independent of how the rest of Raptor Lake Refresh fares.

A masculine hand holding an Intel Core i7-14700K

(Image credit: Future / John Loeffler)

Should you buy the Intel Core i7-14700K?

Buy the Intel Core i7-14700K if...

Don't buy it if...

Also Consider

If my Intel Core i7-14700K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i7-14700K

  • I spent nearly two weeks testing the Intel Core i7-14700K
  • I ran comparable benchmarks between this chip and rival midrange processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: ASRock X670E Steel Legend
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i7-14700K and its competition, primarily for productivity work, gaming, and content creation.

I used a standard battery of synthetic benchmarks that tested the chip's single-core, multi-core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops.

I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch line-up and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained; regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Core i5-14600K review: wait for Meteor Lake
4:00 pm |


Intel Core i5-14600K: Two-minute review

The Intel Core i5-14600K is not the kind of processor you're really going to want to upgrade to, despite technically offering the best value of any processor I've tested.

First, the good. This is one of the best processor values you're going to find on the market, no matter what happens with the price of its predecessor. Currently, it has the best performance for its $319 price tag (about £255/AU$465), and AMD's competing Ryzen 5 7600X isn't all that close. If you're looking to get the most bang for your buck today, then the Intel Core i5-14600K is it.

In terms of performance, this isn't a bad chip at all; I'd even say it's a great one if you take its predecessor out of the running, which will inevitably happen as its last remaining stock gets bought up. It doesn't have the performance of the Intel Core i7-14700K, but that's a workhorse chip, not the kind that's meant to power the best computers for the home or the best budget gaming PCs as these chips start making their way into prebuilt systems in the next couple of months.

For a family computer or one that's just meant for general, everyday use, this chip is more than capable of handling whatever you'll need it for. It can even handle gaming fairly well thanks to its strong single-core performance. So, on paper at least, the Core i5-14600K is the best Intel processor for the mainstream user as far as performance goes.

The real problem with the i5-14600K is that its performance is tragically close to the Core i5-13600K's. And even though the MSRP of the Intel Core i5-13600K is technically higher than that of the Core i5-14600K, it's not going to remain that way for very long at all.

The real problem with the i5-14600K, and one that effectively sinks any reason to buy it, is that its performance is tragically close to the Core i5-13600K's.

As long as the i5-13600K is on sale, it will be the better value, and you really won't even notice a difference between the two chips in terms of day-to-day performance.

That's because there's no difference between the specs of the 14600K and the 13600K, other than a slightly faster turbo clock speed for the 14600K's six performance cores.

While this does translate into some increased performance, it comes at the cost of higher power draw and temperature. During testing, this chip hit a maximum temperature of 101ºC, which is frankly astounding for an i5. And I was using one of the best CPU coolers around, the MSI MAG Coreliquid E360 AIO, which should be more than enough to keep the temperature in check to prevent throttling.

Synthetic benchmark results for the Intel Core i5-14600K (gallery of 13 images)

(Image credit: Future / Infogram)

Looking at the chip's actual performance, the Core i5-14600K beats the AMD Ryzen 5 7600X and the Intel Core i5-13600K in single-core performance, multi-core performance, and productivity workloads, on average. Other than its roughly 44% better average multi-core performance against the Ryzen 5 7600X, the Core i5-14600K is within 3% to 4% of its competing chips.

Creative benchmark results for the Intel Core i5-14600K (gallery of 7 images)

(Image credit: Future / Infogram)

In creative workloads, the Core i5-14600K again manages to outperform the Ryzen 5 7600X by about 31% on average, but it's just 2.4% better than its predecessor, and none of these chips are especially great at creative content work. If you're messing around with family albums or cutting up TikTok videos, any one of these chips could do that fairly easily. For heavier-duty workloads like video encoding and 3D rendering, the Intel chips hold up better than the mainstream Ryzen 5, but these chips really aren't practical for that purpose.

Gaming benchmarks for Intel 14th gen processors (Image credit: Future / Infogram)

On the gaming front, it's more of the same, though now at least the Ryzen 5 7600X is back in the mix. Overall, the Core i5-14600K beats its 13th-gen predecessor and AMD's rival chip by about 2.1% and 3.2% respectively.

Final benchmark results for the Intel Core i9-14900K (Image credit: Future / Infogram)

All of this comes at the cost of higher power draw and hotter CPU temperatures, though, which is hard to justify for so little in return. What you really have here is an overclocked Core i5-13600K, and you can do that overclocking yourself and save some money by buying the 13600K when it goes on sale, which it will.

An Intel Core i5-14600K against its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i5-14600K: Price & availability

  • How much does it cost? US MSRP $319 (about £255/AU$465)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i5-14600K is available in the US, UK, and Australia as of October 17, 2023, for an MSRP of $319 (about £255/AU$465). 

This is a slight $10 price drop from its predecessor, which is always a good thing, and it comes in about $20 (about £15/AU$30) more than the AMD Ryzen 5 7600X, so it's fairly middle of the pack price-wise.

In terms of actual value, as it goes to market this chip has the highest performance for its price of any chip in any product tier, but only by a thin margin, and one that is sure to vanish once the price of the Core i5-13600K drops by even a modest amount.
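To make that value claim concrete, here's a quick performance-per-dollar sketch. The MSRPs come from this review; the normalized "score" values and the sale price are hypothetical placeholders, since the point is how quickly a modest discount flips the ranking.

```python
# Price-to-performance sketch. Prices are the MSRPs quoted in this review;
# "score" is a hypothetical normalized overall benchmark score.
chips = {
    "Core i5-14600K": {"price": 319, "score": 100.0},
    "Core i5-13600K": {"price": 329, "score": 97.0},
    "Ryzen 5 7600X": {"price": 299, "score": 90.0},
}
value = {name: c["score"] / c["price"] for name, c in chips.items()}

# At MSRP the 14600K leads on value, but an assumed modest sale price on
# the 13600K is already enough to flip the ranking.
value["Core i5-13600K (sale, $290)"] = 97.0 / 290
best = max(value, key=value.get)
```

With these placeholder scores, `best` is the discounted 13600K, which is exactly the scenario the paragraph above warns about.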

Intel Core i5-14600K: Specs

Intel Core i5-14600K: Verdict

  • Best performance for the price of any chip tested...
  • ...but any price drop in the Core i5-13600K will put the 14600K in second place
  • Not really worth upgrading to with the Core i7-14700K costing just $90 more
Final benchmark results for the Intel Core i5-14600K (Image credit: Future / Infogram)

Ultimately, the market served by this chip specifically is incredibly narrow, and like the rest of the Raptor Lake Refresh line-up, this is the last hurrah for the Intel LGA 1700 socket.

That means if you go out and buy a motherboard and CPU cooler just for the 14th generation, it's a one-time purchase, since another generation on this platform isn't coming. That rarely makes sense, so if you're upgrading from anything earlier than 12th-gen, it's far wiser to wait the several months for Meteor Lake to land and possibly get something really innovative.

If you're on a 12th-gen chip and you can't wait for Meteor Lake next year, the smartest move is to buy the i7-14700K instead, which at least gives you i9-13900K-levels of performance for just $90 more than the i5-14600K.

In the end, this chip is best reserved for prebuilt systems like the best all-in-one computers at retailers like Best Buy, where you'll use the computer for a reasonable stretch of time and then, once it becomes obsolete, buy a new machine rather than attempt to upgrade the one you've got.

In that case, buying a prebuilt PC with an Intel Core i5-14600K makes sense, and for that purpose, this will be a great processor. But if you're looking to swap out another Intel LGA 1700 chip for this one, there are much better options out there.

Should you buy the Intel Core i5-14600K?

Buy the Intel Core i5-14600K if...

Don't buy it if...

Also Consider

If my Intel Core i5-14600K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i5-14600K

  • I spent nearly two weeks testing the Intel Core i5-14600K
  • I ran comparable benchmarks between this chip and rival midrange processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: Gigabyte Aorus X670E Extreme
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i5-14600K and its competition, primarily for productivity work, gaming, and content creation.

I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops. 

I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Arc A770 review: a great 1440p graphics card for those on a budget
4:00 pm | October 16, 2023

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: | Comments: Off

Intel Arc A770: One-minute review

The Intel Arc A770 has had quite a journey since its release back on October 12, 2022, and fortunately, it has been a positive one for Intel despite a somewhat rocky start.

Right out the gate, I'll say that if you are looking for one of the best cheap graphics cards for 1440p gaming, this card definitely needs to be on your list. It offers great 1440p performance for most modern PC titles that most of us are going to be playing and it's priced very competitively against its rivals. 

Where the card falters, much like with my Intel Arc A750 review earlier this year, is with older DirectX 9 and DirectX 10 titles, and this really does hurt its overall score in the end. Which is a shame, since for games released in the last five or six years, this card is going to surprise a lot of people who might have written it off even six months ago.

Intel's discrete graphics unit has been working overtime on its driver for this card, providing regular updates that continue to improve performance across the board, though some games benefit more than others. 

Naturally, a lot of emphasis is going to be put on more recently released titles. And even though Intel has also been paying attention to shoring up support for older games as well, if you're someone with an extensive back catalog of DX9 and DX10 titles from the mid-2000s that you regularly return to, then this is not the best graphics card for your needs. Nvidia and AMD drivers carry a long legacy of support for older titles that Intel will honestly never be able to match.

But if what you're looking for is the best 1440p graphics card to play the best PC games of the modern era but you're not about to plop down half a grand on a new GPU, then the Intel Arc A770 is going to be a very solid pick with a lot more to offer than many will probably realize.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Price & availability

  • How much is it? US MSRP for 16GB card: $349 (about £280/AU$510); for 8GB card: $329 (about £265/AU$475)
  • When was it released? It went on sale on October 12, 2022
  • Where can you buy it? Available in the US, UK, and Australia

The Intel Arc A770 is available now in the US, UK, and Australia, with two variants: one with 16GB GDDR6 VRAM and an official US MSRP of $349 (about £280/AU$510), and one with 8GB GDDR6 VRAM and an official MSRP of $329 (about £265/AU$475).

Those are the launch MSRPs from October 2022, of course, and the cards have come down considerably in price in the year since their release; you can get either card for about 20% to 25% less than that. This is important, since the Nvidia GeForce RTX 4060 and AMD Radeon RX 7600 are very close to the 16GB Arc A770 in terms of current prices, and they offer distinct advantages that will tempt potential buyers away from Intel's card.

But those decisions are not as cut and dried as you might think, and Intel's Arc A770 holds up very well against modern midrange offerings, despite really being a last-gen card. And, currently, the 16GB variant is the only 1440p card you're going to find at this price, even among Nvidia and AMD's last-gen offerings like the RTX 3060 Ti and AMD Radeon RX 6750 XT. So for 1440p gamers on a very tight budget, this card fills a vital niche, and it's really the only card that does.

  • Price score: 4/5


Intel Arc A770: Design

  • Intel's Limited Edition reference card is gorgeous
  • Will fit most gaming PC cases easily
Intel Arc A770 Limited Edition Design Specs

Slot size: Dual slot
Length: 11.02 inches | 280mm
Height: 4.53 inches | 115mm
Cooling: Dual fan
Power Connection: 1 x 8-pin and 1 x 6-pin
Video outputs: 3 x DisplayPort 2.0, 1 x HDMI 2.1

The Intel Arc A770 Limited Edition that I'm reviewing is Intel's reference model that is no longer being manufactured, but you can still find some stock online (though at what price is a whole other question). 

Third-party partners include ASRock, Sparkle, and Gunnir. Interestingly, Acer also makes its own version of the A770 (the Acer Predator BiFrost Arc A770), the first time the company has dipped its toe into the discrete graphics card market.

All of these cards will obviously differ in terms of their shrouds, cooling solutions, and overall size, but as far as Intel's Limited Edition card goes, it's one of my favorite graphics cards ever in terms of aesthetics. If it were still easily available, I'd give this design five out of five, hands down, but most purchasers will have to opt for third-party cards which aren't nearly as good-looking, as far as I'm concerned, so I have to dock a point for that.

It's hard to convey from just the photos of the card, but the black finish on the plastic shroud of the card has a lovely textured feel to it. It's not quite velvety, but you know it's different the second you touch it, and it's something that really stands out from every other card I've reviewed.


The silver trim and the more subtle RGB lighting against a matte black shroud and fans bring a bit of class compared to the RGB-drenched graphics cards I typically see. The twin fans aren't especially loud (no more so than other dual-fan cards, at least), and the card feels thinner than most similar cards I've reviewed and used, whether or not it actually is.

The power connector is an 8-pin and 6-pin combo, so you'll have a pair of cables dangling from the card which may or may not affect the aesthetic of your case, but at least you won't need to worry about a 12VHPWR or 12-pin adapter like you do with Nvidia's RTX 4000-series and 3000-series cards.

You're also getting three DisplayPort 2.0 outputs and an HDMI 2.1 output, which puts it in the same camp as Nvidia's recent GPUs, but it can't match AMD's recent move to DisplayPort 2.1, which enables faster 8K video output. As it stands, the Intel Arc A770 is limited to 8K@60Hz, just like Nvidia. Will you be doing much 8K gaming on a 16GB card? Absolutely not. It'd be nice to have an 8K desktop running at 165Hz as more 8K monitors arrive next year, but that's a very speculative prospect at this point, so it's probably not something anyone looking at the Arc A770 needs to be concerned about.

  • Design Score: 4 / 5


Intel Arc A770: Specs & features

  • Good hardware AI cores for better XeSS upscaling
  • Fast memory for better 1440p performance

Intel's Xe HPG architecture inside the Arc A770 introduces a whole new way to arrange the various co-processors that make up a GPU, adding a third, not easily comparable set of specs to the already head-scratching differences between Nvidia and AMD architectures.

Intel breaks its architecture up into "render slices," each containing four Xe cores; each Xe core in turn packs 128 shaders, a ray tracing unit, and 16 matrix processors (the part directly comparable to Nvidia's vaunted tensor cores, at least), which handle graphics upscaling and machine learning workloads. Both the 8GB and 16GB versions of the A770 contain eight render slices, for a total of 4,096 shaders, 32 ray tracing units, and 512 matrix processors.

The ACM-G10 GPU in the A770 runs at 2,100MHz base frequency with a 2,400MHz boost frequency, with a slightly faster memory clock speed (2,184MHz) for the 16GB variant than the 8GB variant's 2,000MHz. This leads to an effective memory speed of 16 Gbps for the 8GB card and 17.5 Gbps for the 16GB.

With a 256-bit memory bus, this gives the Arc A770 a much wider lane for high-resolution textures to be processed through, reducing bottlenecks and enabling faster performance when gaming at 1440p and higher resolutions thanks to a 512 GB/s and 559.9 GB/s memory bandwidth for the 8GB and 16GB cards, respectively.
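All of the spec figures above follow from simple arithmetic, sketched here using the numbers quoted in this review. (The 559.9 GB/s figure Intel quotes comes from the unrounded ~17.47 Gbps effective rate; the rounded 17.5 Gbps gives 560.)

```python
# Xe HPG totals: 8 render slices x 4 Xe cores, with 128 shaders,
# 1 ray tracing unit, and 16 matrix processors per Xe core.
xe_cores = 8 * 4
shaders = xe_cores * 128        # 4096
rt_units = xe_cores * 1         # 32
matrix_units = xe_cores * 16    # 512

def gddr6_bandwidth_gbs(effective_gbps: float, bus_bits: int = 256) -> float:
    """Peak bandwidth in GB/s: per-pin data rate times bus width in bytes."""
    return effective_gbps * bus_bits / 8

bw_8gb = gddr6_bandwidth_gbs(16.0)    # 512.0 GB/s
bw_16gb = gddr6_bandwidth_gbs(17.5)   # 560.0 GB/s (quoted as 559.9)
```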

All of this does require a good bit of power, though, and the Arc A770 has a TDP of 225W, which is higher than most 1440p cards on the market today.


As for the features all this hardware enables, there's a lot to like here. The matrix cores are leveraged to great effect by Intel's XeSS graphics upscaling tech found in a growing number of games, and this hardware advantage generally lets it outperform AMD's FSR 2.0, which is strictly a software-based upscaler.

XeSS does not have frame generation, though, and the matrix processors in the Arc A770 are not nearly as mature as Nvidia's 3rd- and 4th-generation tensor cores found in the RTX 3000-series and RTX 4000-series, respectively.

The Arc A770 also has AV1 hardware-accelerated encoding support, meaning that streaming videos will look far better than those with only software encoding at the same bitrate, making this a compelling alternative for video creators who don't have the money to invest in one of Nvidia's 4000-series GPUs.

  • Specs & features: 3.5 / 5


Intel Arc A770: Performance

  • Great 1440p performance
  • Intel XeSS even allows for some 4K gaming
  • DirectX 9 and DirectX 10 support lacking, so older games will run poorly
  • Resizable BAR is pretty much a must

At the time of this writing, Intel's Arc A770 has been on the market for about a year, and I have to admit, had I gotten the chance to review this card at launch, I would probably have been as unkind as many other reviewers were.

As it stands though, the Intel Arc A770 fixes many of the issues I found when I reviewed the A750, but some issues still hold this card back somewhat. For starters, if you don't enable Resizable BAR in your BIOS settings, don't expect this card to perform well at all. It's an easy enough fix, but one that is likely to be overlooked, so it's important to know that going in.

Synthetic benchmark results for the Intel Arc A770 (Image credit: Future / Infogram)

In synthetic benchmarks, the A770 performed fairly well against the current crop of graphics cards, despite effectively being a last-gen card. It puts up particularly strong competition against the Nvidia RTX 4060 Ti across multiple workloads, even beating it in a couple of tests.

Its Achilles' heel, though, is revealed in the PassMark 3D Graphics test. Whereas 3DMark tests DirectX 11 and DirectX 12 workloads, PassMark's test also runs DirectX 9 and DirectX 10 workloads, and here the Intel Arc A770 simply can't keep up with AMD and Nvidia.

Non-ray-traced, non-upscaled gaming benchmark results for the Intel Arc A770 (Image credit: Future / Infogram)

In non-ray-traced and native-resolution gaming benchmarks, the Intel Arc A770 managed to put up some decent numbers against the competition. At 1080p, the Arc A770 manages an average of 103 fps with an average minimum fps of 54. At 1440p, it averages 78 fps, with an average minimum of 47, and even at 4K, the A770 manages an average of 46 fps, with an average minimum of 27 fps.

Ray-traced gaming benchmark results for the Intel Arc A770 (Image credit: Future / Infogram)

Turn on ray tracing, however, and these numbers understandably tank, as they do for just about every card below the RTX 4070 Ti and RX 7900 XT. Still, even here, the A770 manages an average of 41 fps (with an average minimum of 32 fps) at 1080p with ray tracing enabled, which is technically still playable performance. Once you move up to 1440p and 4K, however, your average title isn't going to be playable at native resolution with ray tracing enabled.

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770 (Image credit: Future / Infogram)

Enter Intel XeSS. When set to "Balanced", XeSS turns out to be a game changer for the A770, getting it an average framerate of 66 fps (with an average minimum of 46 fps) at 1080p, an average of 51 fps (with an average minimum of 38 fps) at 1440p, and an average 33 fps (average minimum 26 fps) at 4K with ray tracing maxed out.

While the 26 fps average minimum at 4K means it's really not playable at that resolution even with XeSS turned on, settings tweaks or more modest ray tracing could probably bring that up into the low to high 30s, making 4K games playable on this card with ray tracing turned on.

That's something the RTX 4060 Ti can't manage thanks to its smaller frame buffer (8GB VRAM), and while the 16GB RTX 4060 Ti could theoretically perform better (I have not tested the 16GB so I cannot say for certain), it still has half the memory bus width of the A770, leading to a much lower bandwidth for larger texture files to pass through.

This creates an inescapable bottleneck that the RTX 4060 Ti's much larger L2 cache can't adequately compensate for, taking it out of the running as a 4K card. When tested, very few games managed playable frame rates even without ray tracing unless you dropped the settings so low as to make it not worth the effort. The A770 16GB, meanwhile, isn't technically a 4K card, but it can dabble at that resolution with the right settings tweaks and still look reasonably good.
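That bus-width gap is easy to quantify. A quick sketch, using the A770 figures from this review alongside Nvidia's published RTX 4060 Ti memory specs (128-bit bus at 18 Gbps effective, which come from Nvidia's spec sheet, not from this review):

```python
# Peak memory bandwidth in GB/s: effective per-pin rate (Gbps) times
# bus width in bytes (bits / 8).
a770_16gb = 17.5 * 256 / 8     # 560 GB/s
rtx_4060_ti = 18.0 * 128 / 8   # 288 GB/s (Nvidia's published spec)
ratio = a770_16gb / rtx_4060_ti  # roughly 1.9x the raw bandwidth
```

Despite a slightly slower per-pin rate, the A770's doubled bus width gives it nearly twice the raw bandwidth, which is exactly the advantage high-resolution textures exploit.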

The final average performance benchmark scores for the Intel Arc A770 (Image credit: Future / John Loeffler)

Final performance scores for the Intel Arc A770 (Image credit: Future / Infogram)

All told, then, the Intel Arc A770 turns out to be a surprisingly good graphics card for modern gaming titles that can sometimes even hold its own against the Nvidia RTX 4060 Ti. It can't hold a candle to the RX 7700 XT or RTX 4070, but it was never meant to, and given that those cards cost substantially more than the Arc A770, this is entirely expected.

Its maximum observed power draw of 191.909W is pretty high for the kind of card the A770 is, but it's not the most egregious offender in that regard. All this power meant that keeping it cool was a struggle, with its maximum observed temperature hitting about 74 ºC.

Among all the cards tested, the Intel Arc A770 sat near the bottom of the list alongside the RX 6700 XT, so the picture might have looked very different had it launched three years ago against the RTX 3000-series and RX 6000-series exclusively. In the end, this card performs like a last-gen card, because it is.

Despite that, it still manages to be a fantastic value on the market right now given its low MSRP and fairly solid performance, rivaling the RTX 4060 Ti on the numbers. In reality though, with this card selling for significantly less than its MSRP, it is inarguably the best value among midrange cards right now, and it's not even close.

  • Performance score: 3.5 / 5


Should you buy the Intel Arc A770?

Buy the Intel Arc A770 if...

Don't buy it if...

Also Consider

If my Intel Arc A770 review has you considering other options, here are two more graphics cards for you to consider.

How I tested the Intel Arc A770

  • I spent several days benchmarking the card, with an additional week using it as my primary GPU
  • I ran our standard battery of synthetic and gaming benchmarks 
Test Bench

These are the specs for the test system used for this review:
CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks with the Intel Arc A770 in total, with a little over half that time using it as my main GPU on my personal PC. I used it for gaming, content creation, and other general-purpose use with varying demands on the card.

I focused mostly on synthetic and gaming benchmarks since this card is overwhelmingly a gaming graphics card. Though it does have some video content creation potential, it's not enough to dethrone Nvidia's 4000-series GPUs, so it isn't a viable rival in that sense and wasn't tested as such.

I've been reviewing computer hardware for years now, with an extensive computer science background as well, so I know how graphics cards like this should perform at this tier.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

  • First reviewed October 2023