Lenovo Legion Go review: this is the true Steam Deck contender
5:00 pm | November 10, 2023

Author: admin | Category: Computers, Consoles & PC, Gadgets, Gaming

Lenovo Legion Go: Two-minute review

The Lenovo Legion Go is the latest entry in the PC handheld trend, following Valve’s Steam Deck and the Asus ROG Ally. Though my expectations were suitably tempered, fully testing the portable gaming machine has convinced me of its superiority over the ROG Ally and even over the gold-standard Steam Deck.

At first glance, it’s almost laughably large and weighs far more than its competition. Normally this would mean that its portability is shot, but Lenovo was ingenious in this regard and included a built-in kickstand right in the back. It’s a simple feature but absolutely game-changing, as it allows for long sessions without suffering fatigue from having to hold it. 

It also means that if you want to use it either for gaming or as a PC replacement, there’s no need to purchase a separate docking station. It won’t be replacing the best gaming laptops or best gaming PCs anytime soon, but it still adds flexibility to this device. The portable runs on Windows 11 and, unlike on the ROG Ally, the OS is well optimized for the Legion Go here, making for a smooth and intuitive UI.

The side controllers can also be detached, à la the Nintendo Switch’s Joy-Cons, and wielded one in each hand, or mounted on a base to form a standalone controller. You can also take one of the controllers and activate FPS Mode with the click of a switch, allowing for precision control in shooters and other genres that benefit from the best gaming mouse.

There are plenty of buttons scattered throughout the system as well, which are all fully customizable, and even a touchpad. It can get overwhelming, as it feels like everywhere your fingers go, there’s a button to press, but nothing activates without you setting it.

Performance-wise, the Legion Go can handle a wide variety of titles, from less demanding ones like Teenage Mutant Ninja Turtles: Shredder's Revenge to AAA titles like Cyberpunk 2077 and Baldur’s Gate 3. What’s impressive is how easily the portable juggles multiple games at once and switches between them, even across different PC storefronts. There is some slowdown and slight freezing outside of gameplay, especially during loading, but gameplay itself remains smooth as butter for the most part.

Of course, the Lenovo Legion Go’s Achilles’ heel is its terrible battery life. You’ll only be getting a few hours of gameplay at most unless you turn down the settings significantly. But there’s hardly any point in that when the sole purpose of a PC handheld is to play the best PC games the way they’re meant to be played, so best keep the charger handy for this one.

Lenovo Legion Go: Price and availability

Spec sheet

Here is the Lenovo Legion Go configuration sent to TechRadar for review:

CPU: AMD Ryzen Z1 Extreme
Graphics: AMD RDNA Graphics
RAM: 16GB LPDDR5X (7,500MHz)
Screen: 8.8-inch QHD+, 144Hz, 500 nits, 97% DCI-P3 color gamut
Storage: Up to 1TB M.2 2242 SSD
Ports: 2x USB Type-C ports, 3.5mm headphone jack, microSD slot
Connectivity: 802.11ax 2x2 Wi-Fi + Bluetooth 5.1
Weight: 1.88 lbs (854 g)
Size: 11.76 x 5.16 x 1.60 inches (298.83 x 131 x 40.7 mm; W x D x H)

The Lenovo Legion Go starts at $699.99 / £700 (inc. VAT) / AU$1,299, with availability in the US, UK, and Australia. In the US, there are two models: the base version is an already steep $699.99 with 16GB of RAM and 512GB of storage, while the more expensive $749.99 model ups the storage to 1TB. Lenovo has stated that it plans to release cheaper models using the AMD Ryzen Z1 CPU in the future.

The UK only has the 512GB model for the same price as the US version, meaning that UK buyers are actually paying more. In Australia, there are two models, with the cheaper version coming with 256GB of storage and the pricier version equipped with 512GB.

Lenovo Legion Go: Design

Lenovo Legion Go on wooden table

(Image credit: Future)

Your first thought on seeing the Lenovo Legion Go is how large and heavy it is compared to its competitors, which is more than enough to be off-putting. However, there are several benefits to this size. The first is that it ventilates much better than smaller handhelds like the ROG Ally, which meant I was able to game for long periods without any overheating issues, even when pushing through more graphically intense titles.

Lenovo knew it had to offset the portable’s weight, which is where the kickstand comes in. It’s built into the handheld and of very good quality, both the stand itself and the hinges. It lets you rest the Legion Go on a reasonably flat surface without the need for a separate docking station, whether you’re using the portable as a gaming device or a desktop replacement.

The bulky side controllers aren’t just window dressing either: they’re detachable, similar to the Nintendo Switch’s Joy-Cons. And like the Joy-Cons, you can use them one in each hand while the main display is supported by the kickstand, though there are no motion sensors in them. There’s also a handy LED light ring around each joystick that indicates the controller’s current power and connection state.


If that isn’t enough, one of the controllers can be mounted on a base and then, with the click of a switch, activated into FPS Mode. This mode allows for precise, PC-mouse-like control, perfect for first- and third-person shooters as well as other genres that work best with a mouse.

The display is an absolutely gorgeous 8.8-inch QHD+ panel with a great 144Hz refresh rate, perfect for most gamers’ needs. It also covers a surprisingly high 97% of the DCI-P3 color gamut, which makes for eye-catching, vivid colors. It sports an excellent touchscreen too, which pairs perfectly with the well-optimized OS.

And though Windows 11 is far more functional here than on the ROG Ally, its limitations show that when it comes to an OS made solely for a PC handheld, the Steam Deck is still king. For instance, when booting up the Steam Deck for the first time, setup is refreshingly simple and takes a mere minute, while the Legion Go's Windows 11 forces you to suffer through the same setup as any other Windows PC or laptop.

There's also the issue of Legion Space, which is pretty useless. Unlike the ROG Ally's Armoury Crate SE, which at least lets you log in directly to your storefronts of choice, Legion Space gives you that illusion at first and then merely opens a webpage. You have to install your storefronts yourself first, and at that point you may as well launch them directly, making Legion Space redundant.

Lenovo Legion Go: Performance


Thanks to its impressive specs, especially the miracle of the AMD Ryzen Z1 Extreme with AMD RDNA Graphics, the Lenovo Legion Go is an incredibly powerful handheld that’s capable of handling anything from 16-bit indie games to the most demanding of AAA titles. 

While there are longer load times for more demanding games, and even rare instances of brief freezing, during actual gameplay the experience is a smooth one that’s coupled with some truly impressive graphics. You have to be patient with the handheld but it returns the favor when you finally get to your title.

For instance, I tested out the Legion Go with Forza Horizon 5 on both Low and Medium settings. Though the game recommended Low and the framerate was indeed averaging around 59fps, I found that it ran quite well on Medium with ray tracing turned on, averaging out at a still solid 51fps. 

I was blown away by how beautiful the car, physics, and environments were while racing – it felt like I was gaming on a laptop for a moment. This was all done on the maximum resolution by the way, and I never felt the need to turn it down, though the option is there in the menu along with decreasing the refresh rate and more.


The audio quality is excellent as well, surprisingly so. Testing the speakers with Forza Horizon 5 (music and sound design are vital to a racing sim), I could hear the roar of the car’s engine over the commentary and fast-paced music with such clarity that I once again forgot I was playing on a handheld.

The controls themselves are incredible, with the analog sticks moving the car with remarkable precision. They also feel good to use for long durations thanks to the high-quality padding on each one. They're Hall effect joysticks which, according to Lenovo, ensure no joystick drift and minimal dead zones. As a bonus, they have an LED light ring that alerts you to each controller’s remaining battery power and connection. It’s particularly handy as an easy, immediate way to get that information without digging through a menu.

There are several of what Lenovo calls Thermal Modes, which balance performance against fan noise, similar to a gaming laptop. The highest performance mode is meant for a plugged-in experience, though you can still use it on battery power alone; there’s also a balanced mode meant for switching between tasks, and a quiet mode that works like a battery saver. You can, of course, create a custom mode too.

There’s also a separate menu option to maximize fan speeds, and it works wonders in keeping the whole system cool. The ventilation in general is impressive, with a smart design that keeps the majority of the heat away from where your fingers rest. It's most likely due to what Lenovo calls its Coldfront thermal technology, which features a liquid crystal polymer 79-blade fan.

Lenovo Legion Go: Battery Life


Just as with every other PC handheld, the Lenovo Legion Go’s Achilles' heel is its abysmal battery life. You’ll only be getting a few hours of gameplay at most before it shuts off unless you turn down the settings significantly. But what’s the point in that, when you’re buying a portable like this to play AAA titles at gloriously high settings?

And as with the Nintendo Switch, the controllers are separate devices that need charging too. Though everything can be charged at once, the two additional accessories increase the total charging time.

Should you buy the Lenovo Legion Go?


First reviewed November 2023

MediaTek Dimensity 9300 announced with big-core only CPU, boosted GPU with ray-tracing
5:29 pm | November 6, 2023

Author: admin | Category: Mobile phones news

The latest flagship chipset from MediaTek is here: the Dimensity 9300 aims to go head to head with Qualcomm’s Snapdragon 8 Gen 3, which powers the incoming crop of Android flagships from Chinese smartphone makers. The Dimensity 9300 is built on TSMC’s third-generation 4nm+ process node, and its standout feature is an all-big-core CPU design. You get a prime Cortex-X4 core clocked at 3.25GHz alongside 3x Cortex-X4 cores @ 2.85GHz and 4x Cortex-A720 cores @ 2.0GHz, all based on the Armv9 architecture. MediaTek claims the Dimensity 9300...

Apple MacBook Pro 14-inch M3 Max (2023) review: the Mac gaming rig is here
5:00 pm |

Author: admin | Category: Computers, Computing, Gadgets, Laptops, Macbooks

Two-minute review

Spec Sheet

Here is the MacBook Pro (M3 Max, 2023) configuration sent to TechRadar for review:

CPU: Apple M3 Max (16-core)
Graphics: Integrated 40-core GPU
RAM: 64GB unified LPDDR5
Screen: 14.2-inch, 3024 x 1964 Liquid Retina XDR display, 600 nits brightness (1,600 nits peak with HDR content), wide color P3 gamut
Storage: 2TB SSD
Ports: 3x Thunderbolt 4 (USB-C), 3.5mm headphone jack, MagSafe 3 charging port, SDXC, HDMI
Connectivity: Wi-Fi 6e, Bluetooth 5.3
Camera: 1080p FaceTime HD webcam
Weight: 3.6 pounds / 1.24kg
Size: 12.31 x 8.71 x 0.61 inches (31.26 x 22.12 x 1.55cm; W x D x H)

The story of the new MacBook Pro 14 is less about a new laptop on the block than it is about Apple showcasing the raw power of its newest silicon, the M3 chip. Stuffed inside my brooding Space Black portable is Apple's apex M3 processor, the M3 Max. I tell you this so that you don't mistakenly expect that your $1,599 / £1,699 / AU$2,699 MacBook Pro 14 with an M3 chip will provide the same performance as what's cooking on my $4,299 / £4,399 / AU$7,249 review unit.

The base-model M3 will still support hardware-based ray tracing and mesh shading. It'll still have that blazing-fast Neural Engine. But you'll get far fewer CPU and GPU cores and much less memory; the M3 Max model reviewed here has 64GB. You're buying a casually powerful Pro system. The M3 Max MacBook Pro came to play hard and work hard (it's tough to say which it'll go at harder).

MacBook Pro 14 M3 Max (2023) REVIEW

(Image credit: Future / Lance Ulanoff)

From the impressive design and materials (aluminum chassis with a brand-new anodizing technique for the Space Black finish that finally cuts down on fingerprints) to the expressive keyboard that is now my favorite MacBook typing experience, to a versatile macOS Sonoma platform that supports every activity from entertainment and gaming to email, web browsing, and intense photo and video work, there is not one hint of performance disappointment in this system.

It's without a doubt the best MacBook I've ever used, and I think it stands a chance of giving some of the best gaming laptops a real run for their money.

This is not, by any stretch of the imagination, an affordable laptop. If you're looking for something thin, light, and relatively budget-friendly, and you're not working on massive CAD files or 4K video streams or playing the latest AAA games, then perhaps the still-stellar MacBook Air M2 (there's no M3 Air yet) is more your style, or even the MacBook Pro 14 with the base M3. As mentioned, that model starts at $1,599 / £1,699 / AU$2,699 – that's $100 cheaper than the 14-inch MacBook Pro with M2 in the US, but also a reminder that there's no $1,299 tier in MacBook Pro territory.

There's little doubt in my mind that the complete lineup of MacBook Pro 14 M3 machines, from the base M3 to the M3 Pro and this M3 Max, will take their places among the best laptops money can buy. And, yes, the MacBook Pro 14 M3 Max could also snare a spot on our best gaming laptops list.

MacBook Pro 14-inch (M3 Max, 2023) review: Price and availability

  • M3 range starts at $1,599 / £1,699 / AU$2,699
  • Tested model costs $4,299 / £4,399 / AU$7,249
  • No 13-inch option (which would have cost less)

Apple announced the new MacBook Pro 14-inch range at its October 30 Scary Fast event, alongside new 16-inch MacBook Pros. The 14-inch MacBook Pro now comes with a choice of M3, M3 Pro, and M3 Max chips, the latest generation of Apple's own silicon. Meanwhile, the 16-inch MacBook Pro is only available with the higher-end (and more expensive) M3 Pro and M3 Max. There's also a new iMac 24 running on the base M3 SoC.

Preorders are live now, and the new M3 and M3 Pro MacBooks will go on sale and ship from November 7, while the M3 Max models will begin shipping later in November.

The MacBook Pro 14-inch M3 range starts at $1,599 / £1,699 / AU$2,699 (it's worth noting that Apple has discontinued the 13-inch MacBook Pro). As configured, our Space Black MacBook Pro 14-inch with an M3 Max SoC, 64GB of RAM and a 2TB hard drive has a list price of $4,299 / £4,399 / AU$7,249.

  • Price score: 4/5

MacBook Pro 14-inch (M3 Max, 2023) review: Design

  • Same design
  • More power squeezed into the same space and weight
  • An awesome new color option

Apple has changed virtually nothing about the MacBook Pro design from the 14-inch model it launched earlier this year with an M2 chip. The dimensions are the same, with a thickness of 0.61 inches / 1.55cm, a width of 12.31 inches / 31.26cm, and a depth of 8.71 inches / 22.12cm.

The weight is roughly the same, though the M3 Max 14-inch MacBook Pro is, at 3.6lbs / 1.24kg, the heaviest of the 14-inch bunch.

The screen size is the same, and on the M3 Max and M3 Pro 14-inch MacBooks the port placement is unchanged from the previous generation, as are the number and types of ports. You get three USB-C Thunderbolt 4 ports, a 3.5mm headphone jack, an HDMI port, an SD card slot, and a MagSafe charging port (if you opt for the base M3 model, you only get two Thunderbolt ports).

If you stack the MacBook Air 13-inch M2 on top of the MacBook Pro 14-inch M3 Max, the latter doesn't look that much larger, but it is substantially thicker and heavier. When I opened it up to reveal that familiar Liquid Retina XDR display and backlit Magic Keyboard, I noted that the keyboard and trackpad are, from a size perspective, exactly the same as on the MacBook Air. Apple uses the extra chassis space on the Pro to accommodate a six-speaker system that's split to sit on either side of the keyboard; the larger chassis also provides just a bit more space to rest your palms.


As with the previous 14-inch MacBook Pro, the matte keyboard feels as good as it looks. It's expansive, and there's enough key travel to make every touch sure and satisfying; it's a pleasure to type on. The power button still doubles as a Touch ID biometric scanner, which I use to unlock the laptop and sign into various online services. I still hope for the day that Apple introduces Face ID to the FaceTime camera notch that sits at the top of the display. 

But enough about everything that's the same. I want to talk about the new Space Black finish. Sure, Apple has done colorful and even inspired finishes before, but I'd argue there's never been anything quite like the new Space Black finish on this new MacBook Pro 14 M3 Max (the 14-inch M3 Pro and M3 Max MacBooks are available in Space Black or silver, while the M3 model comes in Space Grey or silver). 

It's not just black – it's a light-swallowing black. I noticed this when trying to photograph the new laptop, and watched as it basically devoured my studio lighting. The surface is just shy of being matte black, and that low reflectivity really stops the light from bouncing back at you. The new color gives the laptop a bold, aggressive, and no-nonsense look. I think any gamer would be proud to cart this laptop into their next tournament.

Apple has developed a new anodizing process for the Space Black color to create a fingerprint-resistant surface, and I can report that it did repel most of my handprints. That said, I have dry hands, and I did note that the sweatier the palm, the more visible the marks left on the laptop's surface, although even those fingerprints were faint. Just remember that this is a fingerprint-resistant MacBook Pro, not a fingerprint-proof one.

  • Design score: 5/5

MacBook Pro 14-inch (M3 Max, 2023) review: Display

  • Same resolution
  • Still excellent
  • It's brighter! (With standard imagery)


In typical fashion, Apple hasn't changed anything about its MacBook Pro Liquid Retina XDR display, but has still managed to squeeze some extra performance out of it thanks to the new, more efficient 3-nanometer M3 Max chip.

The screen has the exact same resolution as the last display panel (3024 x 1964), and the same one million-to-1 contrast ratio. Even the same peak brightness of 1,600 nits with HDR content is unchanged, although for day-to-day brightness with standard content we now get 600 nits, as opposed to the 500 nits on the last MacBook Pro. 

MacBook Pro 14 M3 Max (2023) in use

(Image credit: Future)

In real-world use, I found that the MacBook Pro 14-inch with M3 Max is quite capable of beating back even direct sunlight; I'm convinced I could work pretty much anywhere on this laptop.

Overall, this is a beautiful screen. Thanks to bright colors and inky blacks, everything on it gets a premium look. Do I mind the FaceTime camera notch? Not really. Video usually plays in letterbox format and well below it, and it doesn't interfere with the business part of apps and web browsing. Even when I played games – and I played a lot of them – I didn't notice it.

  • Display score: 4.5/5

MacBook Pro 14-inch (M3 Max, 2023) review: Performance

  • Apple silicon at its finest
  • Good luck finding a task it can't handle
  • AAA gaming can chew through battery life
Benchmarks

Here’s how the MacBook Pro (M3 Max, 2023) performed in our suite of benchmark tests:

Cinebench R24: CPU Single-core: 140; CPU Multi-core: 1,588; GPU: 12,791; MP Ratio: 10.94
Geekbench 6: Single-core: 3,160; Multi-core: 21,236; GPU Metal: 158,215; OpenCL: 92,159
Battery life: 10 to 12 hours with mixed use


I really like the way Apple makes its chip series more powerful. It uses a standardized architecture, and then wraps more and more cores around it. The benefit is that all systems running the base 3-nanometer process M3 SoC share the same impressive features, but some perform faster than others.

While the bare-bones M3 in the base-model MacBook Pro 14-inch (with one fewer Thunderbolt port) has an 8-core CPU (four efficiency cores and four performance cores) and a 10-core GPU, the M3 Max chip in the machine I tested has a 16-core CPU and a 40-core GPU. According to Geekbench 6, the system runs at 4.1GHz (single-core) and an estimated 3.3GHz (multi-core).

I ran a lot of benchmarks for raw performance scores, because that's what you do. Unsurprisingly, the Geekbench 6 numbers were startling, and while Apple has taken pains to compare the base M3 to the three-year-old M1, comparing my MacBook Air M2 to the M3 Max was a real eye-opener. Granted, the M3 Max and the base M2 aren't directly comparable, but I think these figures give you a sense of why you might pay so much for an M3 Max system stuffed with, in my case, 64GB of unified memory (you can, by the way, configure a more expensive system with up to 128GB of unified memory and 8TB of storage).

MacBook Pro 14 M3 Max Geekbench benchmarks (Image credit: Future)

It's easy to forget that Apple silicon runs on the Arm64 architecture, and that not all macOS apps run natively on it. The reason I often forget this? Everything works. There's never been a moment in my three years with Apple silicon when a MacBook has thrown up its digital hands and said, "Sorry, I can't run this app." Part of this is down to the rapid adoption of Apple silicon by Apple's partners, and part is down to Rosetta 2 (which translates x86 code for Apple silicon) running quietly in the background, handling any apps still expecting an x86 platform.
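A quick aside for the curious: macOS exposes whether a process is running under Rosetta 2 translation via the `sysctl.proc_translated` key. A minimal sketch, assuming a macOS shell (on Intel Macs or non-macOS systems the key simply doesn't exist, hence the fallback):

```shell
# Check whether the current shell runs natively on Apple silicon or
# under Rosetta 2 translation. sysctl.proc_translated reports 1 under
# Rosetta and 0 when native; the key is absent on other systems.
status=$(sysctl -n sysctl.proc_translated 2>/dev/null || echo "n/a")
case "$status" in
  1) echo "running under Rosetta 2 translation" ;;
  0) echo "running natively on Apple silicon" ;;
  *) echo "not applicable (Intel Mac or non-macOS system)" ;;
esac
```

Activity Monitor surfaces the same information in its "Kind" column, which reads "Apple" for native processes and "Intel" for translated ones.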

Okay, the MacBook Pro 14 M3 Max is not perfect on the compatibility front. The x86-compatible Steam, which I used for most of my games, did crash. But weirdly, so did iMovie, repeatedly, and that's Arm-native, as did the Arm-friendly Adobe Photoshop 2024 later on. At least the system as a whole never crashed, and it doesn't even know the meaning of a blue screen.

Since we're mostly not thinking about compatibility, we can just focus on performance, and the M3 Max is stunning. To be clear, I'm not a professional video editor or a doctor analyzing 3D MRI scans, but I did my best to press this system and found it shrugged off all tasks. I opened 40 or so browser tabs across Safari and Chrome (normally a soul-crushing task for any system), launched Apple TV+, installed Steam, and then played Tomb Raider Legacy. I might as well have been composing something in Notes (oh, wait, I was doing that, too). I loaded up Final Cut Pro with 4K 30fps video as well as some 4K 24fps ProRes HDR content, and edited and manipulated them with ease.

While it's not visually evident, I think it's safe to assume that some of the system's speed and ease with all these apps, often running concurrently, comes from the new Dynamic Caching technology. This is essentially a more efficient way of using available memory: instead of a fixed number of registers always being reserved for the same task, the system applies only the memory each specific task actually needs. The result is far less wasted memory and more left over for other critical tasks.
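Apple hasn't published the internals of Dynamic Caching, so as a loose, purely illustrative analogy (all numbers and function names here are hypothetical), the difference between reserving a fixed block per task and allocating on demand looks something like this:

```python
# Toy comparison: fixed-size reservations vs. on-demand allocation.
# Illustrates the general idea only, not Apple's GPU implementation.
FIXED_BLOCK = 64  # hypothetical registers reserved per task, static scheme

def static_usage(tasks):
    """Every task reserves a full fixed block, whatever it actually needs."""
    return FIXED_BLOCK * len(tasks)

def dynamic_usage(tasks):
    """Each task takes only what it needs, freeing the rest for other work."""
    return sum(tasks)

tasks = [10, 48, 5, 20]  # hypothetical per-task register needs
print(static_usage(tasks))   # 256 reserved under the static scheme
print(dynamic_usage(tasks))  # 83 used when allocating on demand
```

The gap between the two numbers is memory the static scheme wastes; reclaiming that headroom for concurrent work is the stated goal of the technique.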

Apple spent considerable time during its Scary Fast event telling us how it engineered the new M3 SoC with features specifically designed to handle graphics-intensive tasks like, obviously, AAA games. Hardware-based ray tracing and mesh shading might improve how some of your most expensive apps look, but we all know that it's really all about gaming.

(Screenshots: Shadow of the Tomb Raider gameplay and benchmark test, note the framerate in the upper left; Final Cut Pro editing multiple 4K videos; Lies of P with its benchmark window open; Rise of the Tomb Raider gameplay. Image credit: Future)

Naturally, I played some games. First, a few hours of the engaging Rise of the Tomb Raider, which I'll note is not that easy when you're using the keyboard. The eight-year-old game looked good, and gameplay was smooth and immersive. I usually wore my AirPods Pro (they connected instantly) so as not to annoy people around me.

Next, I installed Lies of P, a brand-new game seemingly inspired by Pinocchio that's at home on all major consoles and now, thanks to Steam, on the MacBook Pro too.

It's a beautiful and quietly atmospheric game that starts in an old, deserted train station. Everything is rendered in exquisite detail and, thanks to all the M3 Max's onboard graphics power, every surface looked about as real as it can in a game of this nature.

The system seemed to keep up with the action quite well (I played this game with a Bluetooth-connected PlayStation 5 controller; the system supports Bluetooth 5.3, which has just 100ms of latency). I used Terminal for a real-time view of frame rates and found that, depending on the action, they bounced between 30 and 60fps. The action generally looked smooth in most sequences, including fast-paced puppet-on-puppet battles.

I also played Shadow of the Tomb Raider at the highest possible resolution of 3024 x 1964, with every atmospheric element turned up to the absolute highest. At times, the fans were so loud that they drowned out the game sounds, but the gameplay and graphics were all at their cinematic best, and in the game's benchmarks I could achieve 108fps in 1920 x 1200 mode and 56fps at the highest, native resolution settings. Pretty impressive.

When I cranked all of Total War: Warhammer III's settings to ultra where possible (at 1920 x 1200 resolution), the fan churned on high and there was some object (or sprite) flickering in the benchmark test. But the detail was all there, and the system reported an average frame rate of 56.1fps. I then reran the test at the MacBook Pro's highest native resolution; the gameplay looked even better, naturally, though the frame rate dropped to 33.8fps.

I won't claim to be a hardcore gamer, but it's clear to me that game developers are now treating the Mac as a viable platform, using the Game Porting Toolkit Apple released at WWDC 2023 to bring AAA games to the Mac on the same date they arrive on the best consoles. It's not just that the games arrive on the Mac; it's that they're as playable and as immersive as anything on a Windows 11 gaming rig.

Overall, a quick look at all the benchmarks comparing the M1 Max to this M3 Max system shows a quantum leap across every aspect of performance. And, yes, the single number that is lower, AI Turn Time in Civilization VI, is also an improvement, as it shows the system taking less time than before to make that turn.

MacBook Pro 14 M3 Max benchmarks (Image credit: Future)
  • Performance score: 5/5

MacBook Pro 14-inch (M3 Max, 2023) review: Audio and video


Thanks to the larger system chassis, Apple fits three speakers on either side of the keyboard, and they produce loud, clear sound. I played a wide variety of music, video, and gaming content through them. It all sounded great, with voices sharp and high notes clear as a bell. What this sound system lacks, though, is any discernible bass. Now, I wouldn't really expect the MacBook Pro 14 M3 Max's relatively tiny speakers to provide chest-thumping sound. Still, when I played The White Stripes' Seven Nation Army and Eminem's Lose Yourself, I was struck by how flat some of the drums and backbeats sounded. It's not completely devoid of the richness needed to deliver a nice drum solo, but I found the bass a bit hollow, robbing the tunes of their head-banging essence.

Remarkably, the MacBook Pro 14 still ships with a 3.5mm headphone jack. I'm sure audio and video professionals use it in their work, but for most people, the support for the best AirPods Pro (especially head-tracking spatial audio) will be more than enough in-ear audio.

The MacBook Pro 14 M3 Max comes equipped with the same 1080p FaceTime camera as its predecessor. I can tell you that it gives your callers a nice clear view of you and, thanks to the new native Sonoma webcam features, I can use gestures to set off fireworks, drop confetti, pop up thumbs-up emojis, and release balloons during any video call. My wife wasn't as amused as I thought she'd be.

MacBook Pro 14-inch (M3 Max, 2023) review: Battery life

  • Rated for 18 hours
  • Lasted around 12 hours in our tests with varied use
  • Charges quickly

You may have read some reports that the new MacBook Pro can manage up to 22 hours of battery life. That's the promise for the 14-inch M3 model; however, for my more powerful and more power-hungry M3 Max 14-inch MacBook Pro, the maximum I can expect is 18 hours, and that's only if I do nothing but, say, stream virtually all episodes of Ted Lasso. The number drops down to 12 hours if I'm browsing the web over Wi-Fi. And, in my experience, the duration truly plummets if you play a AAA game like Lies of P or even Tomb Raider Legacy on battery power.

When I started playing the latter game I had about 73% battery life left. Within a couple of hours, it was below 20%. It's clear that the MacBook Pro 14 M3 Max will give you all the gaming power you want and need (I usually played in High Power mode), but there's probably also an assumption that you're playing while plugged in.
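To put that drain in perspective, here's a back-of-the-envelope calculation using the rough 73%-to-20%-in-two-hours figures above; the helper functions are purely illustrative, not a formal benchmark:

```python
# Rough battery-drain arithmetic from the gaming session described above
# (roughly 73% down to 20% over about two hours). The percentages are
# this review's observations; the helpers are just illustrative.

def drain_rate(start_pct: float, end_pct: float, hours: float) -> float:
    """Average battery drain in percentage points per hour."""
    return (start_pct - end_pct) / hours

def est_runtime(rate_pct_per_hour: float) -> float:
    """Estimated full-charge runtime in hours at a constant drain rate."""
    return 100.0 / rate_pct_per_hour

rate = drain_rate(73, 20, 2)   # ~26.5 percentage points per hour
print(f"{rate:.1f} %/hr -> ~{est_runtime(rate):.1f} h of gaming on a full charge")
```

That works out to somewhere under four hours of on-battery gaming, which squares with the plugged-in assumption.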

My average battery life has been roughly 12 hours of mixed use, which is a little bit less than I was expecting from this more efficient 3-nanometer SoC.

I do have some good news. Fast charging works as promised, and I topped off to 50% in 30 minutes using the included 96W charge adapter and the woven black USB-C-to-MagSafe cable that strikes a discordant note when plugged into the perfectly white adapter (I'm not sure why Apple didn't make that Space Black too).

  • Battery life score: 4/5

Should you buy the MacBook Pro 14-inch (M3 Max, 2023)?

Buy it if…

 Don’t buy it if…

First reviewed November 2023

Also consider...

If our Apple MacBook Pro 14-inch M3 Max (2023) review has you considering other options, here are three more laptops to consider...  

Testing scorecard

How we test

I've spent decades reviewing Apple products, including many of its laptops and desktop systems (I've used Macs on and off since 1985). 

For this review, I spent many hours with Apple's newest MacBook Pro and what it says is the most powerful silicon it has ever produced. I did my best to run it through a variety of tasks and played multiple games on it. I also ran a battery of benchmark tests to assess raw performance. We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained; regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

Intel Core i5-14600K review: wait for Meteor Lake
4:00 pm | October 17, 2023

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Intel Core i5-14600K: Two-minute review

The Intel Core i5-14600K is not the kind of processor you're really going to want to upgrade to, despite technically offering the best value of any processor I've tested.

First, the good. This is one of the best processor values you're going to find on the market, no matter what happens with the price of its predecessor. Currently, it has the best performance for its $319 price tag (about £255/AU$465), and AMD's competing Ryzen 5 7600X isn't all that close. If you're looking to get the most bang for your buck today, then the Intel Core i5-14600K is it.

In terms of performance, this isn't a bad chip at all; I'd even say it's a great one if you take its predecessor out of the running, which will inevitably happen as its last remaining stock gets bought up. It doesn't have the performance of the Intel Core i7-14700K, but that's a workhorse chip, not the kind that's meant to power the best computers for the home or the best budget gaming PCs as these chips start making their way into prebuilt systems in the next couple of months.

For a family computer or one that's just meant for general, everyday use, this chip is more than capable of handling whatever you need it for. It can even handle gaming fairly well thanks to its strong single-core performance. So, on paper at least, the Core i5-14600K is the best Intel processor for the mainstream user as far as performance goes.

The real problem with the i5-14600K is that its performance is tragically close to the Core i5-13600K's. And even though the MSRP of the Intel Core i5-13600K is technically higher than that of the Core i5-14600K, it's not going to remain that way for very long at all.

As long as the i5-13600K is on sale, it will be the better value, and you really won't even notice a difference between the two chips in terms of day-to-day performance.

That's because there's no difference between the specs of the 14600K and the 13600K, other than a slightly faster turbo clock speed for the 14600K's six performance cores.

While this does translate into some increased performance, it comes at the cost of higher power draw and temperature. During testing, this chip hit a maximum temperature of 101ºC, which is frankly astounding for an i5. And I was using one of the best CPU coolers around, the MSI MAG Coreliquid E360 AIO, which should be more than enough to keep the temperature in check to prevent throttling.
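For anyone logging their own temperatures, here's a minimal sketch of how you might flag throttling in a list of samples, assuming the 100ºC maximum junction temperature (Tjmax) at which these chips pull clocks back; the function and the sample log are hypothetical:

```python
# Illustrative helper for spotting thermal throttling in a log of CPU
# temperature samples, assuming a Tjmax of 100C (an assumption based on
# the throttling behavior described in this review, not Intel's spec sheet).

TJMAX_C = 100  # assumed maximum junction temperature

def throttle_events(samples_c, tjmax=TJMAX_C):
    """Return indices of samples at or above Tjmax, where the CPU
    will reduce clocks to protect itself."""
    return [i for i, t in enumerate(samples_c) if t >= tjmax]

# Hypothetical one-second samples from a stress run
log = [72, 88, 97, 101, 99, 100, 95]
print(throttle_events(log))  # samples 3 and 5 hit or exceed Tjmax
```

In practice you'd feed this real sensor readings from a hardware-monitoring tool rather than a hand-typed list.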

Synthetic benchmark results for the Intel Core i5-14600K

(Image credit: Future / Infogram)

Looking at the chip's actual performance, the Core i5-14600K beats the AMD Ryzen 5 7600X and the Intel Core i5-13600K in single-core performance, multi-core performance, and productivity workloads, on average. Other than its roughly 44% better average multi-core performance against the Ryzen 5 7600X, the Core i5-14600K is within 3% to 4% of its competing chips.
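For clarity on how relative-performance figures like these are derived, here's the basic arithmetic: the percentage by which one average score leads another. The scores below are made-up placeholders, not the review's actual benchmark data:

```python
# Percentage lead of one benchmark score over another. The example
# scores are illustrative placeholders, not real results.

def pct_lead(score_a: float, score_b: float) -> float:
    """Percentage by which score_a leads score_b."""
    return (score_a / score_b - 1) * 100

print(f"{pct_lead(144, 100):.0f}%")  # a 144-vs-100 result is a 44% lead
print(f"{pct_lead(103, 100):.0f}%")  # a 103-vs-100 result is a 3% lead
```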

Creative benchmark results for the Intel Core i5-14600K

(Image credit: Future / Infogram)

In creative workloads, the Core i5-14600K again manages to outperform the Ryzen 5 7600X by about 31% on average, but it's just 2.4% better than its predecessor, and none of these chips are especially great at creative content work. If you're messing around with family albums or cutting up TikTok videos, any one of these chips could do that fairly easily. For heavier-duty workloads like video encoding and 3D rendering, the Intel chips hold up better than the mainstream Ryzen 5, but these chips really aren't practical for that purpose.

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)

On the gaming front, it's more of the same, though now at least the Ryzen 5 7600X is back in the mix. Overall, the Core i5-14600K beats its 13th-gen predecessor and AMD's rival chip by about 2.1% and 3.2% respectively.

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

All of this comes at the cost of higher power draw and hotter CPU temperatures, though, which is hard to justify when you get so little in return. What you really have here is an overclocked i5-13600K, and you can do that yourself and save some money by buying the 13600K when it goes on sale, which it will.

An Intel Core i5-14600K against its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i5-14600K: Price & availability

  • How much does it cost? US MSRP $319 (about £255/AU$465)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i5-14600K is available in the US, UK, and Australia as of October 17, 2023, for an MSRP of $319 (about £255/AU$465). 

This is a slight $10 price drop from its predecessor, which is always a good thing, and it comes in about $20 (about £15/AU$30) more than the AMD Ryzen 5 7600X, so it's fairly middle of the pack price-wise.

In terms of actual value, as it goes to market, this chip has the highest performance for its price of any chip in any product tier, but only by a thin margin, and one that is sure to fall very quickly once the price on the 13600K drops by even a modest amount.

Intel Core i5-14600K: Specs

Intel Core i5-14600K: Verdict

  • Best performance for the price of any chip tested...
  • ...but any price drop in the Core i5-13600K will put the 14600K in second place
  • Not really worth upgrading to with the Core i7-14700K costing just $90 more

Final benchmark results for the Intel Core i5-14600K

(Image credit: Future / Infogram)

Ultimately, the market served by this chip specifically is incredibly narrow, and like the rest of the Raptor Lake Refresh line-up, this is the last hurrah for the Intel LGA 1700 socket.

That means if you go out and buy a motherboard and CPU cooler just for the 14th-gen, it's a one-time thing, since another generation on this platform isn't coming. It doesn't make sense to do that, so if you're upgrading from anything earlier than the 12th-gen, it makes much more sense to wait for Meteor Lake to land in several months' time and possibly get something really innovative.

If you're on a 12th-gen chip and you can't wait for Meteor Lake next year, the smartest move is to buy the i7-14700K instead, which at least gives you i9-13900K-levels of performance for just $90 more than the i5-14600K.

Ultimately, this chip is best reserved for prebuilt systems like the best all-in-one computers at retailers like Best Buy, where you will use the computer for a reasonable amount of time, and then when it becomes obsolete, you'll go out and buy another computer rather than attempt to upgrade the one you've got.

In that case, buying a prebuilt PC with an Intel Core i5-14600K makes sense, and for that purpose, this will be a great processor. But if you're looking to swap out another Intel LGA 1700 chip for this one, there are much better options out there.

Should you buy the Intel Core i5-14600K?

Buy the Intel Core i5-14600K if...

Don't buy it if...

Also Consider

If my Intel Core i5-14600K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i5-14600K

  • I spent nearly two weeks testing the Intel Core i5-14600K
  • I ran comparable benchmarks between this chip and rival midrange processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: Gigabyte Aorus X670E Extreme
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i5-14600K and its competition, primarily for productivity work, gaming, and content creation.

I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops. 

I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
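When combining a battery of benchmarks like this into a single average, a geometric mean is the usual choice so that no single outsized result dominates the way an arithmetic mean would. A minimal sketch (the scores are placeholders, and this is the standard technique, not necessarily the exact aggregation used here):

```python
import math

# Geometric mean of benchmark scores: the standard way to average
# across tests with different scales. Example scores are placeholders.

def geomean(scores):
    """Geometric mean of a list of positive benchmark scores."""
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

print(geomean([2.0, 8.0]))  # close to 4.0, vs an arithmetic mean of 5.0
```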

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained; regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Core i7-14700K review: salvaging Raptor Lake Refresh with i9-13900K performance
4:00 pm |

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Intel Core i7-14700K: One-minute review

The Intel Core i7-14700K is the workhorse CPU in Intel's 14th-generation launch line-up, and like any good workhorse, it's going to be the one to do the heavy lifting for this generation of processors. Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.

Of all the chips launched on October 17, 2023, the Core i7-14700K is the only one to get a substantive spec upgrade over its predecessor as well as a slight cut in price to just $409 (about £325/AU$595), which is $10 less than the Intel Core i7-13700K it replaces.

So what do you get for $10 less? Gen-on-gen, you don't get a whole lot of improvement (about 6% better performance overall compared to the 13700K), but that figure can be deceiving, since the Core i7-13700K was at the top of our best processor list for a reason. 

With the 13700K's performance being within striking distance of the Intel Core i9-13900K, that 6% improvement for the 14700K effectively closes the gap, putting the 14700K within just 3% of the 13900K overall, and even allowing it to pull ahead in average gaming performance, losing out to only the AMD Ryzen 7 7800X3D.

In terms of productivity and general performance, the Core i7-14700K shines as well, going toe to toe with the best AMD processors like the AMD Ryzen 9 7950X and AMD Ryzen 9 7950X3D, giving it a very strong claim on being the best Intel processor for most people.

Given its excellent mix of performance and price, the Intel Core i7-14700K could very well be the last Intel chip of the LGA 1700 epoch that anyone should consider buying, especially if you're coming from a 12th-gen chip. 

With the Core i9-13900K outperforming the Intel Core i9-12900K by as much as 25% in some workloads, someone coming off an i9-12900K or lower will find it hard to believe that an i7 could perform this well, but that's where we're at. And with the i7-14700K coming in about 30% cheaper than the Intel Core i9-14900K, while still managing to come remarkably close in terms of its performance, the Intel Core i7-14700K is the Raptor Lake Refresh chip to buy if you're going to buy one at all.

An Intel Core i7-14700K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Price & availability

  • How much does it cost? US MSRP $409 (about £325/AU$595)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i7-14700K is available on October 17, 2023, with a US MSRP of $409 (about £325/AU$595), which is a slight decrease from its predecessor's MSRP of $419 (about £335/AU$610), and about 31% lower than the Intel Core i9-14900K and 32% lower than the AMD Ryzen 9 7950X.

It's also cheaper than the AMD Ryzen 7 7800X3D, and just $10 more expensive than the AMD Ryzen 7 7700X, making it very competitively priced against processors in its class.

The comparisons against the Core i9 and Ryzen 9 are far more relevant, however, since these are the chips that the Core i7-14700K is competing against in terms of performance, and in that regard, the Intel Core i7-14700K is arguably the best value among consumer processors currently on the market.

  • Price score: 4 / 5

Intel Core i7-14700K: Specs & features

  • Four additional E-Cores
  • Slightly faster clock speeds
  • Increased Cache
  • Discrete Wi-Fi 7 and Thunderbolt 5 support

The Intel Core i7-14700K is the only processor from Intel's Raptor Lake Refresh launch line-up to get a meaningful spec upgrade.

Rather than the eight performance and eight efficiency cores like the i7-13700K, the i7-14700K comes with eight performance cores and 12 efficiency cores, all running with a slightly higher turbo boost clock for extra performance. The i7-14700K also has something called Turbo Boost Max Technology 3.0, which is a mouthful but also gives the best performing P-core an extra bump up to 5.6GHz so long as the processor is within power and thermal limits.

The increased core count also adds 7MB of additional L2 cache for the efficiency cores to use, further improving their performance over the 13700K's, as well as four additional processing threads for improved multitasking.

It has the same TDP of 125W and same Max Turbo Power rating of 253W as the 13700K, with the latter being the upper power limit of sustained (greater than one second) power draw for the processor. This ceiling can be breached, however, and processing cores can draw much more power in bursts as long as 10ms when necessary.
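Those two limits can be sketched as a simple model: a 125W base TDP, a 253W Maximum Turbo Power ceiling on sustained (greater than one second) draw, and sub-10ms bursts allowed to exceed it. The function below is an illustration of that description, not Intel's actual power-management logic:

```python
# Toy model of the power limits described above. Constants come from
# the spec figures in the text; the decision logic is a simplification.

TDP_W = 125         # base processor power
MAX_TURBO_W = 253   # upper limit on sustained draw

def within_limits(draw_w: float, duration_s: float) -> bool:
    """True if a given draw is allowed for its duration under this model."""
    if duration_s <= 0.010:      # bursts up to 10ms may exceed the ceiling
        return True
    return draw_w <= MAX_TURBO_W

print(within_limits(253, 5.0))    # sustained draw at the ceiling: allowed
print(within_limits(300, 0.005))  # 5ms burst above the ceiling: allowed
print(within_limits(300, 2.0))    # sustained draw above the ceiling: not allowed
```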

There is also support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as well as discrete Thunderbolt 5 wired connections, so there is a decent bit of future proofing in its specs.

  • Chipset & features score: 4 / 5

An Intel Core i7-14700K slotted into a motherboard

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Performance

  • Outstanding performance on par with the i9-13900K
  • Best gaming performance of any Intel processor
  • More power-hungry than its predecessor, so it also runs hotter

The Intel Core i7-14700K is arguably the best performing midrange processor on the market, coming within striking distance of the Core i9-13900K and Ryzen 9 7950X across most workloads, including very strong multi core performance thanks to the addition of four extra efficiency cores.

Synthetic benchmark results for the Intel Core i7-14700K

(Image credit: Future / Infogram)

The strongest synthetic benchmarks for the 14700K are single core workloads, which puts it effectively level with the Core i9-13900K and often beating the Ryzen 9 7950X and 7950X3D chips handily. 

This translates into better dedicated performance rather than multitasking, but even there the Core i7-14700K does an admirable job keeping pace with chips with much higher core counts.

Creative benchmarks for the Intel Core i7-14700K

(Image credit: Future / Infogram)

In creative workloads, the 14700K also performs exceptionally well, beating out the 13900K on everything except 3D model rendering, a task rarely handed to a CPU anyway, since even the best cheap graphics cards can process Blender or V-Ray 5 workloads many times faster than even the best CPU can.

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)

In gaming performance, the Core i7-14700K scores a bit of an upset over its launch sibling, the i9-14900K, besting it in gaming performance overall, though it has to be said that it got some help from a ridiculously high average fps in Total War: Warhammer III's battle benchmark. In most cases, the i7-14700K came up short of the 13900K and 14900K, but not by much.

And while it might be tempting to write off Total War: Warhammer III as an outlier, one of the biggest issues with the Core i9s post-Alder Lake is that they are energy hogs and throttle under load quickly, pretty much by design.

In games like Total War: Warhammer III where there are a lot of tiny moving parts to keep track of, higher clock speeds don't necessarily help. When turbo clocks kick into high gear and cause throttling, the back-and-forth between throttled and not-throttled can be worse over the course of the benchmark than the cooler but consistent Core i7s, which don't have to constantly ramp up and ramp down. 

So the 14700K isn't as much of an outlier as it looks, especially since the 13700K also excels at Total War: Warhammer III, and it too beats the two Core i9s. Total War: Warhammer III isn't the only game like this, and so there are going to be many instances where the cooler-headed 14700K steadily gets the work done while the hot-headed i9-13900K and 14900K sprint repeatedly, only to effectively tire themselves out for a bit before kicking back up to high gear.
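The sprint-and-throttle dynamic above can be illustrated with a toy model: a hot chip alternating between boost and throttled clocks versus a cooler chip holding a steady middle clock. Every number here is made up for illustration, and work per tick is naively treated as proportional to clock speed:

```python
# Toy model of the sprint-and-throttle pattern described above.
# Clock values are illustrative, not measured; work per tick is
# treated as proportional to clock speed, a deliberate simplification.

def work_done(clocks_ghz):
    """Total work over a run, one clock sample per tick."""
    return sum(clocks_ghz)

hot  = [6.0, 6.0, 4.0, 4.0] * 5   # boost hard, then throttle to cool off
cool = [5.3] * 20                  # steady clocks, never throttling

print(work_done(hot), work_done(cool))  # the steady chip finishes ahead
```

Under these made-up numbers the steady chip does more total work, which is the shape of the benchmark behavior described above.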

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

The additional efficiency cores might not draw as much power as the performance cores, but the added power draw is still noticeable. The 14700K draws nearly 30W more than the 13700K, though it is still a far cry from the Core i9-13900K's power usage.

This additional power also means that the Core i7-14700K runs much hotter than its predecessor, maxing out at 100ºC, triggering the CPU to throttle on occasion. This is something that the i7-13700K didn't experience during my testing at all, so you'll need to make sure your cooling solution is up to the task here.

  • Performance: 4.5 / 5

An Intel Core i7-14700K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Verdict

  • Fantastic single-core performance
  • Intel's best gaming processor, and second overall behind the Ryzen 7 7800X3D
  • Best value of any midrange processor

Final benchmark results for the Intel Core i7-14700K

(Image credit: Future / Infogram)

Ultimately, the Intel Core i7-14700K is the best processor in the Raptor Lake Refresh line-up, offering very competitive performance for a better price than its predecessor, and a far better one than comparable chips one tier higher in the stack.

It's not without fault, though. It's not that much better than the i7-13700K, so everything I'm saying about the i7-14700K might reasonably apply to its predecessor as well. And honestly, the i7-14700K doesn't have too high a bar to clear to stand out from its launch siblings, so its performance might only look this good in comparison to the i9 and i5 standing behind it.

But, the numbers don't lie, and the Intel Core i7-14700K displays flashes of brilliance that set it apart from its predecessor and vault it into competition with the top-tier of CPUs, and that's quite an achievement independent of how the rest of Raptor Lake Refresh fares. 

A masculine hand holding an Intel Core i7-14700K

(Image credit: Future / John Loeffler)

Should you buy the Intel Core i7-14700K?

Buy the Intel Core i7-14700K if...

Don't buy it if...

Also Consider

If my Intel Core i7-14700K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i7-14700K

  • I spent nearly two weeks testing the Intel Core i7-14700K
  • I ran comparable benchmarks between this chip and rival midrange processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: ASRock X670E Steel Legend
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i7-14700K and its competition, primarily for productivity work, gaming, and content creation.

I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops. 

I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch line-up and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained; regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Core i9-14900K review: more of a Raptor Lake overclock than a refresh
4:00 pm |

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Intel Core i9-14900K: Two-minute review

The Intel Core i9-14900K is a hard chip to justify, which is a weird thing to say about a processor that is arguably the best Intel has ever put out.

With very little fanfare to herald its arrival following the announcement of Intel Meteor Lake at Intel Innovation in September 2023 (and confirmation that Intel Meteor Lake is coming to desktop in 2024), Intel's 14th-generation flagship processor cannot help but draw parallels to the 11th-gen Rocket Lake chips that immediately preceded Intel Alder Lake.

The Core i9-11900K was something of a placeholder in the market until Intel could launch Alder Lake at the end of 2021. Those processors featured a new hybrid architecture and a more advanced 10nm process that helped propel Intel back to the top of our best processor list, despite strong competition from AMD.

With Intel Raptor Lake Refresh, we're back in placeholder territory, unfortunately. The performance gains here are all but non-existent, so we are essentially waiting on Meteor Lake while the i9-14900K absolutely guzzles electricity and runs hot enough to boil water under just about any serious workload with very little extra performance over the Intel Core i9-13900K to justify the upgrade.

It's not that the Core i9-14900K isn't a great processor; again, it's unquestionably the best Intel processor for the consumer market in terms of performance. It beats every other chip I tested in most categories with the exception of some multitasking workflows and average gaming performance, both of which it comes in as a very close runner-up. On top of that, at $589, it's the same price as the current Intel flagship, the Intel Core i9-13900K (assuming the i9-14900K matches the i9-13900K's £699 / AU$929 sale price in the UK and Australia).

The problem for the Core i9-14900K is two-fold: you can still get the i9-13900K, and will be able to for a long while yet at a lower price, and the Intel Core i7-14700K offers performance so close to the 14th-gen flagship at a much lower price that the 14900K looks largely unnecessary by comparison. Essentially, if you've got an i7-13700K or i9-13900K, there is simply nothing for you here.

If you're on an 11th-gen chip or older, or you've got an AMD Ryzen processor and you're looking to switch, this chip will be the last one to use the LGA 1700 socket, so when Meteor Lake-S comes out in 2024 (or even Lunar Lake-S, due out at the end of 2024 or early 2025), you won't be able to upgrade to that processor with an LGA 1700 motherboard. In other words, upgrading to an LGA 1700 for this chip is strictly a one-shot deal.

The only people who might find this chip worth upgrading to are those currently using a 12th-gen processor who skipped the 13th-gen entirely, or someone using a 13th-gen Core i5 who wants that extra bit of performance and doesn't mind dropping $589 on a chip they might be upgrading from again in a year's time, which isn't going to be a whole lot of people.

Unfortunately, at this price, it'll be better to save your money and wait for Meteor Lake or even Lunar Lake to drop next year and put the $589 you'd spend on this chip towards the new motherboard and CPU cooler you'll need once those chips are launched.

An Intel Core i9-14900K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Price & availability

  • How much does it cost? US MSRP $589 (about £470/AU$855)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i9-14900K is available as of October 17, 2023, for a US MSRP of $589 (about £470/AU$855), which is the same as the Intel Core i9-13900K it is replacing. We don't have confirmation on UK and Australia pricing yet, though I've asked Intel for clarification and will update this review if and when I hear back from the company. If the 14900K keeps the same UK and Australia pricing as the Core i9-13900K, however, it'll sell for £699/AU$929 in the UK and Australia respectively.

Meanwhile, this is still cheaper than most of AMD's rival chips in this tier, the AMD Ryzen 9 7950X3D, AMD Ryzen 9 7950X, and AMD Ryzen 9 7900X3D, with only the AMD Ryzen 9 7900X coming in cheaper than the i9-14900K. 

This does make the Core i9-14900K the better value against these chips, especially given the level of performance on offer, but it's ultimately too close to the 13900K performance-wise to make this price meaningful, as a cheaper 13900K will offer an even better value against AMD's Ryzen 9 lineup.

  • Price score: 3 / 5

A masculine hand holding an Intel Core i9-14900K

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Specs & features

  • Faster clock speeds than i9-13900K
  • Some additional AI-related features

The Intel Core i9-14900K is the final flagship using Intel's current architecture, so it makes sense that there is very little in the way of innovation over the Intel Core i9-13900K.

Using the same 10nm Intel 7 process node as its predecessor and with the same number of processor cores (8 P-cores/16 E-cores), threads (32), and cache (32MB total L2 cache plus additional 36MB L3 cache), the only real improvement with the 14900K in terms of specs is its faster clock speeds.

All cores get a 0.2GHz increase to their base frequencies, while the P-core turbo boost clock increases to 5.6GHz and the E-core turbo clock bumps up to 4.4GHz from the 13900K's 5.4GHz P-Core turbo clock and 4.3GHz E-core turbo clock.

While those clock speeds are the official max turbo clocks for the two types of cores, the Core i9-14900K and Intel Core i7-14700K have something called Turbo Boost Max Technology 3.0, which increases the frequency of the best-performing core in the chip and gives it even more power within the power and thermal limits. That gets the Core i9-14900K up to 5.8GHz turbo clock on specific P-cores while active.

Additionally, an exclusive feature of the Core i9 is an additional Ludicrous-Speed-style boost called Intel Thermal Velocity Boost. This activates if there is still power and thermal headroom on a P-core that is already being boosted by the Turbo Boost Max Technology 3.0, and this can push the core as high as 6.0GHz, though these aren't typical operating conditions.

Both of these technologies are present in the 13900K as well, but the 14900K bumps up the maximum clock speeds of these modes slightly, and according to Intel, that 6.0GHz clock speed makes this the world's fastest processor. While that might technically be true, that 6.0GHz boost is so narrowly applied that in practical terms, the P-core boost clock is what you're going to see almost exclusively under load.
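The boost hierarchy above can be summarized in a short sketch. To be clear, this is illustrative decision logic, not Intel's actual firmware algorithm; the clock values are the ones quoted in this review, and the function and its parameters are my own shorthand.

```python
# Illustrative sketch of the 14900K's P-core boost tiers (not Intel's
# real algorithm): standard Turbo Boost, Turbo Boost Max 3.0 on the
# favored cores, and Thermal Velocity Boost when thermal headroom remains.
P_CORE_TURBO_GHZ = 5.6   # official max turbo clock for any P-core
TBM3_GHZ = 5.8           # Turbo Boost Max 3.0, favored P-cores only
TVB_GHZ = 6.0            # Thermal Velocity Boost, Core i9 exclusive

def p_core_boost(favored_core: bool, thermal_headroom: bool) -> float:
    """Return the max clock (GHz) a P-core can target under these tiers."""
    if favored_core and thermal_headroom:
        return TVB_GHZ      # rare: needs TBM 3.0 plus spare power/thermal budget
    if favored_core:
        return TBM3_GHZ
    return P_CORE_TURBO_GHZ

print(p_core_boost(False, False))  # 5.6 - what you'll see under most loads
print(p_core_boost(True, False))   # 5.8
print(p_core_boost(True, True))    # 6.0 - the headline figure
```

In practice the 6.0GHz tier is the opportunistic top of the stack, which is why the 5.6GHz figure is the one that matters under sustained load.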

The Core i9-14900K has the same 125W TDP as the 13900K and the same 253W maximum turbo power as well, though power draw in bursts of less than 10ms can go far higher.

If this reads like a Redditor posting about their successful overclocking setup, then you pretty much get what this chip is about. If you're looking for something innovative about this chip, I'll say it again, you're going to have to wait for Meteor Lake.

The Core i9-14900K also has support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as does the rest of the 14th-gen lineup, as well as support for discrete Thunderbolt 5, both of which are still a long way down the road.

The only other thing to note is that there have been some AI-related inclusions that are going to be very specific to AI workloads that almost no one outside of industry and academia is going to be running. If you're hoping for AI-driven innovations for everyday consumers, let's say it once more, with feeling: You're going to have to wait for—

  • Chipset & features score: 3.5 / 5

An Intel Core i9-14900K slotted into a motherboard

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Performance

  • Best-in-class performance, but only by a hair
  • Gets beat by AMD Ryzen 7 7800X3D and i7-14700K in gaming performance
  • Runs even hotter than the i9-13900K

Take any elite athlete who's used to setting records in their sport: sometimes they break their previous record by a lot, and sometimes it's by milliseconds or fractions of an inch. It's less sexy, but it still counts, and that's really what we get here with the Intel i9-14900K.

On pretty much every test I ran on it, the Core i9-14900K edged out its predecessor by single digits, percentage-wise, which is a small enough difference that a background application can fart and cause just enough of a dip in performance that the 14900K ends up losing to the 13900K. 

I ran these tests more times than I can count because I had to be sure that something wasn't secretly messing up my results, and they are what they are. The Core i9-14900K does indeed come out on top, but it really is a game of inches at this point.

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

Across all synthetic performance and productivity benchmarks, the Core i9-14900K comes out on top, with the notable exception of Geekbench 6.1's multi-core performance test, where the AMD Ryzen 9 7950X scores substantially higher, and the Passmark Performance Test's overall CPU score, which puts the AMD Ryzen 9 7950X and Ryzen 9 7950X3D significantly higher. Given that all 16 cores of the 7950X and 7950X3D are full-throttle performance cores, this result isn't surprising.

Other than that though, it's the 14900K all the way, with a 5.6% higher geometric average on single-core performance than the 13900K. For multi-core performance, the 14900K scores a 3.1% better geometric average, and in productivity workloads, it scores a 5.3% better geometric average than its predecessor.

Against the AMD Ryzen 9 7950X, the Core i9-14900K scores about 13% higher in single-core performance, about 1% lower in multi-core performance, and 5% better in productivity performance.
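For reference, the geometric averages quoted above use a geometric mean rather than a simple arithmetic mean, which keeps one outlier benchmark from dominating the aggregate. A minimal sketch of how such an uplift figure is derived (the scores here are made-up placeholders, not my actual results):

```python
import math

def geometric_mean(scores: list[float]) -> float:
    """nth root of the product of n scores; standard for benchmark aggregates."""
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

# Hypothetical single-core scores for two chips across three benchmarks
chip_a = [3100, 2200, 4400]
chip_b = [2950, 2100, 4150]

# Percentage uplift of chip A over chip B across the suite
uplift = geometric_mean(chip_a) / geometric_mean(chip_b) - 1
print(f"{uplift:.1%}")
```

The geometric mean is the usual choice when combining benchmarks with wildly different score scales, since multiplying a single result by some factor shifts the aggregate by the same factor regardless of which test it came from.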

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

Creative benchmarks reveal something of a mixed bag for the Core i9-14900K. In all cases, it beats its predecessor, by anywhere from 2.6% to as much as 10.9%. Against the AMD Ryzen 9 7950X and 7950X3D, the Core i9-14900K consistently loses out when it comes to rendering workloads like Blender and V-Ray 5, but beats the two best AMD processors by just as much in photo and video editing. And since 3D rendering almost always leans heavily on the GPU rather than the CPU, AMD's advantage here is somewhat muted in practice.

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)

Gaming is another area where Intel had traditionally done well thanks to its strong single-core performance over AMD, but all that flipped with the introduction of AMD's 3D V-Cache. 

While the Intel Core i9-14900K barely moves the needle from its predecessor's performance, it really doesn't matter, since the AMD Ryzen 7 7800X3D manages to ultimately score an overall victory and it's not very close. The Core i9-14900K actually manages a tie for fourth place with the Intel Core i7-13700K, with the Core i7-14700K edging it out by about 4 fps on average.

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

Of course, all this performance requires power, and lots of it. The Core i9-14900K pretty much matched the maximum recorded power draw of the Core i9-13900K, with less than a watt's difference between the two: 351.097W versus 351.933W, respectively.

The Core i9-14900K still managed to find a way to run hotter than its predecessor, however; something I didn't really think was possible. But there it is, the 14900K maxing out at 105ºC, three degrees hotter than the 13900K's max. It's the hottest I've ever seen a CPU run, and I'm genuinely shocked it was allowed to run so far past its official thermal limit without any overclocking on my part.

  • Performance: 3.5 / 5

A masculine hand holding an Intel Core i9-14900K

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Verdict

  • The best chip for dedicated performance like video editing and productivity
  • There are better gaming processors out there for cheaper
  • The Intel Core i7-14700K offers a far better value
Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

In the final assessment then, the Core i9-14900K does manage to win the day, topping the leaderboard by enough of a margin to be a clear winner, but close enough that it isn't the cleanest of wins. 

Overall, single-core and productivity performance are its strongest categories; it falters slightly in creative workloads and comes up short enough in gaming that it's not the chip I would recommend as a gaming CPU.

Like all Core i9s before it, the 14900K is the worst value of Intel's 14th-gen launch lineup, but it's a better value than its predecessor for the time being (though that advantage won't last very long at all), and it manages to be a better value proposition than the Ryzen 9 7950X and Ryzen 9 7950X3D while matching the Ryzen 7 7800X3D. All in all, not too bad for an enthusiast chip.

Still, the Intel Core i7-14700K is right there, and its superior balance of price and performance makes the Intel Core i9-14900K a harder chip to recommend than it should be.

Should you buy the Intel Core i9-14900K?

Buy the Intel Core i9-14900K if...

Don't buy it if...

Also Consider

If my Intel Core i9-14900K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i9-14900K

  • I spent nearly two weeks testing the Intel Core i9-14900K
  • I ran comparable benchmarks between this chip and rival flagship processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: ASRock X670E Steel Legend
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i9-14900K and its competition, using it mostly for productivity and content creation, with some gaming thrown in as well.

I used the standard battery of synthetic benchmarks I use for processor testing, and ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Arc A770 review: a great 1440p graphics card for those on a budget
4:00 pm | October 16, 2023

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Intel Arc A770: One-minute review

The Intel Arc A770 has had quite a journey since its release back on October 12, 2022, and fortunately, it has been a positive one for Intel despite a somewhat rocky start.

Right out the gate, I'll say that if you are looking for one of the best cheap graphics cards for 1440p gaming, this card definitely needs to be on your list. It offers great 1440p performance for most modern PC titles that most of us are going to be playing and it's priced very competitively against its rivals. 

Where the card falters, much like with my Intel Arc A750 review earlier this year, is with older DirectX 9 and DirectX 10 titles, and this really does hurt its overall score in the end. That's a shame, since for games released in the last five or six years, this card is going to surprise a lot of people who might have written it off even six months ago.

Intel's discrete graphics unit has been working overtime on its driver for this card, providing regular updates that continue to improve performance across the board, though some games benefit more than others. 

Naturally, a lot of emphasis is going to be put on more recently released titles. And even though Intel has also been paying attention to shoring up support for older games as well, if you're someone with an extensive back catalog of DX9 and DX10 titles from the mid-2000s that you regularly return to, then this is not the best graphics card for your needs. Nvidia and AMD drivers carry a long legacy of support for older titles that Intel will honestly never be able to match.

But if what you're looking for is the best 1440p graphics card to play the best PC games of the modern era but you're not about to plop down half a grand on a new GPU, then the Intel Arc A770 is going to be a very solid pick with a lot more to offer than many will probably realize.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Price & availability

  • How much is it? US MSRP for 16GB card: $349 (about £280/AU$510); for 8GB card: $329 (about £265/AU$475)
  • When was it released? It went on sale on October 12, 2022
  • Where can you buy it? Available in the US, UK, and Australia

The Intel Arc A770 is available now in the US, UK, and Australia, with two variants: one with 16GB GDDR6 VRAM and an official US MSRP of $349 (about £280/AU$510), and one with 8GB GDDR6 VRAM and an official MSRP of $329 (about £265/AU$475).

Those are the launch MSRPs from October 2022, of course, and the cards have come down considerably in price in the year since their release; you can get either card for about 20% to 25% less than that. This is important, since the Nvidia GeForce RTX 4060 and AMD Radeon RX 7600 are very close to the 16GB Arc A770 in terms of current prices, and they offer distinct advantages that will push many potential buyers toward those cards instead of the Arc.

But those decisions are not as cut and dried as you might think, and Intel's Arc A770 holds up very well against modern midrange offerings, despite really being a last-gen card. And, currently, the 16GB variant is the only 1440p card that you're going to find at this price, even among Nvidia and AMD's last-gen offerings like the RTX 3060 Ti and AMD Radeon RX 6750 XT. So for 1440p gamers on a very tight budget, this card fills a vital niche, and it's really the only card that does so.

  • Price score: 4/5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Design

  • Intel's Limited Edition reference card is gorgeous
  • Will fit most gaming PC cases easily
Intel Arc A770 Limited Edition Design Specs

Slot size: Dual slot
Length: 11.02 inches | 280mm
Height: 4.53 inches | 115mm
Cooling: Dual fan
Power Connection: 1 x 8-pin and 1 x 6-pin
Video outputs: 3 x DisplayPort 2.0, 1 x HDMI 2.1

The Intel Arc A770 Limited Edition that I'm reviewing is Intel's reference model that is no longer being manufactured, but you can still find some stock online (though at what price is a whole other question). 

Third-party partners include ASRock, Sparkle, and Gunnir. Interestingly, Acer also makes its own version of the A770 (the Acer Predator BiFrost Arc A770), the first time the company has dipped its toe into the discrete graphics card market.

All of these cards will obviously differ in terms of their shrouds, cooling solutions, and overall size, but as far as Intel's Limited Edition card goes, it's one of my favorite graphics cards ever in terms of aesthetics. If it were still easily available, I'd give this design five out of five, hands down, but most purchasers will have to opt for third-party cards which aren't nearly as good-looking, as far as I'm concerned, so I have to dock a point for that.

It's hard to convey from just the photos of the card, but the black finish on the plastic shroud of the card has a lovely textured feel to it. It's not quite velvety, but you know it's different the second you touch it, and it's something that really stands out from every other card I've reviewed.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

The silver trim on the card and the more subtle RGB lighting against a matte black shroud and fans bring a bit of class compared to the RGB-heavy graphics cards I typically see. The twin fans aren't especially loud (no more so than other dual-fan cards, at least), and the card feels thinner than most other similar cards I've reviewed and used, whether or not it actually is.

The power connector is an 8-pin and 6-pin combo, so you'll have a pair of cables dangling from the card which may or may not affect the aesthetic of your case, but at least you won't need to worry about a 12VHPWR or 12-pin adapter like you do with Nvidia's RTX 4000-series and 3000-series cards.

You're also getting three DisplayPort 2.0 outputs and an HDMI 2.1 output, which puts it in the same camp as Nvidia's recent GPUs, but it can't match AMD's recent move to DisplayPort 2.1, which enables faster 8K video output. As it stands, the Intel Arc A770 is limited to 8K@60Hz, just like Nvidia. Will you be doing much 8K gaming on a 16GB card? Absolutely not. As more 8K monitors arrive next year, it'd be nice to have an 8K desktop running at 165Hz, but that's a very speculative prospect at this point, so it's probably not anything anyone looking at the Arc A770 needs to be concerned about.

  • Design Score: 4 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Specs & features

  • Good hardware AI cores for better XeSS upscaling
  • Fast memory for better 1440p performance

Intel's Xe HPG architecture inside the Arc A770 introduces a whole other way to arrange the various co-processors that make up a GPU, adding a third, not very easily comparable set of specs to the already head-scratching differences between Nvidia and AMD architectures.

Intel breaks up its architecture into "render slices", which contain 4 Xe Cores, which each contain 128 shaders, a ray tracing processor, and 16 matrix processors (which are directly comparable to Nvidia's vaunted tensor cores at least), which handle graphics upsampling and machine learning workflows. Both 8GB and 16GB versions of the A770 contain eight render slices for a total of 4096 shaders, 32 ray processors, and 512 matrix processors.
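Those totals follow directly from the slice arithmetic; here's a quick sanity check of the figures above (the variable names are mine, not Intel's terminology):

```python
# Xe HPG building blocks for the Arc A770, per the figures quoted above
render_slices = 8
xe_cores_per_slice = 4
shaders_per_xe_core = 128
rt_units_per_xe_core = 1
matrix_engines_per_xe_core = 16

xe_cores = render_slices * xe_cores_per_slice  # 32 Xe cores

print(xe_cores * shaders_per_xe_core)         # 4096 shaders
print(xe_cores * rt_units_per_xe_core)        # 32 ray tracing units
print(xe_cores * matrix_engines_per_xe_core)  # 512 matrix engines
```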

The ACM-G10 GPU in the A770 runs at 2,100MHz base frequency with a 2,400MHz boost frequency, with a slightly faster memory clock speed (2,184MHz) for the 16GB variant than the 8GB variant's 2,000MHz. This leads to an effective memory speed of 16 Gbps for the 8GB card and 17.5 Gbps for the 16GB.

With a 256-bit memory bus, this gives the Arc A770 a much wider lane for high-resolution textures to be processed through, reducing bottlenecks and enabling faster performance when gaming at 1440p and higher resolutions thanks to a 512 GB/s and 559.9 GB/s memory bandwidth for the 8GB and 16GB cards, respectively.
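Those bandwidth figures follow from the memory clock, the effective data rate per pin, and the 256-bit bus. A quick sketch of the arithmetic, my own and not Intel's: I'm assuming the clock-to-data-rate relationship implied by the numbers above (effective rate = memory clock x 8), and the 16GB result lands at roughly 559 GB/s, with Intel's quoted 559.9 GB/s apparently derived from the rounded 17.5 Gbps figure.

```python
def gddr6_bandwidth(memory_clock_mhz: float, bus_width_bits: int) -> tuple[float, float]:
    """Return (effective per-pin rate in Gbps, total bandwidth in GB/s),
    assuming effective data rate = memory clock x 8 as implied above."""
    effective_gbps = memory_clock_mhz * 8 / 1000         # e.g. 2,000MHz -> 16 Gbps
    bandwidth_gbs = effective_gbps * bus_width_bits / 8  # bits -> bytes
    return effective_gbps, bandwidth_gbs

print(gddr6_bandwidth(2000, 256))  # 8GB card: (16.0, 512.0)
print(gddr6_bandwidth(2184, 256))  # 16GB card: ~17.5 Gbps, ~559 GB/s
```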

All of this does require a good bit of power, though, and the Arc A770 has a TDP of 225W, which is higher than most 1440p cards on the market today.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

As for the features all this hardware enables, there's a lot to like here. The matrix cores are leveraged to great effect by Intel's XeSS graphics upscaling tech, found in a growing number of games, and this hardware advantage generally lets it outperform AMD's FSR 2.0, which is strictly a software-based upscaler.

XeSS does not have frame generation though, and the matrix processors in the Arc A770 are not nearly as mature as Nvidia's 3rd and 4th generation tensor cores found in the RTX 3000-series and RTX 4000-series, respectively.

The Arc A770 also has AV1 hardware-accelerated encoding support, meaning that streaming videos will look far better than those with only software encoding at the same bitrate, making this a compelling alternative for video creators who don't have the money to invest in one of Nvidia's 4000-series GPUs.

  • Specs & features: 3.5 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Performance

  • Great 1440p performance
  • Intel XeSS even allows for some 4K gaming
  • DirectX 9 and DirectX 10 support lacking, so older games will run poorly
  • Resizable BAR is pretty much a must

At the time of this writing, Intel's Arc A770 has been on the market for about a year, and I have to admit, had I gotten the chance to review this card at launch, I would probably have been as unkind as many other reviewers were.

As it stands though, the Intel Arc A770 fixes many of the issues I found when I reviewed the A750, but some issues still hold this card back somewhat. For starters, if you don't enable Resizable BAR in your BIOS settings, don't expect this card to perform well at all. It's an easy enough fix, but one that is likely to be overlooked, so it's important to know that going in.

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

In synthetic benchmarks, the A770 performed fairly well against the current crop of graphics cards, despite effectively being a last-gen card. It puts up particularly strong competition against the Nvidia RTX 4060 Ti across multiple workloads, and it even beats the 4060 Ti in a couple of tests.

Its Achilles' heel, though, is revealed in the PassMark 3D Graphics test. Whereas 3DMark tests DirectX 11 and DirectX 12 workloads, PassMark's test also runs DirectX 9 and DirectX 10 workflows, and here the Intel Arc A770 simply can't keep up with AMD and Nvidia.

Non-ray traced, non-upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

In non-ray-traced and native-resolution gaming benchmarks, the Intel Arc A770 managed to put up some decent numbers against the competition. At 1080p, the Arc A770 manages an average of 103 fps with an average minimum fps of 54. At 1440p, it averages 78 fps, with an average minimum of 47, and even at 4K, the A770 manages an average of 46 fps, with an average minimum of 27 fps.
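Assuming the quoted figures are straight means across the test suite at each resolution, the aggregation looks like this. The per-title numbers below are illustrative placeholders, not the actual test data:

```python
# Aggregate per-title benchmark results into "average fps" and
# "average minimum fps" figures. Placeholder data, not the real suite.
results_1080p = {
    "Title A": {"avg": 120, "min": 61},
    "Title B": {"avg": 95, "min": 50},
    "Title C": {"avg": 94, "min": 51},
}

def summarize(results: dict) -> tuple[int, int]:
    """Return (mean of average fps, mean of minimum fps) across titles."""
    n = len(results)
    mean_avg = sum(r["avg"] for r in results.values()) / n
    mean_min = sum(r["min"] for r in results.values()) / n
    return round(mean_avg), round(mean_min)

print(summarize(results_1080p))  # (103, 54)
```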

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

Turn on ray tracing, however, and these numbers understandably tank, as they do for just about every card below the RTX 4070 Ti and RX 7900 XT. Still, even here, the A770 manages an average of 41 fps (with an average minimum of 32 fps) at 1080p with ray tracing enabled, which is technically still playable. Once you move up to 1440p and 4K, however, your average title isn't going to be playable at native resolution with ray tracing enabled.

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

Enter Intel XeSS. When set to Balanced, XeSS turns out to be a game-changer for the A770, lifting it to an average framerate of 66 fps (with an average minimum of 46 fps) at 1080p, an average of 51 fps (average minimum 38 fps) at 1440p, and an average of 33 fps (average minimum 26 fps) at 4K with ray tracing maxed out.

While the 26 fps average minimum at 4K means it's really not playable at that resolution even with XeSS turned on, settings tweaks or more modest ray tracing could probably bring that up into the low-to-mid 30s, making 4K games playable on this card with ray tracing turned on. 
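Part of why Balanced upscaling helps so much is that the GPU renders internally at a fraction of the output resolution and XeSS reconstructs the rest. Intel's Balanced preset is commonly documented as a roughly 1.7x per-axis scale factor; treat that figure as an assumption in this sketch:

```python
def xess_internal_res(out_w: int, out_h: int, scale: float = 1.7) -> tuple[int, int]:
    """Internal render resolution for a given XeSS per-axis scale factor.

    The 1.7x default matches the commonly cited Balanced preset; verify
    against Intel's XeSS documentation before relying on it.
    """
    return round(out_w / scale), round(out_h / scale)

# At 4K output with the Balanced preset, the GPU only shades roughly:
print(xess_internal_res(3840, 2160))  # (2259, 1271)
```

So at "4K", the card is shading an image closer to 1440p and letting the upscaler fill in the rest, which is where the framerate headroom comes from.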

That's something the RTX 4060 Ti can't manage thanks to its smaller frame buffer (8GB VRAM), and while the 16GB RTX 4060 Ti could theoretically perform better (I have not tested the 16GB so I cannot say for certain), it still has half the memory bus width of the A770, leading to a much lower bandwidth for larger texture files to pass through.

This creates an inescapable bottleneck that the RTX 4060 Ti's much larger L2 cache can't adequately compensate for, and so takes it out of the running as a 4K card. When tested, very few games managed to maintain playable frame rates even without ray tracing unless you dropped the settings so low as to not make it worth the effort. The A770 16GB, meanwhile, isn't technically a 4K card, but it can still dabble at that resolution with the right settings tweaks and still look reasonably good.
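The bus-width argument comes down to simple arithmetic: peak memory bandwidth is the bus width in bytes times the effective data rate. The rates below (about 17.5 Gbps for the A770 and 18 Gbps for the 8GB RTX 4060 Ti) are taken from public spec listings rather than this review, so double-check them against the vendors' own pages:

```python
def peak_bandwidth_gbps(bus_width_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes transferred per cycle x rate."""
    return bus_width_bits / 8 * rate_gbps

# Arc A770: 256-bit bus at ~17.5 Gbps effective (spec-sheet figure)
print(peak_bandwidth_gbps(256, 17.5))  # 560.0 GB/s

# RTX 4060 Ti: 128-bit bus at ~18 Gbps effective (spec-sheet figure)
print(peak_bandwidth_gbps(128, 18))    # 288.0 GB/s
```

Halving the bus width roughly halves the bandwidth, which is the bottleneck a larger L2 cache can only partially hide once texture working sets outgrow it.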

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / John Loeffler)

All told, then, the Intel Arc A770 turns out to be a surprisingly good graphics card for modern gaming titles that can sometimes even hold its own against the Nvidia RTX 4060 Ti. It can't hold a candle to the RX 7700 XT or RTX 4070, but it was never meant to, and given that those cards cost substantially more than the Arc A770, this is entirely expected.

Its maximum observed power draw of just under 192W is pretty high for the kind of card the A770 is, but it's not the most egregious offender in that regard. All that power also made the card harder to keep cool, with its maximum observed temperature hitting about 74°C.

Among all the cards tested, the Intel Arc A770 landed near the bottom of the list alongside the RX 6700 XT, so the picture for this card might have been very different had it launched three years ago and had to compete exclusively with the RTX 3000-series and RX 6000-series. In the end, this card performs like a last-gen card, because it is. 

Despite that, it still manages to be a fantastic value on the market right now given its low MSRP and fairly solid performance, rivaling the RTX 4060 Ti on the numbers. In reality though, with this card selling for significantly less than its MSRP, it is inarguably the best value among midrange cards right now, and it's not even close.

  • Performance score: 3.5 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Should you buy the Intel Arc A770?

Buy the Intel Arc A770 if...

Don't buy it if...

Also Consider

If my Intel Arc A770 review has you considering other options, here are two more graphics cards for you to consider.

How I tested the Intel Arc A770

  • I spent several days benchmarking the card, with an additional week using it as my primary GPU
  • I ran our standard battery of synthetic and gaming benchmarks 
Test Bench

These are the specs for the test system used for this review:
CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks with the Intel Arc A770 in total, with a little over half that time using it as my main GPU on my personal PC. I used it for gaming, content creation, and other general-purpose use with varying demands on the card.

I focused mostly on synthetic and gaming benchmarks since this card is overwhelmingly a gaming graphics card. Though it does have some video content creation potential, it's not enough to dethrone Nvidia's 4000-series GPUs, so it isn't a viable rival in that sense and wasn't tested as such.

I've been reviewing computer hardware for years now, with an extensive computer science background as well, so I know how graphics cards like this should perform at this tier.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

  • First reviewed October 2023
Samsung Exynos 2400 detailed – 70% faster CPU, Xclipse 940 GPU with AMD RDNA 3 graphics
11:53 am | October 6, 2023

Author: admin | Category: Mobile phones news | Tags: , | Comments: Off

Samsung Semiconductor held its System LSI Tech Day event in San Jose, California where we got an early glimpse of the upcoming Exynos 2400 flagship mobile chipset. Exynos 2400 will likely arrive on the Galaxy S24 series, serving as a successor to Exynos 2200 with promises of a massive leap in performance. Samsung claims Exynos 2400 offers a 1.7x increase in CPU performance and a 14.7x boost in AI performance compared to the Exynos 2200. Samsung introduced a new AI tool designed for upcoming smartphones with updated text-to-image AI generation which was demoed on stage. Exynos...

AMD Ryzen 9 7900X3D review: a fantastic premium performer, but its price holds it back
8:17 pm | September 13, 2023

Author: admin | Category: Computers Gadgets | Tags: , , , | Comments: Off

AMD Ryzen 9 7900X3D: One-minute review

The AMD Ryzen 9 7900X3D is the middle child of the current 3D V-Cache processors from Team Red alongside the 7800X3D and the 7950X3D. It launched alongside the rest of the line back in February of this year and offers heightened gaming performance, but comes at a price. 

Without a doubt, it is one of the best processors for gaming on the market. But even though gamers are going to get the most out of this chip, its productivity performance isn't too bad either. 

Armed with a significantly lower TDP than the rest of the current AMD Zen 4 lineup, the AMD Ryzen 9 7900X3D packs in 12 cores and 24 threads on a 120W TDP with a base clock speed of 4.4 GHz out of the box, and that's honestly the core appeal of this chip. 

It's more power efficient and offers better raw gaming performance than its non-3D counterpart, but the addition of AMD's 3D V-cache means it can hold up with far pricier processors as well. 

It should be stated that, overall, buyers of the AMD Ryzen 9 7900X3D fall into one of two camps: it is impressive for gaming, but it won't necessarily set the world on fire on the creativity or productivity side of things at the higher end of the spectrum. 

The raw gaming performance at its $599 / £479.99 / AU$859.99 price point is decent, but if you're spending this much on a CPU purely for gaming, you could argue that an extra $100 / £130 / AU$279 for the top-end 7950X3D would be a better bet. 

AMD Ryzen 9 7900X3D: Price and availability

  • Comparable price to the Intel Core i9-13900K
  • $50 /  £50 / AU$64 more than base 7900X

The AMD Ryzen 9 7900X3D was released on February 28, 2023, and currently retails for $599 / £479.99 / AU$859.99. 

That's around $100 / £130 / AU$279 less than the flagship AMD Ryzen 9 7950X3D which features 16 cores and 32 threads. As a point of comparison, this AMD processor comes in a little cheaper than the Intel Core i9-13900K in the UK and Australia, where it currently sells at £699 / AU$929, and is just $10 more expensive in the US. 

That is only one side of the story, though. That's because the AMD Ryzen 9 7900X3D requires an upgrade to the latest AM5 socket, which means an entirely new motherboard as well as the exclusive use of DDR5 RAM, and the best DDR5 RAM isn't cheap (even if it has come down in price). 

Essentially, you'll be building an entirely new system around the chip as there's no more backward compatibility with AM4 as we saw with the two previous Ryzen processor generations (though the best CPU coolers for AM4 processors will still work with the new AMD chips). 

This is owing to AMD's transition from a PGA to an LGA socket, which means that the processor no longer has pins the way previous generations did, much like the best Intel processors.

  • Price score: 3.5 / 5

AMD Ryzen 9 7900X3D: Chipset & features

Close up on the Ryzen 9 7900X3D

(Image credit: Future)
  • Improved power efficiency 
  • Zen 4 3D V-cache for under $600 / £500 / AU$900

The AMD Ryzen 9 7900X3D features a lot of the same broad strokes as its non-3D variant. You're getting the same 12 cores and 24 threads on the AM5 socket with a total boost clock of up to 5.6GHz. The core difference here, however, is the 3D V-Cache which doubles the stock version's 64MB L3 Cache for a total of 128MB. 

The larger the L3 cache, the better gaming and other intensive processing workloads can perform, because it's the largest level of cache available on a processor before main memory.

Added cache aside, the AMD Ryzen 9 7900X3D is also significantly more power-efficient than any current non-3D Zen 4 processors available, as it clocks in with a Thermal Design Power (TDP) of 120W, which is much lower than the substantially higher 170W of its stock variant. 

While a higher TDP usually relates to higher performance, the inclusion of the added 3D V-cache means that the processor can access a larger pool of superfast cache memory, which is even more useful when gaming than just throwing raw power at the problem. With its own dedicated extra cache, there are fewer fetch operations to the PC's main memory, so the chip runs more efficiently, and potentially cooler under load. 
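The benefit of extra cache can be framed with the textbook average-memory-access-time model: a bigger L3 lowers the miss rate, so fewer accesses pay the full round trip to DRAM. The latencies and miss rates below are purely illustrative, not measured values for these chips:

```python
def amat_ns(l3_hit_ns: float, l3_miss_rate: float, dram_penalty_ns: float) -> float:
    """Average memory access time at the L3 level: hit cost plus the
    miss-rate-weighted cost of going out to main memory."""
    return l3_hit_ns + l3_miss_rate * dram_penalty_ns

# Illustrative numbers only: doubling L3 might cut the miss rate like so.
stock_64mb   = amat_ns(10, 0.30, 80)  # -> about 34.0 ns
vcache_128mb = amat_ns(10, 0.18, 80)  # -> about 24.4 ns
print(stock_64mb, vcache_128mb)
```

Even a modest drop in miss rate translates into a meaningfully lower average latency, which is why cache-hungry game engines respond so well to 3D V-Cache.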

This is reflected in the clock speeds: the AMD Ryzen 9 7900X has a base clock of 4.7 GHz against the 3D variant's 4.4 GHz. It's a little slower out of the box despite the overclocking potential being the same; however, the AMD Ryzen 9 7900X3D is still far faster than any of the current Alder Lake or Raptor Lake processors in terms of raw speed. 

Ultimately, the reduced memory latency means that you're getting a chip that runs cooler, draws less power, and performs better thanks to the addition of the second generation of AMD's V-cache. 

  • Design & features score: 4 / 5

AMD Ryzen 9 7900X3D: Performance

AMD Ryzen 9 7900X3D up close

(Image credit: Future)

You won't be shocked to hear that the AMD Ryzen 9 7900X3D is one of the most capable CPUs for gaming that I've ever used, holding its own against the flagship 7950X and the Intel Core i9-13900K. 

This is evidenced by some of the most impressive synthetic scores to date in industry-standard programs such as GeekBench 6, PCMark10, and Cinebench R23, among others, and you can see how the Ryzen 9 7900X3D compares to competing high-end processors below. 

Where the AMD Ryzen 9 7900X3D falls behind the Intel Core i9-13900K and the 7950X3D in the productivity benchmarks, the gap closes considerably in raw gaming performance. Turning to the gaming benchmarks, this chip's 3D V-Cache makes all the difference in demanding titles such as F1 2022, Returnal, and Total War: Warhammer 3.

As with our other CPU reviews, the games tested for the AMD Ryzen 9 7900X3D review are run at 1080p at the lowest graphics settings in order to isolate the processor's contribution to gaming performance. Below, you can see how this chip compares to the best AMD processors and best Intel processors respectively. 

Compared to the more expensive chips, the AMD Ryzen 9 7900X3D absolutely holds its own with the 7950X3D and the 13900K, with the largest gap seen in how AMD's flagship handles Returnal. This is likely because the 7950X3D has an additional four cores and eight threads, and the Total War series has always been Intel's strongest gaming benchmark, which remains the case here. 

Still, with the Ryzen 9 7900X3D, we're still talking about an absolute powerhouse of a CPU, with framerates well above 100fps in demanding games, and upwards of 400fps in tamer titles. Realistically, you can expect this chip to be an absolute behemoth for 1080p, though you'll get diminishing returns at 1440p and 4K if you don't have the beefiest video card in your rig that can keep up with the processor. 

Overall, the AMD Ryzen 9 7900X3D is an impressive processor for the money which is definitely geared more toward gaming than productivity or creativity tasks. If you're purely interested in playing games then this processor offers strong price-to-performance at the $600 / £480 / AU$860 mark, but with the Ryzen 9 7950X3D so close in price, a lot of buyers out there are likely to be torn. 

  • Performance score: 4 / 5

Should you buy the AMD Ryzen 9 7900X3D?

Buy it if...

Don't buy it if...

Also Consider

If my AMD Ryzen 9 7900X3D review has you considering other options, here are two more processors to consider.

Intel Core i9-13900K
There's very little that we can fault the Raptor Lake flagship on with its performance. That's due to the excellent Raptor Cove and Gracemont cores with its hybrid architecture that makes it a processor that's difficult to beat outside of its expensive price point. 

Read the full 5-star Intel Core i9-13900K review

How I tested the AMD Ryzen 9 7900X3D

  • Used in main gaming PC rig for almost a month 
  • Played a variety of titles including those benchmarked 
  • Industry standard synthetic benchmark tests 
Test system specs

CPU cooler: NZXT Kraken Elite 360
GPU: Nvidia RTX 4090
DDR5 RAM: 32GB (2 x 16GB) Kingston Fury Beast RGB @ 6,000 MHz
Motherboard: Gigabyte X670 Aorus Elite AX
SSD: Seagate FireCuda 530 2TB
PSU: Corsair RM1000x
Case: NZXT H9 Flow

I tested the AMD Ryzen 9 7900X3D inside of a newly built machine utilizing Kingston Fury Beast DDR5 RAM, an Nvidia RTX 4090, and a brand new RM1000X PSU. The chip was utilized heavily for gaming in the benchmarked titles as well as in games such as Mortal Kombat 11, Cyberpunk 2077, and Tekken 7.

I've also been using the machine as my main computer for both work and play and have racked up dozens of hours word processing as well as with media playback. Through the real-world testing, the benchmarking, and the stress testing, I came to my four-star conclusion on the AMD Ryzen 9 7900X3D as a recommended CPU for gaming.


First reviewed September 2023

AMD Radeon RX 7800 XT review: pulling an otherwise knockout, midrange punch
12:10 am | September 9, 2023

Author: admin | Category: Computers Gadgets | Tags: , , , | Comments: Off

AMD Radeon RX 7800 XT: Two-minute review

To say I've been looking forward to the AMD Radeon RX 7800 XT for over a year is an understatement, and if I were to judge this card on its merits, I have to say that this is easily one of the best graphics card releases we've gotten out of this generation. My heart, though, knows that it should have been even better, so I can't help but feel slightly disappointed.

Released right on the heels of Labor Day here in the US, getting this card properly tested was obviously going to be a heavy lift, so when my preliminary benchmark numbers showed it edging out the Nvidia GeForce RTX 4070 by about 2% overall (while not getting as badly crushed by Nvidia's midrange rival in ray-tracing performance as during the previous generation), I figured this card was going to be an easy one to review.

Coming in at $499.99 (about £380/AU$725) compared to the RTX 4070's MSRP of $599.99 (about £460/AU$870), that roughly 17% price difference in AMD's favor is going to make a world of difference for a lot of gamers out there looking to upgrade to a current-gen midrange card.

In addition to fantastic 1440p gaming performance and even very respectable 4K gaming performance (thanks in no small part to the 16GB VRAM and 256-bit memory bus), ray tracing performance has gotten better as AMD's ray accelerators have improved and a host of new anti-latency and upscaling features make this pretty much the best 1440p graphics card on the market, hands down.

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)

So why does my heart ache having done a very intense week's worth of testing on this card?

Well, the single biggest negative in this card's column is that there is very little gen-on-gen improvement in terms of its rasterization performance over the AMD Radeon RX 6800 XT. 

Now the RX 7800 XT does have things that the RX 6800 XT doesn't have, namely AI accelerator cores that can power more advanced AI workloads like upscaling and other generative AI processes, and the 7800 XT does feature much better ray tracing performance than its predecessor, so calling these cards essentially the same would be factually and substantively wrong.

But rasterization is AMD Radeon's bread-and-butter, and by that metric, you only really get about 12% and 5% better gaming performance at 1080p and 1440p, respectively, and there's essentially no difference at 4K. If you don't care about ray tracing or running Stable Diffusion-like AI models (which you're likely to use Nvidia hardware for anyway), then this card is going to feel much more like a refresh of the RX 6800 XT, or even the RX 6850 XT that we didn't get a year ago.

And for that, the RX 7800 XT leaves me somewhat disappointed. If you aren't upgrading from an RX 6800 XT (which you shouldn't be doing even if this card was a true gen-on-gen successor like the fantastic AMD Radeon RX 7700 XT is to the AMD Radeon RX 6700 XT), then none of this is really going to matter to you. 

I'd still tell you to buy the RX 7800 XT over the RX 6800 XT and even the RTX 4070, without question, but there's no getting around the fact that the AMD Radeon RX 7800 XT misses its shot at being truly magnificent.

AMD Radeon RX 7800 XT: Price & availability

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)
  • How much does it cost? $499.99 (about £380/AU$725)
  • When is it available? Available September 6, 2023
  • Where can you get it? Available in the US, UK, and Australia

The AMD Radeon RX 7800 XT went on sale on September 6, 2023, starting at $499.99 (about £380/AU$725), which makes it about 23% cheaper than the RX 6800 XT was when it launched in 2020, and $100 cheaper than its direct competitor, the Nvidia RTX 4070.

It's also just $50 more expensive than the RX 7700 XT that it launched alongside, so anyone looking at the RX 7700 XT might be better served by buying the RX 7800 XT instead since you'll get better performance and extra VRAM without spending a whole lot more money.

AMD Radeon RX 7800 XT: Specs

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)

AMD Radeon RX 7800 XT: Design

Unlike the RX 7700 XT, the AMD Radeon RX 7800 XT does have a reference card, and it'll look familiar to anyone who's been looking at AMD cards this generation. Opting for a two-fan cooling solution, this dual-slot card looks a lot like the AMD Radeon RX 7600 would if you stretched the card lengthwise. 

It's not a long card either, measuring 267mm, or about 10.5 inches, so you shouldn't have any issues getting this card to fit inside a mid-tower case or larger. You might even be able to squeeze it into some tighter-fitting cases as well, but that'll depend on the case itself and what version of the RX 7800 XT you end up getting (third-party versions will vary in size and will likely be longer).

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)

The reference model of the card features three DisplayPort 2.1 outputs along with an HDMI 2.1 port, so it'll be more than capable of powering the best 4K monitors with ease, along with the various sizes and resolutions of the best gaming monitors on the market.

What it doesn't have, however, is a USB-C output, so if you have one of the best USB-C monitors (which are common in creative industries), you'll likely need to pick up an adapter if you plan on slotting this card into a workstation.

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)

You'll also only need two free 8-pin power connectors, so there's no 12VHPWR cable like on Nvidia's competing cards. The card is fairly solid with a decent amount of weight, so you'll definitely need a support bracket if you're slotting this directly into a motherboard's PCIe slot.

Overall, the appearance is the same no-fuss, no-bling aesthetic we've gotten from AMD's RDNA 3 reference cards this generation, so if you want that RGB look, you're better off with a third-party card, but otherwise it's a lovely card to look at and won't be the shame of anyone's PC case.

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)

AMD Radeon RX 7800 XT: Chipset & features

The Navi 32 GPU in the AMD Radeon RX 7800 XT is the full version of the chip compared to the slightly trimmed-down GPU powering the RX 7700 XT, with an additional 6 compute units over the RX 7700 XT's 54, giving the RX 7800 XT an additional 384 shaders, 6 ray accelerators, and 12 AI accelerators.

The RX 7800 XT has a fairly low base clock of 1,295 MHz, compared to the RX 7700 XT's 1,700 MHz, but the RX 7800 XT's boost clock runs as high as 2,430 MHz (compared to the RX 7700 XT's 2,544 MHz).

This means that even though the RX 7800 XT has slightly more compute units, everything is running slightly slower, which goes a long way to explaining the relatively close levels of performance between the two GPUs.

The RX 7800 XT does feature 16GB of VRAM on a large 256-bit memory bus, with a memory clock of 2,425 MHz for an effective 19.4 Gbps. That's slower than the RTX 4070's 21 Gbps effective memory speed, but the wider bus and the larger frame buffer offered by the extra 4GB of VRAM really highlight where Nvidia went wrong this generation with lower VRAM and narrower buses, whereas AMD generally got the memory question right.
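The bandwidth gap is straightforward to quantify: peak bandwidth is the bus width in bytes times the effective rate. The 256-bit/19.4 Gbps figures come from this review; the RTX 4070's 192-bit bus width is taken from public spec listings and is worth double-checking:

```python
def peak_bandwidth_gbps(bus_width_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x effective rate."""
    return bus_width_bits / 8 * rate_gbps

# RX 7800 XT: 256-bit bus at 19.4 Gbps effective
print(peak_bandwidth_gbps(256, 19.4))  # 620.8 GB/s
# RTX 4070: 192-bit bus (per public specs) at 21 Gbps effective
print(peak_bandwidth_gbps(192, 21))    # 504.0 GB/s
```

Despite the slower memory clock, the wider bus leaves the RX 7800 XT with meaningfully more raw bandwidth, which matters most at 4K where texture traffic is heaviest.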

Finally, the TGP on the RX 7800 XT is a rather high 263W, compared to the 200W RTX 4070, but this is still less than the RX 6800 XT's 300W TGP, so there's progress at least.

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)

AMD Radeon RX 7800 XT: Performance

And here is where the AMD Radeon RX 7800 XT impresses the most, even as it breaks my heart: performance.

I'll start with the good news for AMD here, which is that the RX 7800 XT by and large scores even with the RTX 4070 in synthetic tests and gameplay performance, while faltering rather badly in creative workloads, which is pretty much expected given the Nvidia CUDA instruction set's dominance in all things creative.

Benchmark results for the AMD Radeon RX 7800 XT

(Image credit: Future / Infogram)

On the synthetic side, the AMD Radeon RX 7800 XT outperforms the RTX 4070 by about 2% overall, with rasterization workloads being its breakout strength, while Nvidia's ray tracing capabilities continue to outperform AMD's. It's worth noting, though, that the RX 7800 XT does a lot to close the gap here: Nvidia's advantage is only about 15% at best in 3DMark Speed Way and just 6% in Port Royal. 

Meanwhile, the RX 7800 XT manages to score 25% better in 3DMark Firestrike Ultra, showing it to be a much better 4K card than the RTX 4070 thanks to the additional VRAM, a level of performance that is replicated in our gaming tests.

Benchmark results for the AMD Radeon RX 7800 XT

(Image credit: Future / Infogram)

When not using any upscaling tech, the RX 7800 XT performs on average 15% better than the RTX 4070 without ray tracing at 1080p (and just 4% worse with ray tracing at max settings), 6% better at 1440p (16% worse with max ray tracing), and 17% better at 4K (though about 25% worse at 4K with ray tracing on).
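For readers curious how averages like these are derived, a percentage comparison is just the relative frame-rate difference between the two cards, averaged across the test suite. A minimal sketch of that arithmetic (the fps figures below are made up for illustration, not the actual benchmark data from this review):

```python
# Compute the average % performance advantage of one card over another
# across a suite of games. The fps numbers here are hypothetical
# placeholders, NOT the review's actual benchmark results.

def pct_advantage(fps_a: float, fps_b: float) -> float:
    """Percent by which card A outperforms card B (negative = slower)."""
    return (fps_a - fps_b) / fps_b * 100

# Hypothetical 1440p rasterization results: {game: (RX 7800 XT fps, RTX 4070 fps)}
results = {
    "Game A": (112.0, 104.0),
    "Game B": (96.0, 92.0),
    "Game C": (140.0, 133.0),
}

per_game = [pct_advantage(a, b) for a, b in results.values()]
average = sum(per_game) / len(per_game)
print(f"Average advantage: {average:.1f}%")
```

Note that averaging per-game percentage deltas (rather than averaging raw fps) keeps a single very high-fps title from dominating the result.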

FSR 2 can't hold a candle to DLSS 3 when ray tracing, but in non-RT gameplay, the RX 7800 XT with FSR 2 actually comes out well ahead across all resolutions when both upscalers are set to balanced, delivering 53%, 21%, and 19% better performance at 1080p, 1440p, and 4K, respectively.

Turning on ray tracing pretty much reverses the situation, with the RTX 4070 getting as much as 47%, 16%, and 12% better performance at 1080p, 1440p, and 4K, respectively.

In short, if you're planning on gaming without ray tracing, there is no question that between the RX 7800 XT and RTX 4070, the RX 7800 XT is the card you'll want to buy. 

Here, as well, the RX 7800 XT manages to outperform its predecessor, the RX 6800 XT, by about 15%, which isn't awful, but gamers hoping for a much larger generational improvement (such as myself) will be disappointed. A 15% average FPS uplift is acceptable for a budget card like the RX 7600.

For a nearly $500 graphics card, though, I'd have liked to see 25% to 33%, if I'm being honest, and that's where this card ultimately should have landed in a perfect world.

But ours is a fallen land, and we're not comparing this card against a Platonic ideal projected onto a cave wall; we're comparing it to the cards on the shelf that you have to pick between for your next upgrade.

If you can find the RX 6800 XT for more than 15% less than the RX 7800 XT, that might make the last-gen card the better buy. If that's not an option, though, and you're weighing the RTX 4070 against the RX 7800 XT like most gamers, the vast majority are going to get a better experience from the RX 7800 XT, especially since they'll have an extra $100 to buy themselves something else that's nice, as a treat.
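The "more than 15% less" rule of thumb above is just a fps-per-dollar comparison. A back-of-the-envelope sketch, using the RX 7800 XT's $499 list price and the ~15% performance deficit cited above (the RX 6800 XT street prices passed in are hypothetical examples):

```python
# Rough price/performance check: the last-gen RX 6800 XT only makes
# sense if its discount exceeds its ~15% average performance deficit.
# RX 6800 XT prices below are illustrative; plug in current street prices.

RX_7800_XT_PRICE = 499.0   # RX 7800 XT list price
PERF_DEFICIT = 0.15        # RX 6800 XT is ~15% slower on average

def better_buy(rx6800xt_price: float) -> str:
    """Compare normalized performance-per-dollar of the two cards."""
    value_7800 = 1.0 / RX_7800_XT_PRICE               # perf normalized to 1.0
    value_6800 = (1.0 - PERF_DEFICIT) / rx6800xt_price
    return "RX 6800 XT" if value_6800 > value_7800 else "RX 7800 XT"

print(better_buy(449.0))   # prints "RX 7800 XT"
print(better_buy(399.0))   # prints "RX 6800 XT"
```

The break-even price works out to 0.85 x $499, or about $424: above that, the newer card is the better value.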

An AMD Radeon RX 7800 XT on a table

(Image credit: Future / John Loeffler)

Should you buy the AMD Radeon RX 7800 XT?

Buy it if...

You want to play at 4K
This card has serious 4K gaming chops thanks to its 16GB VRAM and wide memory bus.

You don't want to completely sacrifice ray tracing
AMD is finally getting to the point where you can have both great rasterization and decent ray tracing performance.

Don't buy it if...

You want the best ray tracing and upscaling possible
If ray tracing and upscaling are your bag, then the RTX 4070 is going to be the better buy here.

AMD Radeon RX 7800 XT: Also consider

If my AMD Radeon RX 7800 XT review has you weighing other options, here are two more graphics cards to consider.

How I tested the AMD Radeon RX 7800 XT

  • I spent about a week with the RX 7800 XT
  • I focused mostly on gaming, since that is what AMD Radeon graphics cards are primarily used for
  • I used our standard battery of benchmark tests and personal gameplay experience
Test System Specs

These are the specs for the test system used for this review:

CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about a week extensively testing the RX 7800 XT, both in a test bench and as my personal gaming card at home.

I ran our standard battery of performance benchmarks, including 3DMark tests and various in-game gaming benchmarks, on the RX 7800 XT and various competing graphics cards from AMD and Nvidia to get a slate of comparable figures.

In addition to my extensive computer science education and years as a tech product reviewer, I've been a PC gamer my whole life, so I know what to look for and what to expect from a graphics card at this price point.

We pride ourselves on our independence and our rigorous review-testing process, offering long-term attention to the products we review and making sure our reviews are updated and maintained. Regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed September 2023
