Xreal Air 2 review: better in all but the most important way
3:00 pm | October 24, 2023

Author: admin | Category: Computers, Computing, Gadgets, Software, Virtual Reality & Augmented Reality | Comments: Off

Two-minute Xreal Air 2 review

The Xreal Air 2 and Xreal Air 2 Pro AR glasses are here to replace the original Xreal Air (formerly Nreal Air) AR specs – and they’re just as much fun to use as the originals.

Design-wise they keep everything that made the Xreal Air good – they're comfortable to wear for long stretches of time and look fairly stylish too – though the base model doesn't change much from what came before. The Pro model gains new Electrochromic Dimming lenses that switch between clear and shaded at the push of a button – and this feature is the only difference between the regular Xreal Air 2 glasses and the Pro.

There are more changes in the picture and audio department, thankfully. Both the Xreal Air 2 and Xreal Air 2 Pro have new 0.55-inch Sony micro-OLED displays that boast 25% higher peak brightness and improved color calibration. This allows them to produce vibrant colors and provide good contrast in dark scenes.

The AR glasses also deliver more immersive audio through their speakers; however, the bass is still a little weak and there's a not-insignificant amount of audio leakage from the open-ear speaker design. If you want to watch something in private you'll need some Bluetooth headphones.

Unfortunately, the main issue holding the original Xreal Air glasses back wasn’t their picture quality or audio, it was their value, and the Xreal Air 2 and Air 2 Pro don’t address this issue.

I think the specs are a delight to use, but the $399 / £399 price tag – or $449 / £449 for the Xreal Air 2 Pro – is a massive turn-off. 

All you get for this high price is a wearable display for some laptops, phones, and handheld game consoles. If you want a more worthwhile experience from the AR glasses you need to buy an Xreal Beam and a handful of add-on adapter cables, and these can add the best part of $200 / £200 to the total cost.

Hamish Hector wearing the Xreal Air 2 Pro glasses

Here's what the AR glasses look like on my face (Image credit: Future)

At this price, you'll have spent more than you would have on something like the Meta Quest 3 (which I gave five stars in our Meta Quest 3 review). Yes, a VR headset isn't the same as these Xreal glasses, but it's an awesome XR wearable and I think most people would find it offers far more bang for their buck – unlike the Xreal specs it isn't just a display, and the Quest 3 works without you needing to pick up a bunch of not-so-optional extras.

If you have a spare $399 / £399 lying around and you want to pick up some fun tech then you could do a lot worse than the Xreal Air 2, but for the money, you could do better too.

Xreal Air 2: Price and availability

The Xreal Air 2 and Air 2 Pro are available to preorder for $399 / £399 and $449 / £449 respectively from Xreal.com and Amazon. There’s no firm release date yet, but Xreal has told us they’ll be shipping to US and UK customers in November.

The two models are fairly similar, but the cheaper Xreal Air 2 loses out on Electrochromic Dimming. This exclusive Xreal Air 2 Pro feature lets you dim the lenses between three presets: fully transparent, so you can see the real world more clearly; a blacked-out immersive mode where the real world is (almost) entirely blocked out; or a halfway point between the two.

It’s certainly neat but an optional cover that comes with both models is still the best option for blocking out visual distractions while you try to immerse yourself in what you’re watching. As such, spending an extra $50 / £50 on the Pro glasses isn’t going to be necessary for most people.

Xreal Air 2: Design

  • Comfortable, lightweight design
  • Improved lens cover
  • New color options

I’ve tested a fair few AR smart glasses like the Xreal Air 2 and Air 2 Pro, and based on my experience they’re one of the better-designed options on the market.

The inner displays are shown off in the photo; they sit behind the Xreal Air 2 Pro AR glasses' shaded lenses

Here are the Xreal Air 2 Pro's displays (Image credit: Future)

As with other smart glasses – like the Ray-Ban Meta Smart Glasses or Rokid Max AR glasses – these specs aren’t noticeably larger than a regular pair of specs. The only telltale signs they aren’t regular glasses are the inner displays that sit behind the lenses, and a USB-C port at the end of the left arm for attaching a cable between the glasses and another device.

They're pretty lightweight too. The Xreal Air 2 comes in at 72g and the Air 2 Pro glasses at 75g. The specs also come with a few easily swappable nose clips, which make it easy to adjust the position and fit of the AR gadget. Those of you who wear glasses will appreciate the free optional frame attachment that allows you to equip the AR glasses with prescription lenses – though you'll have to pay extra for the actual lenses.

The Xreal Air 2 glasses are comfy to wear for extended periods for another reason: heat. The biggest issue with some rival devices is that the bridge of the glasses (which touches your face just above your nose) can get annoyingly hot – sometimes after barely five minutes of use.

That’s not the case with Xreal’s specs. Even an hour in I’m happy to keep using them for as long as I can.

The only significant fault I can find with the glasses is that they lack volume controls. Instead, the only buttons on the glasses allow you to change brightness levels – though I’ve never found a reason to set them at anything less than max.

If you pick up the Xreal Air 2 Pro model you'll find an additional button for adjusting the Electrochromic Dimming – a feature that affects how shaded the front lenses are. As I mentioned above, you can change the lenses between transparent, blacked-out, or somewhere in between.

This feature is certainly neat, but I personally prefer to use the cover that comes with both models when I want to immerse myself. The cover not only helps to block out more light from the front, but has plastic parts that extend underneath the lenses to further block out annoying light and reflections – marking an upgrade over the cover used by the original Xreal Air.

The Xreal Air 2 Pro AR smart glasses on a table with a black cover over their front

The Xreal Air 2 Pro doesn't look as yellow with the cover on (Image credit: Future)

The only downside of the cover is that it hides the kaleido decal that I've applied to the specs. I'm not entirely sure whether this optional sticker kit comes free with every pair of Xreal Air 2 glasses, just the Pro model, or was simply an extra in the reviewer's kit, but it adds a fun vibrancy to the AR glasses and makes them look a little less intimidating – wearable tech (particularly glasses) can make some people feel uncomfortable even if it doesn't have any cameras.

I got a yellow kit, as you can see from the images in this review, but there are navy, turquoise, blue, pink, and green options as well. You could also skip the hassle of applying a sticker and get the Xreal Air 2 glasses in red by default – the Pro, unfortunately, only comes in black without using any decals.

  • Design Score: 4/5 

Xreal Air 2: Performance

  • Impressive HD visuals
  • Immersive audio
  • Bass is a little weak, and the sound does leak a fair bit

The Xreal Air 2 Pro glasses offer a solid performance boost over their predecessor.

When it comes to visuals, the 0.55-inch Sony micro-OLED display does an impressive job. Thanks to the 25% boost in brightness the glasses have received (and the calibration Xreal has done to get the specs TÜV Rheinland certified for color accuracy), I found that colors look more vibrant than on the original Xreal Air glasses. The contrast in dark scenes is also pretty darn good, which is to be expected from an OLED screen.

The 100,000:1 contrast ratio and 500-nit brightness might not look like a lot on paper – they're what you'd expect from more budget-friendly projectors that aren't all that impressive – but because the glasses aren't attempting to throw the image across the room, they're able to use the same specs to deliver a much higher-quality picture.
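
To put those numbers in perspective, here's a quick back-of-the-envelope calculation using only the quoted specs; the black-level formula is the standard full-on/full-off definition, and the figures are the ones Xreal quotes rather than anything I measured.

    # Rough black-level math from the quoted specs (illustrative only).
    peak_brightness_nits = 500      # quoted peak brightness
    contrast_ratio = 100_000        # quoted full-on/full-off contrast ratio

    black_level_nits = peak_brightness_nits / contrast_ratio
    print(f"Implied black level: {black_level_nits:.3f} nits")  # 0.005 nits

    # The image isn't being spread across a distant wall, so far more of that
    # brightness reaches your eye than a budget projector with similar paper
    # specs could manage.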

That said, I do still find the best performance is to be had in a fairly dark space with the cover attached (and the Electrochromic Dimming set to max if you have the Air 2 Pro). They can still function okay in brighter spaces, but you’ll notice a dip in quality – especially if you don’t have the cover with you.

The only disappointment is that these specs still only offer full-HD (1080p) resolution. It’s fine but 4K visuals would really have been appreciated. At least you can benefit from a 120Hz refresh rate if you want to use them for gaming.

The Nreal Air AR glasses being used to view a video game while someone holds a controller in their living room, sat on the couch with pillows all around them

Here's what it looks like to see virtual images on the Xreal Air glasses (Image credit: Nreal)

The glasses' audio performance isn't as impressive as their picture quality, but the sound is still pretty solid and offers a good level of immersion that will suck you into the action of whatever you're watching, thanks to the upgraded "cinematic sound system" with spatial audio.

While watching various shows, films, and music videos I found that mid and high-range tones were delivered with good clarity – even when I cranked the volume up a bit there wasn’t noticeable distortion. That said, I found the specs do struggle with bassier tones. The lower-end performance isn’t terrible but it doesn’t have as much force behind it as I would like – which can lead to music feeling less impactful, and some action explosions lacking the intensity you’d get from a more capable setup.

I do like the open-ear design though – which is taken wholesale from its predecessor. It's perfect for commuting, as you can enjoy your favorite show or movie on the move while still being able to listen out for your stop.

Just watch out for audio leakage – as while the situation is improved on these newer models, much like with the original Xreal Air glasses your audio still isn’t completely private. If someone is sitting or standing next to you while content is playing through the Xreal Air 2 or Air 2 Pro glasses at moderate volumes they’ll be able to hear it.

The only solution is to add a pair of Bluetooth headphones to your setup, but this will have an impact on the battery life of the device they’re connected to. However, if they’re a decent pair – like one of the picks on our best headphones list – then you might find the battery trade-off is worthwhile for the privacy and improved sound quality you’ll experience. 

  • Performance Score: 4/5 

Xreal Air 2: Compatibility

  • Compatible with a range of devices
  • Xreal Beam and additional adapters are pricey, but feel necessary

The Xreal Air 2 and the 2 Pro have an identical compatibility list to the original Xreal Air. 

Using the included USB-C to USB-C cable you can use the glasses with a range of laptops, tablets, handheld consoles like the Steam Deck, and phones that support video output through their USB-C port (a feature called DisplayPort Alt Mode). Just note that not every USB-C phone offers this – for example, the Google Pixel 8 and other Pixel phones don't support video output via their USB-C port.

For devices lacking a USB-C port or DisplayPort support, you can try using the Xreal Beam adapter. This optional add-on, which comes in at $119 (UK price to be confirmed), allows you to wirelessly cast content from an incompatible phone – such as an iPhone with a Lightning port – to the glasses (note that the Google Pixel line still won't work with the Beam, as those phones can only cast to devices using Google's proprietary Chromecast tech, not the generic casting standard used by the Beam).

Using another USB-C to USB-C cable you can also connect your Xreal Air 2 glasses to a Nintendo Switch through the Xreal Beam.

The Beam serves as a power source for the glasses too, and will help you enjoy content on your Xreal glasses for longer – rather than draining the connected device's battery, you'll use the Beam's stored power instead, which lasts for around three hours.

If you purchase an HDMI to USB-C cable (make sure it’s this way around as most cables on Amazon are USB-C to HDMI, and as I found out they don’t work as intended with the Xreal Air) you can hook your glasses up to a console like a PS5, an Xbox Series X or a docked Nintendo Switch. 

The Xreal Air 2 Pro AR smart glasses next to the Xreal Beam hub, they're both on a wooden table in front of a brick wall

The Xreal Beam is a neat but pricey add-on (Image credit: Future)

I just wish the Xreal Air 2 and Air 2 Pro came with more of these cables and adapters in the box – as it stands, you either have to opt for the relatively pricey adapters Xreal sells as add-ons, or try to navigate the labyrinth of cheaper third-party options on Amazon, which may or may not work. Including the cables in the box would not only make things simpler, it would help make the Air 2 glasses feel like better value for money, as you'd no longer need to spend extra on not-so-optional add-ons.

I'd also prefer it if the Beam were more like the Rokid Station – which is effectively a portable Android TV device for the Rokid Max AR glasses. You can jerry-rig a setup that uses the Rokid Station and Xreal glasses, but you'll need a pair of Bluetooth headphones for audio. If the Beam is going to stay a more rudimentary product then, much like I said for the additional connector cables, I'd like to see it included in the Xreal Air 2's price.

Heck, even if it is given some upgrades I’d like to see it included in the Xreal Air 2’s price. It would make the $399 / £399 price tag a lot easier to swallow.

Should you buy the Xreal Air 2 glasses?

Buy them if… 

Don't buy them if...

Also consider

How I tested the Xreal Air 2 Glasses

For this review, I was sent the Xreal Air 2 Pro model to test – and, as I mentioned, it's functionally identical to the regular Xreal Air 2 glasses except that it has lenses with Electrochromic Dimming.

To put the displays through their paces I watched a range of content across streaming services and YouTube. I played movies with bright, vibrant scenes to see how well colors were presented, I watched shadowy scenes in films to test the glasses' contrast capabilities, and I played plenty of music videos to get a feel for the speakers' audio chops.

I also made sure to test the glasses with a range of compatible devices including a smartphone, a laptop, the Xreal Beam, and games consoles.

Lastly, I swapped between these smart specs and a few others I have lying around from older reviews, including the original Xreal Air glasses, to see how they compare.

Read more about how we test

  • First reviewed October 2023
Logitech G Pro X Superlight 2 Lightspeed review: lighter and better for gaming
6:00 pm | October 23, 2023

Author: admin | Category: Computer Gaming Accessories, Computers, Computing, Gadgets, Gaming Computers | Comments: Off

Logitech G Pro X Superlight 2 Lightspeed: Two-minute review

Though its name can be a mouthful, the Logitech G Pro X Superlight 2 Lightspeed makes everything else easy-breezy for users. Logitech G took one of the best gaming mice ever and improved on it in many ways, from its weight and charging port to its sensor, while keeping the bits that already made it a crowd-favorite.

Being one of the most lightweight gaming mice isn't its only accolade – although shedding 3g off its predecessor's weight is impressive in itself. It's speedy and long-lasting too, and the focus on performance and longevity also gives it a no-frills look that fits easily into any setup, whether or not you're into flashy RGB.

As its name implies, this is a wireless gaming mouse that uses Logitech’s Lightspeed wireless technology for connectivity. That means that you won’t have to put up with cables snagging when gaming.

Logitech G Pro X Superlight 2 Lightspeed on the tester's desk mat

(Image credit: Future / Michelle Rae Uy)

If the Logitech G Pro X Superlight 2 Lightspeed looks very similar to the original, that's because Logitech G has largely kept the same simple, minimalist design. That's alright, in my opinion; it may be the era of maximalism, but that's not necessary here. It also keeps the same USB receiver garage at the bottom – with the same round magnetic door that conveniently snaps into place – to keep the receiver safe, as well as the same five buttons, the same smooth-to-the-touch matte shell, and the same supportive shape that makes it ideal for both claw and palm grippers.

That smooth finish may not be everyone's cup of tea, as some gamers need a bit of texture for proper grippage – if only Logitech G had replicated the Razer Viper V3 HyperSpeed's grippy finish. However, stick-on grips that aren't too shabby as alternatives are included in the box. And all five buttons are within easy reach, even for someone like me who has small hands, so you can rest assured that you're gaming comfortably.

Logitech G Pro X Superlight 2 Lightspeed on the tester's desk mat (Image credit: Future / Michelle Rae Uy)

Just like the original, there's no Powercore module (the wireless charging puck) included, even though you can still swap out the magnetic garage door for one to enable wireless charging. If you already have Logitech's Powerplay wireless charging system, then you're all set. If not, you'll have to spend more for that convenience, which isn't great considering this mouse is already expensive.

There are some design improvements thrown in, however. The most welcome of them is the USB-C charging port, which replaces the antiquated and frankly annoying microUSB port. And again, its weight has dropped from the original's 63g to merely 60g. Finally, apart from the black and white color options, there's also a pink one for those trying to stray from neutral shades.

Logitech G Pro X Superlight 2 Lightspeed on the tester's desk mat

(Image credit: Future / Michelle Rae Uy)

The Logitech G Pro X Superlight 2 Lightspeed delivers a faster and more precise performance over its predecessor. That’s all thanks to its 2,000Hz polling rate and a new HERO 2 sensor that offers up to 32,000 DPI (a jump from 1,000Hz polling rate and up to 25,600 DPI).

Admittedly, those numbers, which you can set and adjust via Logitech G Hub, are more than most regular gamers need, but they do mean that this gaming mouse can more than keep up during fast-paced games and battles when you're being overwhelmed by enemies, which makes it future-proof. While I'm far from a competitive gamer, it proved more than capable when I was playing CS:GO and Doom Eternal.
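
For a rough sense of what those headline figures mean in practice, here's a quick sketch; the polling rate and DPI are the quoted specs, while the swipe speed is just an assumed example.

    # What 2,000Hz polling and 32,000 DPI work out to (illustrative numbers).
    polling_rate_hz = 2_000
    report_interval_ms = 1_000 / polling_rate_hz
    print(f"One position report every {report_interval_ms:.2f} ms")  # 0.50 ms

    dpi = 32_000                # counts per inch at the maximum setting
    swipe_speed_ips = 30        # assumed fast swipe, in inches per second
    counts_per_report = swipe_speed_ips * dpi / polling_rate_hz
    print(f"~{counts_per_report:.0f} counts reported per 0.5 ms during that swipe")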

Logitech G Pro X Superlight 2 Lightspeed on the tester's desk mat

(Image credit: Future / Michelle Rae Uy)

I do have a couple of minor quibbles, however. Sadly, the zero-additive PTFE mouse feet, while delivering impeccable maneuverability on some surfaces, don't glide as easily on others. I found that although they're great on gaming mouse pads and mats, they feel fiddly on bare desks. On top of that, the lower arch of the mouse isn't as supportive for palm grippers; wrist fatigue is real after a couple of hours.

However, the mouse makes up for it in longevity. With up to 95 hours of battery life on a single charge, you're getting almost two weeks of gaming at eight hours per day. That tracks, as I didn't have to recharge once during my two-week testing period.
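
A quick arithmetic check on that claim, using only the quoted figures:

    # 95 hours of battery at eight hours of use per day.
    battery_hours = 95
    hours_per_day = 8
    print(f"{battery_hours / hours_per_day:.1f} days of use")  # ~11.9 days, close to two weeks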

Logitech G Pro X Superlight 2 Lightspeed: Price & availability

  • How much does it cost? $159 / £149 / AU$299
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

All those improvements will cost you. The Logitech G Pro X Superlight 2 Lightspeed is slightly more expensive than its predecessor at $159 / £149 / AU$299. That’s around the same price as the Razer Deathadder V3 Pro, which has a base polling rate of 1,000Hz (upgradable to 4,000Hz with the Razer Hyperpolling wireless dongle), up to 30,000 DPI, and up to 90 hours battery life.

That price tag is admittedly a little steep for a gaming mouse, but if you’re looking for a fast-performing wireless mouse that lasts a while, it’s a great investment. However, if you can’t afford it, the HyperX Pulsefire Haste 2 Wireless offers 1,000Hz polling rate, up to 26,000 DPI, and an impressive 100-hour battery life for just $89.99 / £94.99 / AU$149.

  • Value: 4 / 5

Logitech G Pro X Superlight 2 Lightspeed: Specs

Logitech G Pro X Superlight 2 Lightspeed on the tester's desk mat

(Image credit: Future / Michelle Rae Uy)

Should you buy the Logitech G Pro X Superlight 2 Lightspeed?

Buy it if...

You need a fast and long-lasting wireless gaming mouse
It delivers speedy and accurate performance, making it ideal for competitive and fast-paced gaming.

You prefer a lightweight mouse
It’s not the most lightweight wireless gaming mouse, but it is one of the lightest. If you want something light, this is a strong contender.

You hate charging your wireless peripherals
This has up to 95 hours of battery life on a single charge, which means you won't have to charge it that often.

Don't buy it if...

You’re on a budget
It is a pretty expensive investment, and there are cheaper alternatives available for under $100 / £100.

You prefer a gaming mouse with more heft
If you’re one of the many gamers who aren’t comfortable with lightweight mice, you should give this one a skip.

Logitech G Pro X Superlight 2 Lightspeed: Also consider

How I tested the Logitech G Pro X Superlight 2 Lightspeed

  • Tested the mouse for a couple of weeks
  • Used it for playing PC games as well as for work

I spent two weeks testing the Logitech G Pro X Superlight 2 Lightspeed, dedicating a few hours each night for gaming so I could put this gaming mouse through its paces. In the daytime, I used it as my main mouse for work.

To test it, I played a few games with it, from a couple of fast-paced titles to more leisurely-paced games, getting a feel for its buttons, ergonomics, and performance. I made sure to utilize the G Hub software to customize settings and gave it a full charge before I began testing so I could accurately assess its battery life.

I’ve been testing and reviewing PC gaming peripherals for about 10 years now. Not only do I have plenty of experience with them, but I know what makes the best ones tick and can intuitively tell you which ones are not worth your time and money.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Gamakay LK75 75% review: a mechanical keyboard for the truly hardcore
6:08 pm | October 18, 2023

Author: admin | Category: Computers, Computing, Gadgets, Keyboards, Peripherals & Accessories | Comments: Off

Gamakay LK75 75%: Two-minute review

The Gamakay LK75 75% is a mechanical keyboard for the truly hardcore, allowing users to customize virtually every part of it. Thanks to the depth of its customization options alone, it's easily one of the best mechanical keyboards out there and could even be personalized into one of the best gaming keyboards or the best keyboard for programmers.

You can swap out the keycaps, replace the switches, and reprogram every key including the knob at the top right. The knob itself is pretty interesting, as it has its own LED screen that displays the time, date, and the OS the keyboard is connected to. You can also change up the RGB lighting through the knob display.

Reprogramming the keys requires the Gamakay software, which you can download from the official website. However, you wouldn't know that, since the included manual doesn't mention it at all, which is a bit baffling. The software is quite intricate, offering tons of ways to customize the keys, including function, lighting, and performance.

The knob is also customizable through the software. Its default function is to control the volume, but I found that this doesn't work. Also, even though the knob's screen displays the time, the clock isn't set properly until you do it yourself, which is odd given it would make more sense to sync with the OS time automatically once connected.

white and orange mechanical keyboard

(Image credit: Future)

The Gamakay LK75's polycarbonate plate and PCB are top-mounted and, combined with the built-in PET pad, bottom silicone pad, PCB sandwich silicone pad, and spacebar foam, offer increased stability and reduce both sound and general harshness when typing.

Handling this keyboard can be a bit intimidating at first for those not completely familiar with the intricacies of mechanical keyboards, especially as the Gamakay line of switches doesn’t follow normal naming conventions and the abundance of text on the keycaps themselves can be confusing. But at least one aspect is much easier compared to other keyboards: the process of changing the switches.

Included with the keyboard is a combo keycap and switch puller. The keycaps come off pretty smoothly and you can swap them out for any other Gamakay keycaps to change up the aesthetic of the keyboard, though I rather like the orange caps myself. The switches are surprisingly simple to pull out as well, and the hot-swap sockets accept not only the three-pin Gamakay Planet switches but any other three- or five-pin switches.

A blue switch and various mechanical switches (Image credit: Future)

The type of switch you install has a huge effect on the sound and feel, though overall every switch I tried still has a softer impact than on other mechanical keyboards. The Gamakay Planet switches, which are the set I tried out, are Mercury (the clickiest linear), Venus (the clickiest tactile), Mars (the heaviest, with the strongest feedback), and Jupiter (the most balanced linear).

They all have the same travel distance of 3.30mm, with the Mercury and Venus switches sharing the same actuation force of 40g. You can feel it in how light and easy they are to type on. My personal favorites are the Venus switches for that reason – they provide a nice clickiness and tactile feedback without requiring too much force to actuate.

But even the heaviest ones, Jupiter and Mars, have an actuation force of 50g, compared to the 80g of Gateron Greens. There are plenty of other Gamakay switches to choose from, including the Silent switches, and, if you're yearning for something a bit more traditional, Gamakay also offers Gateron switches on its site.

White and orange mechanical keyboard (Image credit: Future)

There are three methods of connectivity: wired via a USB Type-C port, 2.4GHz wireless, and Bluetooth. They're activated using the FN key plus a number key, as outlined in the thin included manual. All three work well, with the wired connection offering the least latency. I also adore that there's a tiny magnetic slot to store the dongle in, preventing it from being misplaced.

However, there was an odd issue when I tried connecting the keyboard to an all-in-one PC using all three methods - as in, it wouldn’t connect at all. But regular and gaming PCs seemed to work just fine. It's possible this was a one-off glitch, but it may be something to be wary of.

Gamakay LK75 75%: Price & availability

white and orange mechanical keyboard

(Image credit: Future)
  • How much does it cost? $129.99 / £110 / AU$211
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The Gamakay LK75 75% keyboard is available in the US, UK, and Australia for $129.99 / £110 / AU$211. Gamakay also ships to most other regions, which is even better for those outside the aforementioned three.

Pricing is pretty standard for high-end mechanical keyboards, meaning that it's expensive, though less so than others. At the time of writing there's also a sale that shaves off about $10. Compared to more notable rivals like the Drop ALT, SteelSeries Apex Pro TKL (2023), and the Razer Huntsman V2 TKL, it easily competes while being much cheaper.

Gamakay LK75 75%: Specs

Should you buy the Gamakay LK75 75%?

Buy it if...

You want a great-quality mechanical keyboard
It's a solid-quality mechanical keyboard that's heavy and well-built, with nice feeling switches and excellent features.

You want a fully customizable keyboard
Every bit of this keyboard is customizable from the keycaps to the switches to the programmable keys themselves.

Don't buy it if...

You need a more budget-minded mechanical keyboard
Though it's cheaper than other similar keyboards, its price point is still a hard pill to swallow.

You want a plug-and-play keyboard that works everywhere
I did have some issues connecting the keyboard to certain devices, and the Gamakay software is a must-have, so this isn't an easy plug-and-play recommendation.

Gamakay LK75 75%: Also consider

How I tested the Gamakay LK75 75%

  • I spent about a week testing this keyboard
  • I tested it both for productivity work and gaming
  • I used it extensively in a home office environment

I tested the Gamakay LK75 75% keyboard in a home office environment, seeing how well it functioned in both productivity work and gaming. I also carried it around in various bags to test its portability.

The Gamakay LK75 75% is a mechanical keyboard that's meant for extensive use over years. I made sure to quality-test it to see if it held up to those standards, as well as to see how easy it is to pull the keycaps off and how easy it is to reprogram the RGB lighting.

I've tested a wide range of keyboards including mechanical ones, and understand how to properly rate and test them out to ensure that they reach a certain level of quality.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Core i5-14600K review: wait for Meteor Lake
4:00 pm | October 17, 2023

Author: admin | Category: Computers, Computing, Computing Components, Gadgets | Comments: Off

Intel Core i5-14600K: Two-minute review

The Intel Core i5-14600K is not the kind of processor you're really going to want to upgrade to, despite technically offering the best value of any processor I've tested.

First, the good. This is one of the best processor values you're going to find on the market, no matter what happens with the price of its predecessor. Currently, it has the best performance for its $319 price tag (about £255/AU$465), and AMD's competing Ryzen 5 7600X isn't all that close. If you're looking to get the most bang for your buck today, then the Intel Core i5-14600K is it.

In terms of performance, this isn't a bad chip at all; I'd even say it's a great one if you take its predecessor out of the running, which will inevitably happen as its last remaining stock gets bought up. It doesn't have the performance of the Intel Core i7-14700K, but that's a workhorse chip, not the kind that's meant to power the best computers for the home or the best budget gaming PCs as these chips start making their way into prebuilt systems in the next couple of months.

For a family computer or one that's just meant for general, everyday use, this chip is more than capable of handling whatever you'll need it for. It can even handle gaming fairly well thanks to its strong single-core performance. So, on paper at least, the Core i5-14600K is the best Intel processor for the mainstream user as far as performance goes.

The real problem with the i5-14600K is that its performance is tragically close to the Core i5-13600K's. And even though the MSRP of the Intel Core i5-13600K is technically higher than that of the Core i5-14600K, it's not going to remain that way for very long at all.

As long as the i5-13600K is on sale, it will be the better value, and you really won't notice a difference between the two chips in terms of day-to-day performance.

That's because there's no difference between the specs of the 14600K vs 13600K, other than a slightly faster turbo clock speed for the 14600K's six performance cores.

While this does translate into some increased performance, it comes at the cost of higher power draw and temperature. During testing, this chip hit a maximum temperature of 101ºC, which is frankly astounding for an i5. And I was using one of the best CPU coolers around, the MSI MAG Coreliquid E360 AIO, which should be more than enough to keep the temperature in check to prevent throttling.

Synthetic benchmark results for the Intel Core i5-14600K (Image credit: Future / Infogram)

Looking at the chip's actual performance, the Core i5-14600K beats the AMD Ryzen 5 7600X and the Intel Core i5-13600K in single-core performance, multi-core performance, and productivity workloads, on average. Other than its roughly 44% better average multi-core performance against the Ryzen 5 7600X, though, the Core i5-14600K is within 3% to 4% of its competing chips.
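
For clarity on how figures like "44% better" and "within 3% to 4%" are derived from raw scores, here's a tiny helper; the example scores are hypothetical placeholders, not the actual benchmark results behind these charts.

    # How a "percent better" figure is derived from two benchmark scores,
    # where higher is better. The scores below are hypothetical placeholders.
    def percent_better(score_a: float, score_b: float) -> float:
        """Return how much better score_a is than score_b, as a percentage."""
        return (score_a / score_b - 1) * 100

    core_i5_14600k = 24_000   # hypothetical multi-core score
    ryzen_5_7600x = 16_700    # hypothetical multi-core score
    print(f"{percent_better(core_i5_14600k, ryzen_5_7600x):.0f}% better")  # ~44%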

Creative benchmark results for the Intel Core i5-14600K (Image credit: Future / Infogram)

In creative workloads, the Core i5-14600K again manages to outperform the Ryzen 5 7600X by about 31% on average, but it's just 2.4% better than its predecessor, and none of these chips are especially great at creative content work. If you're messing around with family albums or cutting up TikTok videos, any one of these chips could do that fairly easily. For heavier-duty workloads like video encoding and 3D rendering, the Intel chips hold up better than the mainstream Ryzen 5, but these chips really aren't practical for that purpose.

Gaming benchmarks for Intel 14th gen processors (Image credit: Future / Infogram)

On the gaming front, it's more of the same, though now at least the Ryzen 5 7600X is back in the mix. Overall, the Core i5-14600K beats its 13th-gen predecessor and AMD's rival chip by about 2.1% and 3.2% respectively.

Final benchmark results for the Intel Core i9-14900K (Image credit: Future / Infogram)

All of this comes at the cost of higher power draw and hotter CPU temperatures, though, which isn't good, especially for so little in return. What you really have here is an overclocked i5-13600K – and you can do that overclock yourself and save some money by buying the 13600K when it goes on sale, which it will.

An Intel Core i5-14600K against its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i5-14600K: Price & availability

  • How much does it cost? US MSRP $319 (about £255/AU$465)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i5-14600K is available in the US, UK, and Australia as of October 17, 2023, for an MSRP of $319 (about £255/AU$465). 

This is a slight $10 price drop from its predecessor, which is always a good thing, and it comes in about $20 (about £15/AU$30) more than the AMD Ryzen 5 7600X, so it's fairly middle of the pack price-wise.

In terms of actual value, as it goes to market, this chip has the highest performance for its price of any chip in any product tier, but only by a thin margin, and one that is sure to fall very quickly once the price on the 13600K drops by even a modest amount.

Intel Core i5-14600K: Specs

Intel Core i5-14600K: Verdict

  • Best performance for the price of any chip tested...
  • ...but any price drop in the Core i5-13600K will put the 14600K in second place
  • Not really worth upgrading to with the Core i7-14700K costing just $90 more
Final benchmark results for the Intel Core i5-14600K (Image credit: Future / Infogram)

Ultimately, the market served by this chip specifically is incredibly narrow, and like the rest of the Raptor Lake Refresh line-up, this is the last hurrah for the Intel LGA 1700 socket.

That means that if you go out and buy a motherboard and CPU cooler just for the 14th gen, it's a one-time thing, since another generation on this platform isn't coming. It doesn't make sense to do that, so if you're upgrading from anything earlier than 12th gen, it makes much more sense to wait for Meteor Lake to land in several months' time and possibly get something genuinely innovative.

If you're on a 12th-gen chip and you can't wait for Meteor Lake next year, the smartest move is to buy the i7-14700K instead, which at least gives you i9-13900K-levels of performance for just $90 more than the i5-14600K.

Ultimately, this chip is best reserved for prebuilt systems like the best all-in-one computers at retailers like Best Buy, where you will use the computer for a reasonable amount of time, and then when it becomes obsolete, you'll go out and buy another computer rather than attempt to upgrade the one you've got.

In that case, buying a prebuilt PC with an Intel Core i5-14600K makes sense, and for that purpose, this will be a great processor. But if you're looking to swap out another Intel LGA 1700 chip for this one, there are much better options out there.

Should you buy the Intel Core i5-14600K?

Buy the Intel Core i5-14600K if...

Don't buy it if...

Also Consider

If my Intel Core i5-14600K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i5-14600K

  • I spent nearly two weeks testing the Intel Core i5-14600K
  • I ran comparable benchmarks between this chip and rival midrange processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: Gigabyte Aorus X670E Extreme
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i5-14600K and its competition, primarily for productivity work, gaming, and content creation.

I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops. 

I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Core i7-14700K review: salvaging Raptor Lake Refresh with i9-13900K performance
4:00 pm |

Author: admin | Category: Computers, Computing, Computing Components, Gadgets | Comments: Off

Intel Core i7-14700K: One-minute review

The Intel Core i7-14700K is the workhorse CPU in Intel's 14th-generation launch line-up, and like any good workhorse, it's going to do the heavy lifting for this generation of processors. Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.

Of all the chips launched on October 17, 2023, the Core i7-14700K is the only one to get a substantive spec upgrade over its predecessor as well as a slight cut in price to just $409 (about £325/AU$595), which is $10 less than the Intel Core i7-13700K it replaces.

So what do you get for $10 less? Gen-on-gen, you don't get a whole lot of improvement (about 6% better performance overall compared to the 13700K), but that figure can be deceiving, since the Core i7-13700K was at the top of our best processor list for a reason. 

With the 13700K's performance being within striking distance of the Intel Core i9-13900K, that 6% improvement for the 14700K effectively closes the gap, putting the 14700K within just 3% of the 13900K overall, and even allowing it to pull ahead in average gaming performance, losing out to only the AMD Ryzen 7 7800X3D.

In terms of productivity and general performance, the Core i7-14700K shines as well, going toe to toe with the best AMD processors like the AMD Ryzen 9 7950X and AMD Ryzen 9 7950X3D, giving it a very strong claim to being the best Intel processor for most people.

Given its excellent mix of performance and price, the Intel Core i7-14700K could very well be the last Intel chip of the LGA 1700 epoch that anyone should consider buying, especially if you're coming from a 12th-gen chip. 

With the Core i9-13900K outperforming the Intel Core i9-12900K by as much as 25% in some workloads, someone coming off an i9-12900K or lower will find it hard to believe that an i7 could perform this well, but that's where we're at. And with the i7-14700K coming in about 30% cheaper than the Intel Core i9-14900K, while still managing to come remarkably close in terms of its performance, the Intel Core i7-14700K is the Raptor Lake Refresh chip to buy if you're going to buy one at all.

An Intel Core i7-14700K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Price & availability

  • How much does it cost? US MSRP $409 (about £325/AU$595)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i7-14700K is available on October 17, 2023, with a US MSRP of $409 (about £325/AU$595), which is a slight decrease from its predecessor's MSRP of $419 (about £335/AU$610), and about 31% lower than the Intel Core i9-14900K and 32% lower than the AMD Ryzen 9 7950X.

It's also cheaper than the AMD Ryzen 7 7800X3D, and just $10 more expensive than the AMD Ryzen 7 7700X, putting it very competitively priced against processors in its class.

The comparisons against the Core i9 and Ryzen 9 are far more relevant, however, since these are the chips the Core i7-14700K is competing against in terms of performance, and in that regard, the Intel Core i7-14700K is arguably the best value among consumer processors currently on the market.

  • Price score: 4 / 5

Intel Core i7-14700K: Specs & features

  • Four additional E-Cores
  • Slightly faster clock speeds
  • Increased Cache
  • Discrete Wi-Fi 7 and Thunderbolt 5 support

The Intel Core i7-14700K is the only processor from Intel's Raptor Lake Refresh launch line-up to get a meaningful spec upgrade.

Rather than the eight performance and eight efficiency cores of the i7-13700K, the i7-14700K comes with eight performance cores and 12 efficiency cores, all running at a slightly higher turbo boost clock for extra performance. The i7-14700K also has something called Turbo Boost Max Technology 3.0, which is a mouthful, but it gives the best-performing P-core an extra bump up to 5.6GHz so long as the processor is within power and thermal limits.

The increased core count also adds 7MB of additional L2 cache for the efficiency cores to use, further improving their performance over the 13700K's, as well as four additional processing threads for improved multitasking.

It has the same 125W TDP and the same 253W Maximum Turbo Power rating as the 13700K, with the latter being the upper limit on sustained (longer than one second) power draw for the processor. This ceiling can be breached, however, and the cores can draw much more power in bursts of up to 10ms when necessary.
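
As a toy illustration of how that works (this is a simplified model of the behavior described above, not Intel's actual power-management algorithm, and every figure apart from the 253W limit is invented for the example):

    # Simplified model: 253W caps sustained (longer than one second) draw,
    # while isolated spikes above it lasting up to ~10ms are tolerated.
    MAX_TURBO_POWER_W = 253.0

    # Hypothetical trace: one sample per 10ms over one second, with a single
    # 10ms spike above the limit in every tenth sample.
    trace_w = ([248.0] * 9 + [270.0]) * 10

    spikes = sum(1 for w in trace_w if w > MAX_TURBO_POWER_W)
    one_second_avg = sum(trace_w) / len(trace_w)
    print(f"{spikes} brief spikes above {MAX_TURBO_POWER_W:.0f}W are tolerated,")
    print(f"but the one-second average ({one_second_avg:.1f}W) stays under the cap.")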

There is also support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as well as discrete Thunderbolt 5 wired connections, so there's a decent bit of future-proofing in its specs.

  • Chipset & features score: 4 / 5

An Intel Core i7-14700K slotted into a motherboard

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Performance

  • Outstanding performance on par with the i9-13900K
  • Best gaming performance of any Intel processor
  • More power hungry than predecessor, so also runs hotter

The Intel Core i7-14700K is arguably the best performing midrange processor on the market, coming within striking distance of the Core i9-13900K and Ryzen 9 7950X across most workloads, including very strong multi core performance thanks to the addition of four extra efficiency cores.

Synthetic benchmark results for the Intel Core i7-14700K (Image credit: Future / Infogram)

The strongest synthetic benchmarks for the 14700K are single core workloads, which puts it effectively level with the Core i9-13900K and often beating the Ryzen 9 7950X and 7950X3D chips handily. 

This translates into better performance in dedicated, single-threaded tasks rather than multitasking, but even there the Core i7-14700K does an admirable job of keeping pace with chips that have much higher core counts.

Creative benchmarks for the Intel Core i7-14700K (Image credit: Future / Infogram)

In creative workloads, the 14700K also performs exceptionally well, beating out the 13900K on everything except 3D model rendering – a job that's rarely given to a CPU anyway, since even the best cheap graphics cards can process Blender or V-Ray 5 workloads many times faster than even the best CPU can.

Gaming benchmarks for Intel 14th gen processors (Image credit: Future / Infogram)

In gaming performance, the Core i7-14700K scores a bit of an upset over its launch sibling, the i9-14900K, besting it overall, though it has to be said that it got some help from a ridiculously high average fps in Total War: Warhammer III's battle benchmark. In most cases, the i7-14700K came up short of the 13900K and 14900K, but not by much.

And while it might be tempting to write off Total War: Warhammer III as an outlier, one of the biggest issues with the post-Alder Lake Core i9s is that they are energy hogs that throttle under load quickly, pretty much by design.

In games like Total War: Warhammer III, where there are a lot of tiny moving parts to keep track of, higher clock speeds don't necessarily help. When turbo clocks kick into high gear and cause throttling, the back-and-forth between throttled and not-throttled can be worse over the course of the benchmark than the behavior of the cooler but more consistent Core i7s, which don't have to constantly ramp up and down.

So the 14700K isn't as much of an outlier as it looks, especially since the 13700K also excels at Total War: Warhammer III, and it too beats the two Core i9s. Total War: Warhammer III isn't the only game like this, and so there are going to be many instances where the cooler-headed 14700K steadily gets the work done while the hot-headed i9-13900K and 14900K sprint repeatedly, only to effectively tire themselves out for a bit before kicking back up to high gear.

Final benchmark results for the Intel Core i9-14900K (Image credit: Future / Infogram)

The additional efficiency cores might not draw as much power as the performance cores, but the extra power draw is still noticeable. The 14700K pulls nearly 30W more than the 13700K, though it's still a far cry from the Core i9-13900K's power usage.

This additional power also means that the Core i7-14700K runs much hotter than its predecessor, maxing out at 100ºC, triggering the CPU to throttle on occasion. This is something that the i7-13700K didn't experience during my testing at all, so you'll need to make sure your cooling solution is up to the task here.

  • Performance: 4.5 / 5

An Intel Core i7-14700K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i7-14700K: Verdict

  • Fantastic single-core performance
  • Intel's best gaming processor, and second overall behind the Ryzen 7 7800X3D
  • Best value of any midrange processor
Final benchmark results for the Intel Core i7-14700K (Image credit: Future / Infogram)

Ultimately, the Intel Core i7-14700K is the best processor in the Raptor Lake Refresh line-up, offering very competitive performance for a better price than its predecessor and far better one than comparable chips one tier higher in the stack.

It's not without fault, though. It's not that much better than the i7-13700K, so everything I'm saying about the i7-14700K might reasonably apply to its predecessor as well. And honestly, the i7-14700K doesn't have too high a bar to clear to stand out from its launch siblings, so its performance may only look this good in comparison to the i9 and i5 launched alongside it.

But, the numbers don't lie, and the Intel Core i7-14700K displays flashes of brilliance that set it apart from its predecessor and vault it into competition with the top-tier of CPUs, and that's quite an achievement independent of how the rest of Raptor Lake Refresh fares. 

A masculine hand holding an Intel Core i7-14700K

(Image credit: Future / John Loeffler)

Should you buy the Intel Core i7-14700K?

Buy the Intel Core i7-14700K if...

Don't buy it if...

Also Consider

If my Intel Core i7-14700K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i7-14700K

  • I spent nearly two weeks testing the Intel Core i7-14700K
  • I ran comparable benchmarks between this chip and rival midrange processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: ASRock X670E Steel Legend
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i7-14700K and its competition, primarily for productivity work, gaming, and content creation.

I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops. 

I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch line-up and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Core i9-14900K review: more of a Raptor Lake overclock than a refresh
4:00 pm |

Author: admin | Category: Computers Computing Computing Components Gadgets | Comments: Off

Intel Core i9-14900K: Two-minute review

The Intel Core i9-14900K is a hard chip to justify, which is a weird thing to say about a processor that is arguably the best Intel has ever put out.

With very little fanfare to herald its arrival following the announcement of Intel Meteor Lake at Intel Innovation in September 2023 (and confirmation that Intel Meteor Lake is coming to desktop in 2024), Intel's 14th-generation flagship processor cannot help but draw parallels to the 11th-gen Rocket Lake chips that immediately preceded Intel Alder Lake.

The Core i9-11900K was something of a placeholder in the market until Intel could launch Alder Lake at the end of 2021. Those processors featured a new hybrid architecture and a more advanced 10nm process that helped propel Intel back to the top of our best processor list, despite strong competition from AMD.

With Intel Raptor Lake Refresh, we're back in placeholder territory, unfortunately. The performance gains here are all but non-existent, so we are essentially waiting on Meteor Lake while the i9-14900K absolutely guzzles electricity and runs hot enough to boil water under just about any serious workload, all with very little extra performance over the Intel Core i9-13900K to justify the upgrade.

The problem for the Core i9-14900K is that you can still get the i9-13900K.

It's not that the Core i9-14900K isn't a great processor; again, it's unquestionably the best Intel processor for the consumer market in terms of performance. It beats every other chip I tested in most categories, with the exception of some multitasking workflows and average gaming performance, in both of which it comes in as a very close runner-up. On top of that, at $589, it's the same price as the current Intel flagship, the Intel Core i9-13900K (assuming the i9-14900K matches the i9-13900K's £699 / AU$929 sale price in the UK and Australia).

The problem for the Core i9-14900K is two-fold: you can still get the i9-13900K (and will be able to for a long while yet, at a lower price), and the Intel Core i7-14700K offers performance so close to the 14th-gen flagship at a much lower price that the 14900K looks largely unnecessary by comparison. Essentially, if you've got an i7-13700K or i9-13900K, there is simply nothing for you here.

If you're on an 11th-gen chip or older, or you've got an AMD Ryzen processor and you're looking to switch, note that this chip will be the last one to use the LGA 1700 socket, so when Meteor Lake-S comes out in 2024 (or even Lunar Lake-S, due out at the end of 2024 or early 2025), you won't be able to upgrade to that processor on an LGA 1700 motherboard. In other words, buying into LGA 1700 for this chip is strictly a one-shot deal.

The only people who might find this chip worth upgrading to are those currently using a 12th-gen processor who skipped the 13th-gen entirely, or someone using a 13th-gen Core i5 who wants that extra bit of performance and doesn't mind dropping $589 on a chip they might be upgrading from again in a year's time, which isn't going to be a whole lot of people. 

Unfortunately, at this price, it'll be better to save your money and wait for Meteor Lake or even Lunar Lake to drop next year and put the $589 you'd spend on this chip towards the new motherboard and CPU cooler you'll need once those chips are launched.

An Intel Core i9-14900K with its promotional packaging

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Price & availability

  • How much does it cost? US MSRP $589 (about £470/AU$855)
  • When is it out? October 17, 2023
  • Where can you get it? You can get it in the US, UK, and Australia

The Intel Core i9-14900K is available as of October 17, 2023, for a US MSRP of $589 (about £470/AU$855), which is the same as the Intel Core i9-13900K it is replacing. We don't have confirmation on UK and Australia pricing yet, though I've asked Intel for clarification and will update this review if and when I hear back from the company. If the 14900K keeps the same UK and Australia pricing as the Core i9-13900K, however, it'll sell for £699/AU$929 in the UK and Australia respectively.

Meanwhile, this is still cheaper than most of AMD's rival chips in this tier, the AMD Ryzen 9 7950X3D, AMD Ryzen 9 7950X, and AMD Ryzen 9 7900X3D, with only the AMD Ryzen 9 7900X coming in cheaper than the i9-14900K. 

This does make the Core i9-14900K the better value against these chips, especially given the level of performance on offer, but it's ultimately too close to the 13900K performance-wise to make this price meaningful, as a cheaper 13900K will offer an even better value against AMD's Ryzen 9 lineup.

  • Price score: 3 / 5

A masculine hand holding an Intel Core i9-14900K

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Specs & features

  • Faster clock speeds than i9-13900K
  • Some additional AI-related features

The Intel Core i9-14900K is the final flagship using Intel's current architecture, so it makes sense that there is very little in the way of innovation over the Intel Core i9-13900K.

Using the same 10nm Intel 7 process node as its predecessor and with the same number of processor cores (8 P-cores/16 E-cores), threads (32), and cache (32MB total L2 cache plus an additional 36MB of L3 cache), the only real improvement with the 14900K in terms of specs is its faster clock speeds.

All cores get a 0.2GHz increase to their base frequencies, while the P-core turbo boost clock increases to 5.6GHz and the E-core turbo clock bumps up to 4.4GHz from the 13900K's 5.4GHz P-Core turbo clock and 4.3GHz E-core turbo clock.

While those clock speeds are the official max turbo clocks for the two types of cores, the Core i9-14900K and Intel Core i7-14700K have something called Turbo Boost Max Technology 3.0, which increases the frequency of the best-performing core in the chip and gives it even more power within the power and thermal limits. That gets the Core i9-14900K up to 5.8GHz turbo clock on specific P-cores while active.

Additionally, an exclusive feature of the Core i9 is an additional Ludicrous-Speed-style boost called Intel Thermal Velocity Boost. This activates if there is still power and thermal headroom on a P-core that is already being boosted by the Turbo Boost Max Technology 3.0, and this can push the core as high as 6.0GHz, though these aren't typical operating conditions.

Both of these technologies are present in the 13900K as well, but the 14900K bumps up the maximum clock speeds of these modes slightly, and according to Intel, that 6.0GHz clock speed makes this the world's fastest processor. While that might technically be true, that 6.0GHz ceiling is hit only in very narrow circumstances, so in practical terms the P-core boost clock is what you're going to see almost exclusively under load.
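If you're curious how often those upper boost bins actually engage on your own system, you can watch per-core clocks while a workload runs. The snippet below is a minimal sketch, not part of my testing methodology: it assumes the third-party psutil package, and per-core frequency granularity depends on your operating system.

```python
# Minimal sketch: sample the fastest core's clock speed under load and compare
# it to Intel's quoted Thermal Velocity Boost ceiling for the i9-14900K.
# Assumes the third-party psutil package; per-core readings vary by OS.
import time
import psutil

TVB_CEILING_MHZ = 6000  # Intel's quoted 6.0GHz maximum for the i9-14900K

def sample_clocks(duration_s: float = 10.0, interval_s: float = 0.5) -> None:
    end = time.time() + duration_s
    while time.time() < end:
        freqs = psutil.cpu_freq(percpu=True) or [psutil.cpu_freq()]
        fastest = max(f.current for f in freqs)  # MHz
        print(f"fastest core: {fastest:7.0f} MHz "
              f"({fastest / TVB_CEILING_MHZ:5.1%} of the TVB ceiling)")
        time.sleep(interval_s)

if __name__ == "__main__":
    sample_clocks()
```

Run something like this alongside a heavy workload and you'd expect the readings to hover around the P-core boost clock far more often than the headline 6.0GHz figure.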

The Core i9-14900K has the same 125W TDP as the 13900K and the same 253W maximum turbo power as well, though power draw in bursts of less than 10ms can go far higher.

If this reads like a Redditor posting about their successful overclocking setup, then you pretty much get what this chip is about. If you're looking for something innovative about this chip, I'll say it again, you're going to have to wait for Meteor Lake.

The Core i9-14900K also has support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as does the rest of the 14th-gen lineup, as well as support for discrete Thunderbolt 5, both of which are still a long way down the road.

The only other thing to note is that there have been some AI-related inclusions that are going to be very specific to AI workloads that almost no one outside of industry and academia is going to be running. If you're hoping for AI-driven innovations for everyday consumers, let's say it once more, with feeling: You're going to have to wait for—

  • Chipset & features score: 3.5 / 5

An Intel Core i9-14900K slotted into a motherboard

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Performance

  • Best-in-class performance, but only by a hair
  • Gets beat by AMD Ryzen 7 7800X3D and i7-14700K in gaming performance
  • Runs even hotter than the i9-13900K

Think of an elite athlete who's used to setting records in their sport: sometimes they break their previous record by a lot, and sometimes it's by milliseconds or fractions of an inch. It's less sexy, but it still counts, and that's really what we get here with the Intel i9-14900K.

On pretty much every test I ran on it, the Core i9-14900K edged out its predecessor by single digits, percentage-wise, which is a small enough difference that a background application can fart and cause just enough of a dip in performance that the 14900K ends up losing to the 13900K. 

I ran these tests more times than I can count because I had to be sure that something wasn't secretly messing up my results, and they are what they are. The Core i9-14900K does indeed come out on top, but it really is a game of inches at this point.

Image 1 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 2 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 3 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 4 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 5 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 6 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 7 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 8 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 9 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 10 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 11 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 12 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 13 of 13

Synthetic benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

Across all synthetic performance and productivity benchmarks, the Core i9-14900K comes out on top, with the notable exception of Geekbench 6.1's multi-core performance test, where the AMD Ryzen 9 7950X scores substantially higher, and the Passmark Performance Test's overall CPU score, which puts the AMD Ryzen 9 7950X and Ryzen 9 7950X3D significantly higher. Given that all 16 cores of the 7950X and 7950X3D are full-throttle performance cores, this result isn't surprising.

Other than that though, it's the 14900K all the way, with a 5.6% higher geometric average on single-core performance than the 13900K. For multi-core performance, the 14900K scores a 3.1% better geometric average, and in productivity workloads, it scores a 5.3% better geometric average than its predecessor.

Against the AMD Ryzen 9 7950X, the Core i9-14900K scores about 13% higher in single-core performance, about 1% lower in multi-core performance, and 5% better in productivity performance.
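Figures like these come from geometric averaging across the individual benchmarks; one common way to compute that kind of uplift is to take the geometric mean of per-test ratios, as in the minimal sketch below. The scores here are invented placeholders, not the results from my testing, and statistics.geometric_mean requires Python 3.8 or later.

```python
# Minimal sketch of a "geometric average" uplift between two chips, computed
# as the geometric mean of per-benchmark score ratios. The scores below are
# invented placeholders, not the results from this review.
from statistics import geometric_mean  # Python 3.8+

scores_14900k = {"bench_a": 3100, "bench_b": 2250, "bench_c": 4480}
scores_13900k = {"bench_a": 2950, "bench_b": 2140, "bench_c": 4210}

ratios = [scores_14900k[name] / scores_13900k[name] for name in scores_14900k]
uplift = geometric_mean(ratios) - 1.0  # e.g. 0.056 -> "5.6% higher"

print(f"geometric average uplift: {uplift:.1%}")
```

Using a geometric rather than arithmetic mean keeps a single outlier benchmark from dominating the headline percentage.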

Image 1 of 7

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 2 of 7

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 3 of 7

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 4 of 7

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 5 of 7

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 6 of 7

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 7 of 7

Creative benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

Creative benchmarks reveal something of a mixed bag for the Core i9-14900K. In all cases, it beats its predecessor by anywhere from 2.6% to as much as 10.9%. Against the AMD Ryzen 9 7950X and 7950X3D, the Core i9-14900K consistently loses out when it comes to rendering workloads like Blender and V-Ray 5, but beats the two best AMD processors by just as much in photo and video editing. And since 3D rendering these days leans heavily on the GPU rather than the CPU, AMD's advantage here is somewhat muted in practice.

Image 1 of 6

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)
Image 2 of 6

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)
Image 3 of 6

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)
Image 4 of 6

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)
Image 5 of 6

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)
Image 6 of 6

Gaming benchmarks for Intel 14th gen processors

(Image credit: Future / Infogram)

Gaming is another area where Intel has traditionally done well thanks to its strong single-core performance over AMD, but all that flipped with the introduction of AMD's 3D V-Cache. 

While the Intel Core i9-14900K barely moves the needle from its predecessor's performance, it really doesn't matter, since the AMD Ryzen 7 7800X3D ultimately scores an overall victory, and it's not very close. The Core i9-14900K actually ties for fourth place with the Intel Core i7-13700K, with the Core i7-14700K edging it out by about 4 fps on average.

Image 1 of 2

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 2 of 2

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

Of course, all this performance requires power, and lots of it. The Core i9-14900K pretty much matched the maximum recorded power draw of the Core i9-13900K, with less than a watt's difference between the two: 351.097W versus 351.933W, respectively.

The Core i9-14900K still managed to find a way to run hotter than its predecessor, however, something I didn't really think was possible. But there it is: the 14900K maxed out at 105°C, three degrees hotter than the 13900K's max. It's the hottest I've ever seen a CPU run, and I'm genuinely shocked it was allowed to run so far past its official thermal limit without any overclocking on my part.
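If you want a rough, software-side look at package power on your own system, the sketch below reads the energy counter exposed by Linux's intel_rapl driver and averages it over a few seconds. This is purely an illustration under stated assumptions (Linux, an Intel CPU with the driver loaded, and usually root access to read the counter); it is not the monitoring setup behind the figures above.

```python
# Hedged sketch: estimate average CPU package power on Linux by sampling the
# intel_rapl energy counter in sysfs. Assumes /sys/class/powercap/intel-rapl:0
# exists (Intel CPU, intel_rapl driver loaded), typically needs root, and
# ignores counter wrap-around for brevity.
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    with open(RAPL_ENERGY) as f:
        return int(f.read().strip())

def average_package_watts(duration_s: float = 5.0) -> float:
    start_energy, start_time = read_energy_uj(), time.time()
    time.sleep(duration_s)
    delta_uj = read_energy_uj() - start_energy
    return (delta_uj / 1_000_000) / (time.time() - start_time)  # uJ -> J -> W

if __name__ == "__main__":
    print(f"average package power: {average_package_watts():.1f} W")
```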

  • Performance: 3.5 / 5

A masculine hand holding an Intel Core i9-14900K

(Image credit: Future / John Loeffler)

Intel Core i9-14900K: Verdict

  • The best chip for demanding workloads like video editing and productivity
  • There are better gaming processors out there for cheaper
  • The Intel Core i7-14700K offers a far better value
Image 1 of 7

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 2 of 7

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 3 of 7

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 4 of 7

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 5 of 7

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 6 of 7

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)
Image 7 of 7

Final benchmark results for the Intel Core i9-14900K

(Image credit: Future / Infogram)

In the final assessment then, the Core i9-14900K does manage to win the day, topping the leaderboard by enough of a margin to be a clear winner, but close enough that it isn't the cleanest of wins. 

Overall, single-core and productivity performance are its best categories; it falters slightly in creative workloads and comes up short enough in gaming that it's not the chip I would recommend as a gaming CPU.

Like all Core i9s before it, the 14900K is the worst value of Intel's 14th-gen launch lineup, but it's a better value than its predecessor for the time being (though that advantage won't last very long at all), and it does manage to be a better value proposition than the Ryzen 9 7950X and Ryzen 9 7950X3D, while matching the Ryzen 7 7800X3D, so all in all, not too bad for an enthusiast chip.

Still, the Intel Core i7-14700K is right there, and its superior balance of price and performance makes the Intel Core i9-14900K a harder chip to recommend than it should be.

Should you buy the Intel Core i9-14900K?

Buy the Intel Core i9-14900K if...

Don't buy it if...

Also Consider

If my Intel Core i9-14900K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i9-14900K

  • I spent nearly two weeks testing the Intel Core i9-14900K
  • I ran comparable benchmarks between this chip and rival flagship processors
  • I gamed with this chip extensively
Test System Specs

These are the specs for the test system used for this review:

Intel Motherboard: MSI MPG Z790E Tomahawk Wifi
AMD Motherboard: ASRock X670E Steel Legend
CPU Cooler: MSI MAG Coreliquid E360 AIO
Memory: 32GB SK Hynix DDR5-4800
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks testing the Intel Core i9-14900K and its competition, using it mostly for productivity and content creation, with some gaming thrown in as well.

I used the standard battery of synthetic benchmarks I use for processor testing, and ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.

I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Intel Arc A770 review: a great 1440p graphics card for those on a budget
4:00 pm | October 16, 2023

Author: admin | Category: Computers Computing Computing Components Gadgets | Comments: Off

Intel Arc A770: One-minute review

The Intel Arc A770 has had quite a journey since its release back on October 12, 2022, and fortunately, it has been a positive one for Intel despite a somewhat rocky start.

Right out the gate, I'll say that if you are looking for one of the best cheap graphics cards for 1440p gaming, this card definitely needs to be on your list. It offers great 1440p performance for most modern PC titles that most of us are going to be playing and it's priced very competitively against its rivals. 

Where the card falters, much like with my Intel Arc A750 review earlier this year, is with older DirectX 9 and DirectX 10 titles, and this really does hurt its overall score in the end. Which is a shame, since for games released in the last five or six years, this card is going to surprise a lot of people who might have written it off even six months ago.

Intel's discrete graphics unit has been working overtime on its driver for this card, providing regular updates that continue to improve performance across the board, though some games benefit more than others. 

Naturally, a lot of emphasis is going to be put on more recently released titles. And even though Intel has also been paying attention to shoring up support for older games as well, if you're someone with an extensive back catalog of DX9 and DX10 titles from the mid-2000s that you regularly return to, then this is not the best graphics card for your needs. Nvidia and AMD drivers carry a long legacy of support for older titles that Intel will honestly never be able to match.

But if what you're looking for is the best 1440p graphics card to play the best PC games of the modern era but you're not about to plop down half a grand on a new GPU, then the Intel Arc A770 is going to be a very solid pick with a lot more to offer than many will probably realize.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Price & availability

  • How much is it? US MSRP for 16GB card: $349 (about £280/AU$510); for 8GB card: $329 (about £265/AU$475)
  • When was it released? It went on sale on October 12, 2022
  • Where can you buy it? Available in the US, UK, and Australia

The Intel Arc A770 is available now in the US, UK, and Australia, with two variants: one with 16GB GDDR6 VRAM and an official US MSRP of $349 (about £280/AU$510), and one with 8GB GDDR6 VRAM and an official MSRP of $329 (about £265/AU$475).

Those are the launch MSRPs from October 2022, of course, and the cards have come down considerably in price in the year since their release; you can get either card for about 20% to 25% less than that. This is important, since the Nvidia GeForce RTX 4060 and AMD Radeon RX 7600 are very close to the 16GB Arc A770 in terms of current prices, and offer distinct advantages of their own that might tempt potential buyers away from Intel's card.

But those decisions are not as cut and dried as you might think, and Intel's Arc A770 holds up very well against modern midrange offerings, despite really being a last-gen card. And, currently, the 16GB variant is the only 1440p card that you're going to find at this price, even among Nvidia and AMD's last-gen offerings like the RTX 3060 Ti and AMD Radeon RX 6750 XT. So for 1440p gamers on a very tight budget, this card fills a vital niche, and it's really the only card that does so.

  • Price score: 4/5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Design

  • Intel's Limited Edition reference card is gorgeous
  • Will fit most gaming PC cases easily
Intel Arc A770 Limited Edition Design Specs

Slot size: Dual slot
Length: 11.02 inches | 280mm
Height: 4.53 inches | 115mm
Cooling: Dual fan
Power Connection: 1 x 8-pin and 1 x 6-pin
Video outputs: 3 x DisplayPort 2.0, 1 x HDMI 2.1

The Intel Arc A770 Limited Edition that I'm reviewing is Intel's reference model that is no longer being manufactured, but you can still find some stock online (though at what price is a whole other question). 

Third-party partners include ASRock, Sparkle, and Gunnir. Interestingly, Acer also makes its own version of the A770 (the Acer Predator BiFrost Arc A770), the first time the company has dipped its toe into the discrete graphics card market.

All of these cards will obviously differ in terms of their shrouds, cooling solutions, and overall size, but as far as Intel's Limited Edition card goes, it's one of my favorite graphics cards ever in terms of aesthetics. If it were still easily available, I'd give this design five out of five, hands down, but most purchasers will have to opt for third-party cards which aren't nearly as good-looking, as far as I'm concerned, so I have to dock a point for that.

It's hard to convey from just the photos of the card, but the black finish on the plastic shroud of the card has a lovely textured feel to it. It's not quite velvety, but you know it's different the second you touch it, and it's something that really stands out from every other card I've reviewed.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

The silver trim on the card and the more subtle RGB lighting against a matte black shroud and fans bring a bit of class compared to the RGB-drenched graphics cards I typically see. The twin fans aren't especially loud (not any more so than other dual-fan cards, at least), and the card feels thinner than most other similar cards I've reviewed and used, whether or not it actually is.

The power connector is an 8-pin and 6-pin combo, so you'll have a pair of cables dangling from the card which may or may not affect the aesthetic of your case, but at least you won't need to worry about a 12VHPWR or 12-pin adapter like you do with Nvidia's RTX 4000-series and 3000-series cards.

You're also getting three DisplayPort 2.0 outputs and an HDMI 2.1 output, which puts it in the same camp as Nvidia's recent GPUs, but it can't match AMD's recent move to DisplayPort 2.1, which will enable faster 8K video output. As it stands, the Intel Arc A770 is limited to 8K@60Hz, just like Nvidia. Will you be doing much 8K gaming on a 16GB card? Absolutely not. As more 8K monitors arrive next year it would be nice to have an 8K desktop running at 165Hz, but that's a very speculative prospect at this point, so it's probably not anything anyone looking at the Arc A770 needs to be concerned about.

  • Design Score: 4 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Specs & features

  • Good hardware AI cores for better XeSS upscaling
  • Fast memory for better 1440p performance

Intel's Xe HPG architecture inside the Arc A770 introduces a whole other way to arrange the various co-processors that make up a GPU, adding a third, not very easily comparable set of specs to the already head-scratching differences between Nvidia and AMD architectures.

Intel breaks up its architecture into "render slices", each containing four Xe Cores. Each Xe Core in turn contains 128 shaders, a ray tracing processor, and 16 matrix processors (which are at least directly comparable to Nvidia's vaunted tensor cores) that handle graphics upsampling and machine learning workloads. Both the 8GB and 16GB versions of the A770 contain eight render slices, for a total of 4,096 shaders, 32 ray tracing processors, and 512 matrix processors.

The ACM-G10 GPU in the A770 runs at 2,100MHz base frequency with a 2,400MHz boost frequency, with a slightly faster memory clock speed (2,184MHz) for the 16GB variant than the 8GB variant's 2,000MHz. This leads to an effective memory speed of 16 Gbps for the 8GB card and 17.5 Gbps for the 16GB.

With a 256-bit memory bus, this gives the Arc A770 a much wider lane for high-resolution textures to be processed through, reducing bottlenecks and enabling faster performance when gaming at 1440p and higher resolutions thanks to a 512 GB/s and 559.9 GB/s memory bandwidth for the 8GB and 16GB cards, respectively.
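Those bandwidth figures follow from the usual GDDR6 arithmetic: effective data rate per pin multiplied by the bus width in bytes. The snippet below is just my own back-of-the-envelope sanity check of the quoted specs, not anything supplied by Intel.

```python
# Sanity-check the quoted bandwidth figures: peak GDDR6 bandwidth is the
# effective data rate (Gbps per pin) multiplied by the bus width in bytes.
def gddr6_bandwidth_gb_s(effective_rate_gbps: float, bus_width_bits: int) -> float:
    return effective_rate_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gb_s(16.0, 256))   # 8GB Arc A770:  512.0 GB/s
print(gddr6_bandwidth_gb_s(17.5, 256))   # 16GB Arc A770: 560.0 GB/s (~559.9 quoted)
```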

All of this does require a good bit of power, though, and the Arc A770 has a TDP of 225W, which is higher than most 1440p cards on the market today.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

As for the features all this hardware enables, there's a lot to like here. The matrix cores are leveraged to great effect by Intel's XeSS graphics upscaling tech found in a growing number of games, and this hardware advantage generally lets XeSS outperform AMD's FSR 2.0, which is strictly a software-based upscaler.

XeSS does not have frame generation though, and the matrix processors in the Arc A770 are not nearly as mature as Nvidia's 3rd and 4th generation tensor cores found in the RTX 3000-series and RTX 4000-series, respectively.

The Arc A770 also has AV1 hardware-accelerated encoding support, meaning that streaming videos will look far better than those with only software encoding at the same bitrate, making this a compelling alternative for video creators who don't have the money to invest in one of Nvidia's 4000-series GPUs.

  • Specs & features: 3.5 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Performance

  • Great 1440p performance
  • Intel XeSS even allows for some 4K gaming
  • DirectX 9 and DirectX 10 support lacking, so older games will run poorly
  • Resizable BAR is pretty much a must

At the time of this writing, Intel's Arc A770 has been on the market for about a year, and I have to admit, had I gotten the chance to review this card at launch, I would probably have been as unkind as many other reviewers were.

As it stands though, the Intel Arc A770 fixes many of the issues I found when I reviewed the A750, but some issues still hold this card back somewhat. For starters, if you don't enable Resizable BAR in your BIOS settings, don't expect this card to perform well at all. It's an easy enough fix, but one that is likely to be overlooked, so it's important to know that going in.
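If you're not sure whether Resizable BAR is actually active, Intel's Arc Control software should flag it on Windows; on Linux, one rough way to check is to scan the PCI capability list, as in the hedged sketch below. The exact capability string lspci prints, and whether you need root for capabilities to show up at all, are assumptions about your setup.

```python
# Hedged helper (Linux-only, illustrative): look for a resizable BAR capability
# in `lspci -vv` output for the Arc card. The capability string recent pciutils
# versions print ("Physical Resizable BAR") is an assumption, and you may need
# root for lspci to list capabilities at all.
import subprocess

def rebar_listed(match: str = "arc") -> bool:
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    for device_block in out.split("\n\n"):  # one block of text per PCI device
        if match in device_block.lower():
            return "resizable bar" in device_block.lower()
    return False

if __name__ == "__main__":
    print("Resizable BAR capability listed:", rebar_listed())
```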

Image 1 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 2 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 3 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 4 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 5 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 6 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 7 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 8 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 9 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 10 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 11 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 12 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 13 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 14 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 15 of 15

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

In synthetic benchmarks, the A770 performed fairly well against the current crop of graphics cards, despite its effectively being a last-gen card. It is particularly strong competition against the Nvidia RTX 4060 Ti across multiple workloads, and it even beats the 4060 Ti in a couple of tests.

Its Achilles' heel, though, is revealed in the PassMark 3D Graphics test. Whereas 3DMark tests DirectX 11 and DirectX 12 workloads, PassMark's test also runs DirectX 9 and DirectX 10 workloads, and here the Intel Arc A770 simply can't keep up with AMD and Nvidia.

Image 1 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 2 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 3 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 4 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 5 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 6 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 7 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 8 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 9 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 10 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 11 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 12 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 13 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 14 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 15 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 16 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 17 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 18 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 19 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 20 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 21 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 22 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 23 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 24 of 24

Non-ray traced, non-upscaled  gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

In non-ray-traced and native-resolution gaming benchmarks, the Intel Arc A770 managed to put up some decent numbers against the competition. At 1080p, the Arc A770 manages an average of 103 fps with an average minimum fps of 54. At 1440p, it averages 78 fps, with an average minimum of 47, and even at 4K, the A770 manages an average of 46 fps, with an average minimum of 27 fps.
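For context, headline numbers like these are roll-ups of the individual game results, presumably straightforward arithmetic means of each game's average and minimum fps. The sketch below shows that kind of roll-up with invented placeholder numbers, not the underlying data from this review.

```python
# Minimal sketch: roll per-game (average fps, minimum fps) pairs up into the
# kind of headline averages quoted above. The numbers are invented placeholders.
def summarize(results: dict[str, tuple[float, float]]) -> tuple[float, float]:
    avgs = [avg for avg, _ in results.values()]
    mins = [low for _, low in results.values()]
    return sum(avgs) / len(avgs), sum(mins) / len(mins)

games_1440p = {
    "game_a": (92.0, 55.0),
    "game_b": (71.0, 44.0),
    "game_c": (71.0, 42.0),
}
avg_fps, avg_min_fps = summarize(games_1440p)
print(f"average: {avg_fps:.0f} fps, average minimum: {avg_min_fps:.0f} fps")
```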

Image 1 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 2 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 3 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 4 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 5 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 6 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 7 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 8 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 9 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 10 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 11 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 12 of 12

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

Turn on ray tracing, however, and these numbers understandably tank, as they do for just about every card below the RTX 4070 Ti and RX 7900 XT. Still, even here the A770 manages an average of 41 fps (with an average minimum of 32 fps) at 1080p with ray tracing enabled, which is technically still playable performance. Once you move up to 1440p and 4K, however, your average title isn't going to be playable at native resolution with ray tracing enabled.

Image 1 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 2 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 3 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 4 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 5 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 6 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 7 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 8 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)
Image 9 of 9

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

Enter Intel XeSS. When set to "Balanced", XeSS turns out to be a game changer for the A770, getting it an average framerate of 66 fps (with an average minimum of 46 fps) at 1080p, an average of 51 fps (with an average minimum of 38 fps) at 1440p, and an average 33 fps (average minimum 26 fps) at 4K with ray tracing maxed out.

While the 26 fps average minimum at 4K means it's really not playable at that resolution even with XeSS turned on, settings tweaks or more modest ray tracing could probably bring that up into the low-to-high 30s, making 4K games playable on this card with ray tracing turned on. 

That's something the RTX 4060 Ti can't manage thanks to its smaller frame buffer (8GB VRAM), and while the 16GB RTX 4060 Ti could theoretically perform better (I have not tested the 16GB so I cannot say for certain), it still has half the memory bus width of the A770, leading to a much lower bandwidth for larger texture files to pass through.

This creates an inescapable bottleneck that the RTX 4060 Ti's much larger L2 cache can't adequately compensate for, and so takes it out of the running as a 4K card. When tested, very few games managed to maintain playable frame rates even without ray tracing unless you dropped the settings so low that it wasn't worth the effort. The A770 16GB, meanwhile, isn't technically a 4K card, but it can dabble at that resolution with the right settings tweaks and still look reasonably good.

Image 1 of 9

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / John Loeffler)
Image 2 of 9

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / John Loeffler)
Image 3 of 9

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / John Loeffler)
Image 4 of 9

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / John Loeffler)
Image 5 of 9

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / John Loeffler)
Image 6 of 9

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / John Loeffler)
Image 7 of 9

Final performance scores for the Intel Arc A770

(Image credit: Future / Infogram)
Image 8 of 9

Final performance scores for the Intel Arc A770

(Image credit: Future / Infogram)
Image 9 of 9

Final performance scores for the Intel Arc A770

(Image credit: Future / Infogram)

All told, then, the Intel Arc A770 turns out to be a surprisingly good graphics card for modern gaming titles that can sometimes even hold its own against the Nvidia RTX 4060 Ti. It can't hold a candle to the RX 7700 XT or RTX 4070, but it was never meant to, and given that those cards cost substantially more than the Arc A770, this is entirely expected.

Its maximum observed power draw of 191.909W is pretty high for the kind of card the A770 is, but it's not the most egregious offender in that regard. All this power meant that keeping it cool was a struggle, with its maximum observed temperature hitting about 74°C.

Among all the cards tested, the Intel Arc A770 sat near the bottom of the list alongside the RX 6700 XT, so the picture for this card might have been very different had it launched three years ago, when it would have had to compete with the RTX 3000-series and RX 6000-series exclusively. In the end, this card performs like a last-gen card, because it is. 

Despite that, it still manages to be a fantastic value on the market right now given its low MSRP and fairly solid performance, rivaling the RTX 4060 Ti on the numbers. In reality though, with this card selling for significantly less than its MSRP, it is inarguably the best value among midrange cards right now, and it's not even close.

  • Performance score: 3.5 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Should you buy the Intel Arc A770?

Buy the Intel Arc A770 if...

Don't buy it if...

Also Consider

If my Intel Arc A770 review has you considering other options, here are two more graphics cards for you to consider.

How I tested the Intel Arc A770

  • I spent several days benchmarking the card, with an additional week using it as my primary GPU
  • I ran our standard battery of synthetic and gaming benchmarks 
Test Bench

These are the specs for the test system used for this review:
CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks with the Intel Arc A770 in total, with a little over half that time using it as my main GPU on my personal PC. I used it for gaming, content creation, and other general-purpose use with varying demands on the card.

I focused mostly on synthetic and gaming benchmarks since this card is overwhelmingly a gaming graphics card. Though it does have some video content creation potential, it's not enough to dethrone Nvidia's 4000-series GPUs, so it isn't a viable rival in that sense and wasn't tested as such.

I've been reviewing computer hardware for years now, with an extensive computer science background as well, so I know how graphics cards like this should perform at this tier.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023

Lenovo Legion 5 Slim 14 review: another win for affordable gaming laptops
8:00 pm | October 14, 2023

Author: admin | Category: Computers Computing Gadgets Gaming Computers Gaming Laptops | Comments: Off

Lenovo Legion 5 Slim 14: Two-minute review

Lenovo has been on a roll in 2023 with plenty of affordable gaming laptop options, and the Lenovo Legion 5 Slim 14 is the latest of the bunch. With a sleek, stainless steel-looking finish and the logo machine-carved into the lid, it has a very simple yet distinctive aesthetic that stands out from the traditional all-black crowd. 

It’s on the slimmer side compared to some other laptops, but when looking at the Razer Blade 14 or the Origin EON 16SL, it’s harder to place this machine under the best thin and light gaming laptops, though it could easily net a spot on best cheap gaming laptops. That said, thanks to its very lightweight and 14-inch display, it really is a portable machine that can easily fit into most bags without weighing them down.

As with most other Lenovo gaming laptops, the majority of the port selection is located in the back, which can be inconvenient for some as it requires a bit of reach. Thankfully the back ports are labeled with icons to make locating them easier. 

The major benefit to using three sides for ports is a robust port selection that includes two USB Type-A ports, two USB Type-C ports, an HDMI port, an SD card reader, an audio combo jack, an e-shutter for the webcam, and a charging port. However, it’s disappointing to see an ethernet port missing from the bunch, which is bizarre considering that there was plenty of space to put it in the back.

Opening the laptop up, we have the standard Lenovo laptop keyboard and touchpad, which is certainly not a bad thing. The keys are well-sized and well-spaced with a satisfying snap while the touchpad is responsive and just as snappy as the keyboard. There’s a soft white backlight for late-night typing, a more subtle option compared to the glare of RGB.

The specs are solid, with an AMD Ryzen 7 7840HS CPU, up to an Nvidia GeForce RTX 4060 GPU, 16GB RAM, 1TB of storage, and a lovely 14-inch WQXGA+ 120Hz IPS (2880 x 1800) display. The configuration strikes a nice balance between delivering solid performance and keeping the pricing budget-minded. 

As a result, we get some quite competitive benchmark scores that nearly match what the best gaming laptops with much higher price tags put out, something Lenovo has also mastered this generation. And general performance with the best PC games nets some truly impressive results.

Sound quality is also pretty solid, especially since the speaker is located above the keyboard. Audio is clear whether you're streaming movies, listening to music, or gaming, and at high volumes the sound doesn't lose too much fidelity. The webcam is 1080p but of average-at-best quality, requiring great lighting for a clearer image. It comes with a physical e-shutter, which is excellent for privacy and should be standard on any laptop. All in all, this is a very solid win for Lenovo, and for any gamer on a budget, for that matter.

Lenovo Legion 5 Slim 14: Price & availability

keyboard closeup

(Image credit: Future)
  • How much does it cost? Starting at $1,439.99 / £1,399.99 (including VAT) / AU$2,949
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

Pricing is quite good for the Lenovo Legion 5 Slim 14, starting out at $1,439.99 / £1,399.99 (including VAT) / AU$2,949 though, at the time of this writing, there’s a discount making it about $200 cheaper. My review unit is a bit pricier at $1,634.99 / £1,630 (including VAT) / AU$2,998 thanks to the RTX 4060 replacing the RTX 4050 in the base configuration. The most expensive configuration will run you $1,884.99 / £1,780 (including VAT) / AU$3,397, which is still lower than a lot of competing gaming laptops, thanks to keeping the RTX 4060.

The UK version has similar pricing to the US, though there's an interesting difference in that you can opt for no operating system, which saves you £90 off the cheapest configuration. The Australian version, like the US one, doesn't come with that option.

The Legion 5 Slim 14 compares best in price with the Origin EON 16SL starting at $1,949 / £1,763.64 (around AU$3,050) and one of Lenovo’s other offerings, the Lenovo Legion 5i (2022) starting at $1,099.99 / £1,293.49 / AU$2,349. The latter is a truly budget option while the former offers similar specs and pricing, really boiling down to which aesthetics you prefer.

  • Price score: 5 / 5

Lenovo Legion 5 Slim 14: Specs

specs stickers

(Image credit: Future)

The review unit I received comes with the following configuration: an AMD Ryzen 7 7840HS CPU, an Nvidia GeForce RTX 4060 GPU, 16GB RAM, 1TB of storage, and a 14-inch WQXGA+ 120Hz IPS (2880 x 1800) display.

The Lenovo Legion 5 Slim 14 doesn’t come in separate models, allowing buyers to somewhat configure the CPU, GPU, RAM, and storage space. The US and Australian versions let you choose between 16GB and 32GB of RAM, while the UK model only has 16GB. Another oddity with the Australian version is that if you choose the Ryzen 9 7940HS CPU, then you can only choose the 16GB RAM option.

  • Specs score: 4.5 / 5

legion logo

(Image credit: Future)

Lenovo Legion 5 Slim 14: Design

  • Solid port selection
  • Excellent display
  • Great keyboard and touchpad

Lenovo tends to use a stainless steel-type look for most of its gaming machines, with the manufacturer's logo machine-carved into the side. It gives the laptops a very distinctive and appealing aesthetic, which really works to stand out against the sea of boring black laptops that gamers so often get saddled with. 

Though it says Slim in the name, it doesn't look especially thin compared to other laptops in that particular market, but the weight and 14-inch display size make up for it, making it quite manageable to carry around.

Port selection is solid, including two USB Type-A ports, two USB Type-C ports, an HDMI port, an SD card reader, an audio combo jack, an e-shutter for the webcam, and a charging port. 

That's pretty much every option except an Ethernet port, which makes little sense: this is a gaming laptop, where a stable internet connection is paramount for competitive play, and with ports already on the back there was plenty of space to squeeze one in.

Image 1 of 8

black gaming laptop

(Image credit: Future)
Image 2 of 8

closeup of black gaming laptop

(Image credit: Future)
Image 3 of 8

black gaming laptop

(Image credit: Future)
Image 4 of 8

back of black gaming laptop

(Image credit: Future)
Image 5 of 8

back of black gaming laptop

(Image credit: Future)
Image 6 of 8

side of black gaming laptop

(Image credit: Future)
Image 7 of 8

side of black gaming laptop

(Image credit: Future)
Image 8 of 8

webcam closeup

(Image credit: Future)

Its display is the crowning 14-inch jewel, with a WQXGA+ 2.8K (2880 x 1800) resolution, a 120Hz refresh rate, up to 500 nits of brightness, 100% DCI-P3 color gamut coverage, and HDR support. The result is a screen that showcases any game with a lovely depth of color and brightness. The color gamut also means that creatives can use this laptop effectively.

The keyboard is the same reliable Lenovo one, meaning wide keys that have a nice snappiness and a white backlight that’s much easier on the eyes while still useful for late-night typing. The trackpad is also the same quality type, with an equally snappy feel and high responsiveness.

Unfortunately, the webcam is also more of the same, needing good office-level lighting to make your image look presentable. It's fine for conference calls, but grab one of the best webcams if you need to stream. 

The sound quality is also very good thanks to the speaker located above the keyboard, allowing you to hear the various layers of instrumental music, as well as vocals, and sound design. Ideal for gaming for sure.

  • Design score: 4.5 / 5

black gaming laptop

(Image credit: Future)

Lenovo Legion 5 Slim 14: Performance

  • Solid all-around performance
  • Doesn't play nice with ray tracing
Lenovo Legion 5 Slim 14: Benchmarks

Here's how the Lenovo Legion 5 Slim 14 performed in our suite of benchmark tests:

3DMark: Night Raid: 49,967; Fire Strike: 24,906; Time Spy: 10,540; Port Royal: 5,951
GeekBench 5: 1,951 (single-core); 11,595 (multi-core)
Cinebench: 16,671 (multi-core)
Total War: Warhammer III (1080p, Ultra): 87 fps; (1080p, Low): 207 fps
Cyberpunk 2077 (1080p, Ultra): 94 fps; (1080p, Low): 122 fps
Dirt 5 (1080p, Ultra): 63 fps; (1080p, Low): 95 fps
25GB File Copy: 19.3
Handbrake 1.6: 4:26
CrossMark: Overall: 1,886 Productivity: 1,834 Creativity: 1,987 Responsiveness: 1,753
Web Surfing (Battery Informant): 7:46:44
PCMark 10 Home test: 7,871
TechRadar Movie Battery test: 4 hours and 33 minutes

General performance is impressive, especially for its cheaper price point. Benchmark scores are comparable to more expensive gaming laptops including for 3DMark, PCMark10, Cinebench, and Geekbench. It shows that you don’t need tricked out specs in order to deliver great performance and that affordable laptops can offer a lot to even more hardcore and professional gamers.

Its results in non-gaming benchmarks like the 25GB File Copy, Handbrake, and CrossMark tests are quite good, pairing well with its high color gamut coverage. Creatives can rest assured that this laptop can double as an editing and creative machine.

The AMD CPU gives the Lenovo Legion 5 Slim 14 an edge in terms of more CPU-heavy tasks, while the RTX 4060 runs even AAA games at high settings like a dream. If you need ray tracing and resolutions higher than 1080p, you need to prepare for those framerates to drop significantly, with the worst offender being Cyberpunk 2077.

Ventilation is solid during productivity work and normal gaming sessions, though not as excellent as I would have expected considering how much Lenovo brags about the cooling system. According to the manufacturer, it features phase-change thermal compounds, hybrid copper heat pipes, air intake and exhaust systems, and a 12V dual liquid crystal polymer fan system. But temperatures can still get quite hot on the underside.

  • Performance score: 4.5 / 5

Lenovo Legion 5 Slim 14: Battery

battery info

(Image credit: Future)
  • Works well with normal use
  • Not so much with video streaming

The battery life is very interesting, as the laptop scored very well in the web surfing test, nearly netting eight hours. I also found that it lasts for about seven hours of daily productivity work. 

However, on the TechRadar movie test, it managed only four and a half hours. Those are extremely inconsistent results at opposite ends of the spectrum, though still better than most other gaming laptops.

  • Battery score: 4 / 5

Should you buy the Lenovo Legion 5 Slim 14?

Buy it if...

You want an easy-to-carry laptop
Though it's a little thick to be called "Slim", the 14-inch screen and uncumbersome weight still make it extremely easy to carry around.

Don't buy it if...

You want a better webcam
The webcam in this is pretty average, especially if you plan on using it to livestream.

Lenovo Legion 5 Slim 14: Also consider

If my Lenovo Legion 5 Slim 14 review has you considering other options, here are two more laptops to consider...

How I tested the Lenovo Legion 5 Slim 14

  • I tested this laptop for about two weeks
  • I tested the gaming performance as well as productivity work
  • I used a variety of benchmark tests as well as high-end PC games to test this laptop.

To test out the Lenovo Legion 5 Slim 14 I used a full suite of benchmarks to rank both CPU and GPU performance, with more emphasis on the latter. I also tested out frame rate performance on max settings with a range of high-end PC games like Cyberpunk 2077, Dirt 5, Marvel’s Spider-Man Remastered, and more.

This laptop is primarily meant for gaming, specifically hardcore gaming. Thanks to its GPU and wide color gamut, it can also be used for creative and editing projects, and its CPU means productivity work is a breeze as well.

I’ve tested out many laptops, especially gaming ones, which gives me plenty of experience with properly benchmarking them. I also have extensive knowledge of testing out general performance such as framerate and graphics.

Read more about how we test

First reviewed October 2023

Lenovo Legion Slim 7i (2023) review: a purposefully improved update
4:30 pm |

Author: admin | Category: Computers Computing Gadgets Laptops | Comments: Off

Lenovo Legion Slim 7i (2023): Two-minute review

In our review last year, the Lenovo Legion Slim 7i was praised for outstanding performance within a chassis only a bit above five pounds, though a few compromises were made to make it one of the best gaming laptops available. The latest iteration of the Lenovo Legion Slim 7i brings adequate updates in both power and performance, with some caveats including the uncomfortably huge packed-in power brick. Meanwhile, the gaming laptop manages to weigh a bit less than its predecessor, coming in at a little under five pounds.

Though the Legion Slim 7i may struggle at native 4K resolutions, it shines exceptionally in the 1440p range. This means graphically intensive games like Cyberpunk 2077 and Starfield run wonderfully. Be mindful that those kinds of performance-pushing tasks will have the cooling fans running noticeably louder than they already do during menial tasks like web browsing. Battery life is slightly above average but won't last more than 7 hours, which is enough for a coast-to-coast flight. It also means the gaming laptop won't survive playing games with higher visual fidelity without the power supply.

Packed into this year's Legion Slim 7i is a 14-core 13th-gen Intel i9, 16GB of DDR5 RAM, an Nvidia RTX 4070, and 1TB of SSD storage. There's also an impressive audio/visual package thanks to the beautiful 16-inch 2560 x 1600 display offering a 240Hz refresh rate and great Harman speakers. Other carryovers from the previous version include an individually per-key-lit RGB keyboard and a generous port selection. Packed-in apps like Lenovo Vantage are totally fine for customizing one's experience, but others like Legion Arena and WebAdviser feel like unnecessary bloatware.

Not much has changed significantly with the latest Legion Slim 7i, and that's totally fine. What has changed is that it's incrementally more powerful and lighter, which matters more than anything. Simply put, this is one of the best thin and light gaming laptops released this year.

Lenovo Legion Slim 7i (2023): Price & availability

  • How much does it cost?  Starting at $1,499.99 / £1,820 / AU$2,719 
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

Depending on the territory you're in, there are various pricing options for the Lenovo Legion Slim 7i (2023). In the United States, there are three configurations available alongside the ability to customize a build. The review-spec version I received will run potential buyers around $1,749.

There are more affordable options around $1,499, including one with a 13th-gen Intel i7, an Nvidia RTX 4060, 16GB of DDR5 RAM, and 1TB of SSD storage, with the ability to drop to a 512GB SSD for around $50 less. At the highest $2,211 tier, the customizable build includes everything in the review build alongside the ability to push the RAM up to 32GB and a better 3200 x 2000 display with a 165Hz refresh rate.

In the UK, there's a £1,980 pre-build that features a 13th-gen Intel i9, 32GB of RAM, an Nvidia RTX 4070, 1TB of SSD storage, and the 3200 x 2000 display. The other customizable configurations run between £1,820 and £2,300, with CPU, RAM, storage, GPU, and display options similar to the US. In Australia, whether you go for the AU$2,719 pre-built model or one of the two customizable builds between AU$3,089 and AU$3,799, they all come with the 3200 x 2000 display at a 165Hz refresh rate. The pre-build comes with an Intel i7, 32GB of RAM, an Nvidia RTX 4060, and a 1TB SSD.

The Legion Slim 7i is definitely less expensive than the Razer Blade 14 or the MSI Stealth GS66 while offering similar power. Within the realm of slim gaming laptops, it's safe to call it one of the most approachable when it comes to price.

  • Price score: 4.5 / 5

Lenovo Legion Slim 7i (2023): Specs

My review configuration of the Lenovo Legion Slim 7i (2023) sits in the middle price-wise at $1,749 and features an Intel i9 CPU, an Nvidia RTX 4070 GPU, 16GB of DDR5 RAM, 1TB of SSD storage, and a 16-inch WQXGA display with a 240Hz refresh rate.

The cheapest configuration features an Intel i7 CPU, an Nvidia RTX 4060 GPU, 16GB of DDR5 RAM, 512GB of SSD storage, and the same 16-inch display that comes in the review configuration. Potential buyers with a bit more cash to spare can look toward a customizable version that'll cost around $2,211. This comes with everything in the review configuration in addition to 32GB of DDR5 RAM and a 16-inch 3.2K display with a 165Hz refresh rate.

Outside of the different choices in CPU, GPU, RAM, and SSD storage, one of the most notable differences is the ability to choose between two 16-inch displays: the standard WQXGA (2560 x 1600) panel with a 240Hz refresh rate, or the 3.2K (3200 x 2000) panel at 165Hz. Lenovo also offers a configuration tool to help buyers create the best setup.

Lenovo Legion Slim 7i (2023): Design

  • Lighter than ever despite the power increase
  • Adequate port selection
  • Awesome RGB keyboard that’s beautifully lit

Like its predecessor, the Lenovo Legion Slim 7i (2023) balances both form and function when it comes to overall design. When closed, it's really easy to appreciate the all-metal chassis made from sandblasted aluminum and magnesium. The Legion Slim 7i has a quality finish that doesn't smudge yet feels noticeably tough.

Port placement has changed slightly but feels familiar overall compared to last year's model. On the lower rear are an HDMI port, three USB-A ports, and a power port that connects to a large power brick that's still a bit unwieldy and feels heavier than the laptop itself. For good measure, the rear ports have light-up icons on the top of the laptop's base, as the display itself is offset about an inch in front of them. Some of the smaller changes to the port layout take place on the sides: the audio jack is on the left near the dual USB-C ports, and there's also an SD slot for content creators and a camera shutter for privacy. When closed, the laptop is just under an inch thick.

When opened, the Legion Slim 7i still features the power button/fingerprint scanner sitting in the middle of the Harman speakers. The speakers sound good when listening to music, gaming, or watching video content. However, playing graphically intensive games will have the cooling fans running at full blast, so it's probably best to keep a pair of headphones handy.

The display itself is a joy to look at, with beautifully bold colors, deep blacks, and respectable brightness. Above the display is a 1080p webcam with dual microphones, which is solid for video conferencing and probably streaming with the right backlighting. Below that is the wonderfully lit per-key RGB keyboard, which feels comfortable to use when typing documents or gaming. The trackpad below feels smooth to the touch when moving the cursor around and offers a nice tactile click when pressed.

All of this comes in a slim package that fits easily into a backpack or briefcase; if anything, the power brick is what really hurts portability. Whether you're lying down or using it on your lap, the Legion Slim 7i doesn't feel uncomfortable over long periods.

  • Design score: 4 / 5

Lenovo Legion Slim 7i (2023): Performance

  • Can run most AAA games at native 1440p range with max settings well
  • Runs well with creative based software
  • Don’t expect to make total use of the 240Hz on visually intensive games
  • Cooling fans are incredibly loud
Lenovo Legion Slim 7i (2023): Laptop benchmarks

Here's how the Lenovo Legion Slim 7i (2023) performed in our suite of benchmark tests:

3DMark: Night Raid: 61,244; Fire Strike: 25,797; Time Spy: 12,202
GeekBench 6: 2,653 (single-core); 2,653 (multi-core)
Metro Exodus Enhanced Edition (1080p, Extreme): 49.11 fps; (1080p, High): 93.73 fps
Cyberpunk 2077 (1080p, Ultra): 88.23 fps; (1080p, Low): 125.12 fps
Dirt 5 (1920x1200, Ultra): 103.90 fps; (1920x1200, Low): 213.00 fps
25GB File Copy: 2006.553812
Handbrake 1.6: 4:28
CrossMark: Overall: 1,948 Productivity: 1,908 Creativity: 2,044 Responsiveness: 1,794
Web Surfing (Battery Informant): 7 hours 01 minute
PCMark 10 Gaming Battery Life: 59 minutes

As mentioned previously, the Lenovo Legion Slim 7i (2023) manages to pack more performance into an even smaller case than the previous version. When it comes to general computing tasks, this gaming laptop can handle dozens of Google Chrome tabs without any slowdown or stuttering and can open apps from Tidal to the Xbox app instantly. Watching 4K video and listening to hi-fi quality audio wasn't a problem either. Considering the component specs of the Legion Slim 7i, it's interesting how loud the cooling fans can get even while running applications that take significantly less processing power than performance-pushing games.

When it comes to games, the lightweight gaming laptop can play pretty much all of the best AAA games at native 1440p with admirable frame rates. Cyberpunk 2077 managed 88 frames per second on average during benchmark tests, and adding ray tracing into the mix only dropped that by around ten frames, which still kept it above 60fps. Though Metro Exodus Enhanced Edition could only hit 49fps, that's still above 30fps and perfectly playable. Providing the highest frame rate was Dirt 5, which pushed out 103fps at Ultra settings.

The good thing about these frame rates is that they can be improved via Nvidia's Deep Learning Super Sampling (DLSS), which all of the games mentioned support. Even so, don't expect to make full use of the 240Hz refresh rate; the only game that got into that ballpark was Dirt 5 at low settings.
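To give a sense of why DLSS helps, the sketch below estimates the internal render resolution for the 2560 x 1600 panel at a few commonly cited DLSS scale factors. The exact factors vary by game and DLSS version, so treat these as rough approximations rather than Nvidia-confirmed numbers.

```python
# Approximate internal render resolutions under DLSS upscaling (illustrative sketch only).

NATIVE = (2560, 1600)  # the Legion Slim 7i's WQXGA panel

# Commonly cited per-axis scale factors; actual values depend on the game and DLSS version.
DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
}

native_pixels = NATIVE[0] * NATIVE[1]
for mode, scale in DLSS_MODES.items():
    w, h = round(NATIVE[0] * scale), round(NATIVE[1] * scale)
    saving = 1 - (w * h) / native_pixels
    print(f"{mode}: ~{w} x {h} rendered internally (~{saving:.0%} fewer pixels than native)")
```

Rendering fewer pixels internally is where the extra frame-rate headroom comes from, which is why DLSS can push these titles past the native-resolution numbers above.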

Adobe Creative Suite users will have much to celebrate with the Legion Slim 7i between gaming sessions. Photoshop didn't slow down when playing around with 4K-resolution photos and extra layers, and exports in Premiere Pro were quick, with a one-minute video taking less than a minute to finish.

  • Performance score: 4.5 / 5

Lenovo Legion Slim 7i (2023): Battery

  • General battery life is around 7 hours
  • Charging to full battery takes a bit over an hour

Expectations for battery life on gaming-focused laptops like this aren't necessarily high, as most will only last around an hour of gaming, and the Lenovo Legion Slim 7i (2023) delivers similar results.

During the PCMark 10 Gaming Battery Life test, the gaming laptop only lasted around 59 minutes. General battery life, however, is fairly average for better or worse: the web surfing test had the Legion Slim 7i top out at 7 hours, which is good enough for a coast-to-coast flight. Though that's not ideal, it's still better than competing lightweight gaming laptops.

Charging to full takes a little over an hour, and there are two ways to juice up the gaming laptop. The most obvious is the port in the back that connects to the big power brick. It can also recharge over USB-C at up to 140W, which means you could use a MacBook charger, though performance may be limited while doing so.
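As a rough sanity check on that charge time, here's a quick estimate of what 140W USB-C charging implies. The 99.99Wh capacity and the 0.70 overhead factor are my assumptions for illustration, not figures from Lenovo or from this review.

```python
# Back-of-the-envelope USB-C charge time estimate (assumptions, not measured figures).

BATTERY_WH = 99.99   # assumed battery capacity for this chassis; confirm against Lenovo's spec sheet
USB_C_WATTS = 140    # maximum USB-C PD input mentioned above
OVERHEAD = 0.70      # rough guess covering conversion losses and the slower top-off near 100%

ideal_hours = BATTERY_WH / USB_C_WATTS
estimated_hours = BATTERY_WH / (USB_C_WATTS * OVERHEAD)

print(f"Ideal, no losses: {ideal_hours:.2f} hours")       # ~0.71 hours
print(f"With overhead:    {estimated_hours:.2f} hours")   # ~1.02 hours, in the ballpark of 'a little over an hour'
```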

  • Battery score: 4 / 5

Should you buy the Lenovo Legion Slim 7i (2023)?

Buy it if...

You want a gaming laptop that doesn’t weigh a lot
Large power brick aside, the Legion Slim 7i weighs under five pounds and measures less than an inch thick when closed.

You require significant performance power
Size be damned, the gaming laptop has a powerful combo of an Intel i9, 16GB of DDR5 RAM, and an Nvidia RTX 4070 that delivers great performance across many visually impressive AAA games.

Don't buy it if...

You want a laptop that is quieter
Even when using web browsers or music streaming apps, the cooling fans can get extremely loud. 

Lenovo Legion Slim 7i (2023): Also consider

If the Lenovo Legion Slim 7i (2023) has you considering other options, here are two more laptops to consider...

How I tested the Lenovo Legion Slim 7i (2023)

  • I tested this over three weeks
  • It was used for general and creative tasks alongside gaming
  • Games played include Cyberpunk 2077, Starfield, Hi-Fi Rush and Forza Horizon 5

During my three weeks with the Lenovo Legion Slim 7i, my time was split between using Google Chrome for various tasks, gaming, and creative software. I used the laptop both at home and in office spaces, and it was plugged in most of the time outside of a few occasions.

To see how far the gaming laptop could go performance-wise, I tested some of the most demanding games on it, including Forza Horizon 5, Cyberpunk 2077, Starfield, and Hi-Fi Rush. I also used Adobe Photoshop and Premiere Pro to try out how well it would work for creatives.

I have spent the past several years writing dozens of features on PC gaming for TechRadar, ranging from reviews of various components and hardware to editorials exploring PC gaming culture at large.

Read more about how we test

First reviewed September 2023

Corsair M75 AIR review: a solid gaming mouse that falls short of perfection
4:00 pm | October 12, 2023

Author: admin | Category: Computer Gaming Accessories Computers Computing Gadgets Gaming Computers | Comments: Off

Corsair M75 AIR: Two-minute review

The Corsair M75 AIR is another entry in the premium gaming mouse market, with its main draw being its super light weight: only 60g, which is absolutely unreal. And yes, you feel that near-airiness as you use it. Compared to the 86g of the Alienware AW720M, the difference is night and day. The only other gaming mouse I can recall having a similar weight is the Razer DeathAdder V3 Pro at 64g.

It boasts an incredibly high maximum DPI of 26,000, which should be more than enough to satisfy even the most demanding gaming mouse enthusiasts. It also features an excellent polling rate of 2,000Hz (0.5ms), 650 IPS tracking, and up to 50G acceleration.
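To put those headline numbers into perspective, here's a quick back-of-the-envelope sketch. The 2,000Hz and 26,000 DPI figures come from the specs above, while the 1-inch swipe is purely an illustrative assumption.

```python
# Rough arithmetic behind the M75 AIR's headline specs (illustrative sketch, not Corsair's data).

POLLING_RATE_HZ = 2000   # position reports per second, per the spec sheet
MAX_DPI = 26000          # counts per inch at the maximum sensitivity setting

# A 2,000Hz polling rate works out to one report every 0.5ms, matching the quoted figure.
report_interval_ms = 1000 / POLLING_RATE_HZ
print(f"Report interval: {report_interval_ms} ms")

# At 26,000 DPI, a hypothetical 1-inch swipe generates 26,000 counts of movement data.
swipe_inches = 1.0  # illustrative assumption, not a measured value
print(f"Counts from a {swipe_inches:.0f}-inch swipe at max DPI: {int(MAX_DPI * swipe_inches)}")
```

In other words, the 0.5ms figure is simply the inverse of the 2,000Hz polling rate rather than a separate spec.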

While testing out this mouse on first-person shooters like Call of Duty: Modern Warfare II and Cyberpunk 2077, any delay from the right and left click switches was completely unnoticeable, as it was for the other buttons I mapped. Between the responsive switches and the feather weight, this mouse is tailor-made to eat competitive first-person shooters for breakfast.

The only downside to its physical build is its 100% PTFE skates, which unfortunately don't work on metal surfaces. Most desks are made of wood, and the mouse performs exceptionally well on those, so the metal issue shouldn't impact most gamers, but it's still an odd limitation.

There are two connectivity options: 2.4GHz wireless and Bluetooth. The former requires both a dongle and a software installation, as it's meant for hardcore gaming, while the latter is for everyday productivity and casual use. It would have been nice if the wireless mode didn't require an installation, as is the case with some other gaming mice on the market; that requirement hurts its chances at claiming the best wireless gaming mouse crown.

black gaming mouse

(Image credit: Future)

Though the weight takes some getting used to, it’s a well-built and practical mouse that can take some falls without damage. Its shape is suitable for most grip types, though the claw grip feels the most natural to use for me due to its egg shape. I’m not a huge fan of the feel of its paint job, however, as it’s rather coarse, but it does allow for better gripping even when your hands are sweating.

It only comes in black, and there are no RGB lights, which, depending on who you ask, is either a supreme relief or a terrible omission. I fall into the former camp: unlike on a keyboard or the chassis of a laptop, you can't even see a mouse's RGB lighting while gaming, so it ends up only eating away at your battery.

There aren’t any shortcut buttons on the bottom of the mouse to change the DPI from its default of 1200, meaning all customization options are done through Corsair’s software. You can create unique profiles with different settings for your mouse, which is handy to switch between depending on your current mouse needs.

Battery life is great going by Corsair's own claims, with up to 45 hours over the wireless connection and up to 100 hours over Bluetooth. While actually measuring that is impractical, I've been using the mouse over Bluetooth on the same charge for a week now.
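Those claims roughly line up with my week of use. The sketch below simply converts Corsair's quoted hours into days at an assumed eight hours of use per day; the daily usage figure is my assumption, not Corsair's.

```python
# Convert Corsair's quoted battery figures into rough days of use (illustrative only).

CLAIMED_HOURS = {"2.4GHz wireless": 45, "Bluetooth": 100}  # Corsair's own claims
HOURS_PER_DAY = 8  # assumed daily usage; adjust to taste

for mode, hours in CLAIMED_HOURS.items():
    print(f"{mode}: ~{hours / HOURS_PER_DAY:.1f} days at {HOURS_PER_DAY} hours/day")
# Bluetooth works out to ~12.5 days, consistent with a week-plus on a single charge.
```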

One major gripe I have with the M75 AIR is that it could have easily been an ambidextrous mouse had Corsair just put two more buttons on the right side of it, similar to the way Dell designed the Alienware AW720M. This not only would have given the mouse more buttons to program but would have allowed left-handed gamers the opportunity to use said mouse. Such a simple fix would have elevated this mouse to not only one of the best gaming mice but one of the best mice in general.

Corsair M75 AIR: Price & availability

black gaming mouse

(Image credit: Future)
  • How much does it cost? $149.99 / £139.99 / AU$249
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The Corsair M75 AIR gaming mouse is available in the US, UK, and Australia, with a suggested retail price of $149.99 / £139.99 / AU$249, making it quite the premium option. 

It's pricier than the Alienware AW720M, which has an MSRP of $129.99 / £124.99 / AU$151.80, but comes in a bit cheaper than the super-premium Razer DeathAdder V3 Pro (MSRP $149 / £149 / AU$279), though it lacks many of the bells and whistles that typically justify this kind of price. 

It's an excellent quality mouse; however, that doesn't change the fact that you can still purchase a similar gaming mouse like the MSI Clutch GM51 for cheaper, or even the Cougar Airblader Tournament if you don't mind a super-budget option.

Corsair M75 AIR: Specs

black gaming mouse

(Image credit: Future)

Should you buy the Corsair M75 AIR?

Buy it if...

You want an ultra-premium gaming experience
The specs in this mouse are incredible: ridiculously high DPI and polling rates coupled with high tracking and max acceleration. The software is also quite good for programming the buttons.

You want high-end build quality
It’s lightweight yet has good heft, and feels natural to use during even the most intense gaming sessions.

Don't buy it if...

You don’t have money to burn
Like most premium gaming mice, this one is very expensive so if you’re on a budget, it’s best to look for cheaper options instead.

You want more programmable buttons
Five programmable buttons aren't really a lot, especially since it could have been made ambidextrous if it just had two more on the right.

Corsair M75 AIR: Also consider

How I tested the Corsair M75 AIR

  • I used the Corsair M75 AIR for about two weeks
  • I tested it out using first-person shooters as well as for work
  • I used the Corsair software to test out various settings

I used the Corsair M75 AIR in my home office for extended periods of time. I tested out various settings like DPI and customizing button layouts, as well as how well the mouse's ergonomics felt using it in both claw and palm grips and how it held up in right-handed use.

To further test out its gaming capabilities, I played first-person shooters like Call of Duty: Modern Warfare II and Cyberpunk 2077 to see how quickly and efficiently I could move and shoot. I also played third-person shooters and action games to test how flexible this mouse is.

I've reviewed a number of mice, both gaming and productivity, in my career. This allows me to know what to look for on an individual basis as well as compare the performance of this mouse to other ones I reviewed in the past.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed October 2023
