Press paws on your Razer Kraken Kitty V2 BT purr-chase for now. I get why that may seem hard; this gaming headset is adorable and completely irresistible. It also has its share of appealing features beyond its feline design, many of which make it a great option for gaming on the go.
However, it may not be for you if you’re more discerning about audio quality.
Don’t get me wrong here; I can get a bit snobby about audio quality myself, and despite its performance shortcomings, which I’ll get to later in this review, I don’t mind this gaming headset. I probably wouldn’t call it one of the best PC gaming headsets I’ve tested, but it has its merits – the biggest being convenience. And I am an absolute sucker for all things pink and all things cat.
So don’t knock it just yet because you’re used to great-sounding audio devices. Instead, read this review first before hitting buy.
(Image credit: Future / Michelle Rae Uy)
Design-wise, of course, the Razer Kraken Kitty V2 BT is winning. Who among us doesn’t love cats, after all? (If you hate cats, you’re a monster, and I don’t want to be friends!) And that pink-and-gray colorway (or as Razer calls it, Quartz) is just the thing if you want to veer away from that black aesthetic that seems to dominate the gaming world. Adding to its appeal are customizable RGB lighting zones on the cat ears and the earcups.
(Image credit: Future / Michelle Rae Uy)
This wireless gaming headset isn’t all looks either. It feels solid and made of quality materials, with a textured finish on the headband, yokes, and parts of the earcups that keeps scratches to a minimum, as well as plush cushioning on the earpads and the underside of the headband for comfort.
That said, as a lower mid-range gaming headset, it shows noticeable signs of corners being cut. While the earcups can be tilted up and down, the yokes don’t have any articulation, which slightly affects comfort. And the leatherette used is neither the softest nor the most breathable. The latter doesn’t bother me much, but the former can feel like an inconvenience in some instances – like when I need to free one ear by moving the earcup off to the side.
While great for gaming on your desktop PC, the Kraken Kitty V2 BT is really designed for gaming and media consumption on the go – whether that’s on your laptop (no need to use up any of your precious ports), a portable gaming console like the Nintendo Switch, or your smartphone. It has Bluetooth 5.2 connectivity only and a built-in (rather than boom) mic, is fairly lightweight – especially considering it’s got two kitty ears attached – and comes with uncomplicated physical controls.
(Image credit: Future / Michelle Rae Uy)
When I say uncomplicated, I mean it’s only got a volume dial and a multi-function button that acts as the power, Bluetooth pairing, media, and call button. You just have to remember the presses and holds for every function. Luckily, it’s all so intuitive – single press for play/pause and accept/end call; double press to skip a track; triple press for the previous track; and so on – that you’ll have it memorized after a couple of uses.
I only wish Razer had also added a way to customize the RGB lighting on the fly. You can only do so by connecting the headset to your smartphone and personalizing it via the Razer Audio mobile app. The app, by the way, also gives you access to a 10-band EQ, five different sound presets, and a Gaming Mode toggle for low-latency audio.
Really, the only reasons why you’d think it’s not for on-the-go use are the fact that it’s not foldable and the absence of a travel pouch. These omissions are a tad purr-plexing given that Razer has positioned this as more of a headset to be used everywhere than an actual gaming headset to be used at home.
(Image credit: Future / Michelle Rae Uy)
Don’t expect high-quality audio on the Razer Kraken Kitty V2 BT. That isn’t to say that it sounds bad, because it doesn’t. And it’s got a bright sound profile that I think a lot of people would appreciate.
At the same time, the high end is not very crisp and the mids are not very prominent, resulting in audio that is not very rich and at times piercing to the ears. That’s whether I’m playing Hogwarts Legacy on my PC, playing Cooking Madness on my phone, or listening to my favorite tunes. The low end is also underwhelming, even when you switch to Razer’s Enhanced Bass sound profile.
The issue that bothers me most of all, however, is the clipping. When there are spikes in volume in the audio, even when the headset volume itself isn’t all the way up, I hear a popping sound. There’s definitely digital distortion happening, possibly because the headset doesn’t have a lot of headroom.
The integrated beamforming mics aren’t quite purr-fect either. They have no issues with plosives or sibilance, and you’ll come through loud and clear. However, you’ll also sound a little distorted and not particularly crisp. Plus, there’s no background noise rejection – friends could hear me typing, snapping my fingers, or playing music while I was chatting with them.
At least the battery life is decent. It’s not the longest I’ve seen, but you’re getting up to 40 hours of playtime, especially if you turn off that bright RGB lighting.
Razer Kraken Kitty V2 BT: Price & availability
How much does it cost? $99.99 / £99.99 (about AU$190)
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
I wouldn’t call the Razer Kraken Kitty V2 BT a pricey gaming headset. One of the best wireless gaming headsets out there in terms of design, it sits nicely in the mid-range market at $99.99 / £99.99 (about AU$190). However, I also expected better audio performance at that price.
If you want a more elevated listening experience and can spend a little more, I’d go for the Razer BlackShark V2 Pro (2023). It’s got lots of volume, immersive sound, an amazing mic, and even better battery life. For only a bit more than the Kraken Kitty V2 BT, the Logitech G733 Lightspeed Wireless RGB will also deliver a richer sound.
Value: 3.5 / 5
Razer Kraken Kitty V2 BT: Specs
Should you buy the Razer Kraken Kitty V2 BT?
(Image credit: Future / Michelle Rae Uy)
Buy it if...
You love fun gaming peripherals to spruce up your setup With its kitty ears and pink design, the Razer Kraken Kitty V2 BT is like catnip to gamers who prefer fun gaming accessories.
You want a gaming headset you can take anywhere Razer had portability in mind when designing this gaming headset, with its Bluetooth connectivity, multi-function button, light weight, and more.
Don't buy it if...
You need better audio quality This doesn’t sound bad, but if you’re a more discerning listener, you’ll find a lot of flaws in its audio and mic performance.
Razer Kraken Kitty V2 BT: Also consider
How I tested the Razer Kraken Kitty V2 BT
I tested the Razer Kraken Kitty V2 BT for a week
I used it with my PC, my iPhone, and my laptop
I used it for gaming and for listening to music
Testing the Razer Kraken Kitty V2 BT for a full week, I used it as my main gaming headset when playing different titles on my desktop PC and on my smartphone. I also used it to listen to music on my laptop and again, on my phone, as well as make phone calls with friends. I also made sure to test its features, including its multi-function button and its battery life.
I’ve been testing, reviewing, and using gaming headsets for years as a freelance tech journalist and now as one of the Computing editors at TechRadar. My years of experience along with my discerning audio tastes make me more than qualified to test and vet these devices for you.
In a way, the Timbuk2 Division Laptop Backpack Deluxe reminds me of Hermione’s beaded handbag in Harry Potter and the Deathly Hallows. You know, the one that’s small on the outside but so big on the inside that she was able to fit a bunch of clothes, potions, books, and even a tent in it.
That isn’t to say the Division Laptop Backpack Deluxe is magical in any way, but the fact that it’s so deceptively small and compact is impressive – and something that someone like me (a tiny person who once drew the concern of an old woman on a train to Budapest because of how massive my pack looked on me) can very much appreciate.
(Image credit: Future / Michelle Rae Uy)
The first time I took this backpack out of its packaging, I was, in all honesty, skeptical. I was about to embark on a month-long trip around Spain and Morocco, and I was planning on using it as my carry-on so I’d have somewhere to put some of my essentials. However, at a mere 10.6 x 5.12 x 17.3 inches (27 x 13 x 44 cm), it didn’t look like it could fit my work devices, my toiletries, and enough clothes to last me a few days in case my checked bag got lost in transit.
So imagine my surprise when I realized it could fit all that and then some – by some, I mean my full-frame mirrorless camera, a portable fan, a rain jacket, and a throw blanket too. And that’s just one of its many, many merits.
(Image credit: Future / Michelle Rae Uy)
Now, I don’t hand out five-out-of-five ratings casually, and I’m not saying the Timbuk2 Division Laptop Backpack Deluxe is THE backpack to rule them all (though it has proven itself among the best laptop bags and best backpacks we’ve tested to date). I think the luggage pass-through in the back could have been slightly bigger – as it is now, you have to put in a little effort to slot it over a luggage handle.
It also could use more pockets for better organization. It only has five, if you don’t count the rear laptop compartment – a front pocket, a smaller mesh pocket in the main compartment, a padded laptop sleeve also in the main compartment, a stretch water bottle pocket on the side, and a tiny slot for a charging cable. Finally, and this might just be me being a little nitpicky, rear loading straps or a bottom compartment for shoes would have been nice.
Still, it’s hard to complain when you’ve got a near-perfect laptop backpack in your hands. And that’s not just because it can fit all your devices and everything you need for a long weekend away while staying beautifully compact.
(Image credit: Future / Michelle Rae Uy)
At only 2.2 pounds (under 1kg), it’s one of the most lightweight premium backpacks we’ve tested here – something you’ll appreciate when you’re already lugging several pounds of weight while trying to catch a train during your daily commute or trudging on hilly streets while dragging your suitcase behind you. Adding to your comfort are those thick, padded straps that help spread the weight around your shoulders and do not dig into your skin, as well as its nicely padded back that is kind to your back – not to mention, your laptop.
(Image credit: Future / Michelle Rae Uy)
The 18L Timbuk2 Division Laptop Backpack Deluxe’s main compartment is expandable, which explains how I was able to fit several days’ worth of clothes in there on top of my toiletries, make-up, and devices, while the external compression straps allowed me to scrunch the backpack down even at full capacity, keeping it looking compact. And while I still think it could use more pockets, the ones it does have allowed me to keep things organized, with my chargers, cables, and small devices in the front, the personal items I needed easy access to in the top mesh pocket inside the main compartment, and the things I needed super quick access to (like my portable fan – a lifesaver when you’re traveling in humid Andalucia in the summer) in the water bottle compartment.
(Image credit: Future / Michelle Rae Uy)
My constant concern with backpacks is pickpockets, but I quickly found out that you need not worry about your things in this one. Made of premium, hearty materials that are, by the way, very easy to clean, this pack is so robust that pickpockets will have a hard time slashing its front or side open. Plus, its front pocket zipper has been thoughtfully installed in a way that makes it tricky for just anyone to slide it open. That can be a double-edged sword, as you may have a hard time opening and closing it when the pack is at full capacity, but I’d rather be inconvenienced a little than have my things stolen.
(Image credit: Future / Michelle Rae Uy)
A surprise feature (to me) was its waterproofing. I didn’t know it was waterproofed until I found myself caught in a heavy downpour for a few hours in Cordoba, Spain. I was forced to walk twenty minutes from the train station to the Airbnb my friend and I had rented, then wait a couple more hours in the rain with nothing but a tree to shelter me, the whole time dreading the fate of my laptop. Luckily, this pack held up its end of the bargain, keeping all my belongings safe and dry. Nary a drop leaked, even through its zippers!
Without a doubt, the Timbuk2 Division Laptop Backpack Deluxe not only delivers as promised but also goes above and beyond its call of duty. I’m supposed to test a few more laptop bags, but if I’m being honest, I’m not sure if I want to swap this one out for others.
Timbuk2 Division Laptop Backpack Deluxe: Price
How much does it cost? $139 / £149 (about AU$220)
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
Available in Eco Black Deluxe, Eco Nightfall, Eco Static, and Eco Titanium colorways, the Timbuk2 Division Laptop Backpack Deluxe will set you back $139 / £149 (about AU$220). And it’s worth every penny! Budget-minded consumers might feel that paying more than $100 / £100 is too much, but trust me, this backpack is going to last you a while and can be used for most of your carrying needs (outside of a cocktail party, a formal event, or a small-bags-only concert).
Plus, compared to other premium packs, it’s actually decently priced. The Mous 25L pack, for example, will cost you $279.99 / £219.99 / AU$430, while the 20L Peak Design Everyday Backpack V2 will set you back $260 / £192 / AU$355. Granted, these have a slightly higher capacity, but that’s still a massive jump in price.
Value: 5 / 5
Timbuk2 Division Laptop Backpack Deluxe: Specs
Should you buy the Timbuk2 Division Laptop Backpack Deluxe?
(Image credit: Future / Michelle Rae Uy)
Buy it if...
You want a spacious backpack that doesn’t look massive While small and compact, the Timbuk2 Division Laptop Backpack Deluxe can fit a lot – enough for a long weekend trip with devices included and definitely more than enough for your daily work commute.
You’re looking for a robust, waterproof backpack This has kept my belongings dry after a few hours of heavy downpour in Spain, and it will certainly do so during your few minutes’ walk from the station or home to your office.
You need one pack for most needs It’s a city commuter backpack, but it’s great as a travel carry-on, as well as for school, events, and even an outdoor movie picnic.
Don't buy it if...
You need more pockets This takes on a more minimalist approach in terms of organization. If you need something with more pockets, look elsewhere.
You want a bigger backpack With an 18L capacity, this is more than enough for most people’s needs. But if you need a bigger capacity pack, explore other options.
Timbuk2 Division Laptop Backpack Deluxe: Also consider
How I tested the Timbuk2 Division Laptop Backpack Deluxe
Tested it for a month
Used it as my work pack and my carry-on during a month-long trip
Put its features through their paces, including its waterproofing capability
Taking the Timbuk2 Division Laptop Backpack Deluxe with me on my month-long trip around Europe, I used it as my carry-on while traveling around Spain and Morocco, and as my work backpack when I would come into the office while in London.
During this time, I was able to really put it to the test, gauging its comfort while lugging it during transfers and moves from one destination to another, its capacity as I stuffed it full, and its durability. I was also able to test its waterproofing after being caught in a downpour for a few hours.
I’ve been testing and reviewing devices and accessories for about 10 years now. Not only do I have plenty of experience with them, but I know what makes the best ones tick and can intuitively tell you which ones are not worth your time and money.
We pride ourselves on our independence and our rigorous review-testing process, offering long-term attention to the products we review and making sure our reviews are updated and maintained – regardless of when a device was released, if you can still buy it, it's on our radar.
The Xreal Air 2 and Xreal Air 2 Pro AR glasses are here to replace the original Xreal Air (formerly Nreal Air) AR specs – and they’re just as much fun to use as the originals.
Design-wise they keep everything that made the Xreal Air good – they’re comfortable to wear for long stretches and look kind of stylish too – though the base model doesn’t change things up much from what’s come before. The Pro model gains new Electrochromic Dimming for its lenses, allowing them to swap between clear and shaded at the literal push of a button – and this feature is the only difference between the regular Xreal Air 2 glasses and the Pro.
Thankfully, there are more changes in the picture and audio departments. Both the Xreal Air 2 and Xreal Air 2 Pro have new Sony 0.55-inch micro-OLED displays that boast 25% higher peak brightness and improved color calibration, allowing them to produce vibrant colors and good contrast in dark scenes.
The AR glasses also deliver more immersive audio through their speakers; however, the bass is still a little weak, and there’s a not-insignificant amount of audio leakage from the open-ear speaker design. If you want to watch something in private, you’ll need some Bluetooth headphones.
Unfortunately, the main issue holding the original Xreal Air glasses back wasn’t their picture quality or audio, it was their value, and the Xreal Air 2 and Air 2 Pro don’t address this issue.
I think the specs are a delight to use, but the $399 / £399 price tag – or $449 / £449 for the Xreal Air 2 Pro – is a massive turn-off.
All you get for this high price is a wearable display for some laptops, phones, and handheld game consoles. If you want a more worthwhile experience from the AR glasses you need to buy an Xreal Beam and a handful of add-on adapter cables, and these can add the best part of $200 / £200 to the total cost.
Here's what the AR glasses look like on my face (Image credit: Future)
At this price, you’ll have spent more than you would on something like the Meta Quest 3 (which I gave five stars in our Meta Quest 3 review). Yes, a VR headset isn’t the same as these Xreal glasses, but it’s an awesome XR wearable, and I think most people would find it offers way more bang for their buck – it’s not just a display, and the Quest 3 does plenty without you needing to pick up a bunch of not-so-optional extras.
If you have a spare $399 / £399 lying around and you want to pick up some fun tech then you could do a lot worse than the Xreal Air 2, but for the money, you could do better too.
Xreal Air 2: Price and availability
The Xreal Air 2 and Air 2 Pro are available to preorder for $399 / £399 and $449 / £449 respectively from Xreal.com and Amazon. There’s no firm release date yet, but Xreal has told us they’ll be shipping to US and UK customers in November.
The two models are fairly similar, but the cheaper Xreal Air 2 loses out on Electrochromic Dimming. This exclusive Xreal Air 2 Pro feature allows you to dim the lenses between three presets – you can either go for fully transparent where you can see the real world more clearly, blacked-out immersive mode where the real world is (almost) entirely blocked out, or a half-way point between the two.
It’s certainly neat, but the cover that comes with both models is still the best option for blocking out visual distractions while you immerse yourself in what you’re watching. As such, spending an extra $50 / £50 on the Pro glasses isn’t necessary for most people.
Xreal Air 2: Design
Comfortable, lightweight design
Improved lens cover
New color options
I’ve tested a fair few AR smart glasses like the Xreal Air 2 and Air 2 Pro, and based on my experience they’re one of the better-designed options on the market.
Here are the Xreal Air 2 Pro's displays (Image credit: Future)
As with other smart glasses – like the Ray-Ban Meta Smart Glasses or Rokid Max AR glasses – these specs aren’t noticeably larger than a regular pair of specs. The only telltale signs they aren’t regular glasses are the inner displays that sit behind the lenses, and a USB-C port at the end of the left arm for attaching a cable between the glasses and another device.
They’re pretty lightweight too. The Xreal Air 2 comes in at 72g and the Air 2 Pro at 75g. The specs also come with a few easily swappable nose clips, which make it easy to adjust the position and fit of the AR gadget. Those of you who wear glasses will appreciate the free optional frame attachment that lets you equip the AR glasses with prescription lenses – though you’ll have to pay extra for the actual lenses.
The Xreal Air 2 glasses are comfy to wear for extended periods for another reason. The biggest issue with some rival devices is that the bridge of the glasses (which touches your face just above your nose) can get annoyingly hot – sometimes after barely five minutes of use.
That’s not the case with Xreal’s specs. Even an hour in I’m happy to keep using them for as long as I can.
The only significant fault I can find with the glasses is that they lack volume controls. Instead, the only buttons on the glasses allow you to change brightness levels – though I’ve never found a reason to set them at anything less than max.
If you pick up the Xreal Air 2 Pro model you’ll find an additional button for adjusting the Electrochromic Dimming – a feature that affects how shaded the front lenses are. As mentioned above, you can set the lenses to transparent, blacked-out, or somewhere in between.
This feature is certainly neat but I personally prefer to use the case cover that comes with both models when I want to immerse myself. The cover not only helps to block out more light from the front, but has plastic parts that extend underneath the lenses to further block out annoying light and reflections – marking an upgrade from the cover used by the original Xreal Air.
The Xreal Air 2 Pro doesn't look as yellow with the cover on (Image credit: Future)
The only downside of the cover is that it hides the kaleido decal that I applied to the specs. I’m not entirely sure if this optional sticker kit comes free with every pair of Xreal Air 2 glasses, just the Pro model, or was simply an extra in the reviewer’s kit, but it adds a fun vibrancy to the AR glasses and makes them look a little less intimidating – wearable tech (particularly glasses) can make some people feel uncomfortable even if it doesn’t have any cameras.
I got a yellow kit, as you can see from the images in this review, but there are navy, turquoise, blue, pink, and green options as well. You could also skip the hassle of applying a sticker and get the Xreal Air 2 glasses in red by default – the Pro, unfortunately, only comes in black without decals.
Design Score: 4/5
Xreal Air 2: Performance
Impressive HD visuals
Immersive audio
Bass is a little weak, and the sound does leak a fair bit
The Xreal Air 2 Pro glasses offer a solid performance boost over their predecessor.
When it comes to visuals, the Sony 0.55-inch micro-OLED display does an impressive job. Thanks to the 25% boost in brightness the glasses have received (and the calibration Xreal has done to get the specs TÜV Rheinland certified for color accuracy), colors look more vibrant than on the original Xreal Air glasses. The contrast in dark scenes is also pretty darn good, which is to be expected from an OLED screen.
The 100,000:1 contrast ratio and 500-nit brightness might not look like much on paper – they’re what you’d expect from budget-friendly projectors that aren’t all that impressive – but because the glasses aren’t attempting to throw the image across a room, they can use those same specs to deliver a much higher-quality picture.
That said, I do still find the best performance is to be had in a fairly dark space with the cover attached (and the Electrochromic Dimming set to max if you have the Air 2 Pro). They can still function okay in brighter spaces, but you’ll notice a dip in quality – especially if you don’t have the cover with you.
The only disappointment is that these specs still only offer full-HD (1080p) resolution. It’s fine, but 4K visuals would have been appreciated. At least you can benefit from a 120Hz refresh rate if you want to use them for gaming.
Here's what it looks like to see virtual images on the Xreal Air glasses (Image credit: Nreal)
The glasses’ audio performance isn’t as impressive as their picture quality, but the sound is still pretty solid and offers a good level of immersion that will suck you into the action of whatever you’re watching, thanks to the upgraded “cinematic sound system” with spatial audio.
While watching various shows, films, and music videos I found that mid and high-range tones were delivered with good clarity – even when I cranked the volume up a bit there wasn’t noticeable distortion. That said, I found the specs do struggle with bassier tones. The lower-end performance isn’t terrible but it doesn’t have as much force behind it as I would like – which can lead to music feeling less impactful, and some action explosions lacking the intensity you’d get from a more capable setup.
I do like the open-ear design though – taken wholesale from the glasses’ predecessor. It’s perfect for commuting, as you can enjoy your favorite show or movie on the move while still being able to listen out for your stop.
Just watch out for audio leakage – as while the situation is improved on these newer models, much like with the original Xreal Air glasses your audio still isn’t completely private. If someone is sitting or standing next to you while content is playing through the Xreal Air 2 or Air 2 Pro glasses at moderate volumes they’ll be able to hear it.
The only solution is to add a pair of Bluetooth headphones to your setup, but this will have an impact on the battery life of the device they’re connected to. However, if they’re a decent pair – like one of the picks on our best headphones list – then you might find the battery trade-off is worthwhile for the privacy and improved sound quality you’ll experience.
Performance Score: 4/5
Xreal Air 2: Compatibility
Compatible with a range of devices
Xreal Beam and additional adapters are pricey, but feel necessary
The Xreal Air 2 and Air 2 Pro have an identical compatibility list to the original Xreal Air.
Using the included USB-C to USB-C cable, you can use the glasses with a range of laptops, tablets, handheld consoles like the Steam Deck, and phones that support video output through a USB-C port (DisplayPort Alt Mode). Just note that not every USB-C phone offers this feature – for example, the Google Pixel 8 and other Pixel phones don’t support video output via their USB-C port.
For devices lacking a USB-C port or DisplayPort support, you can try the Xreal Beam adapter. This optional add-on, which comes in at $119 (UK price to be confirmed), allows you to wirelessly cast content from an incompatible phone – such as an iPhone with a Lightning port – to the glasses (note that the Google Pixel line still won’t work with the Beam, as those phones can only cast to devices using Google’s proprietary Chromecast tech, not the generic version used by the Beam).
Using another USB-C to USB-C cable you can also connect your Xreal Air 2 glasses to a Nintendo Switch through the Xreal Beam.
The Beam serves as a power source for the glasses too, and will help you enjoy content on your Xreal Glasses for longer – as rather than draining the connected device’s battery you’ll use the Beam’s stored power instead which lasts for around 3 hours.
If you purchase an HDMI to USB-C cable (make sure it’s this way around as most cables on Amazon are USB-C to HDMI, and as I found out they don’t work as intended with the Xreal Air) you can hook your glasses up to a console like a PS5, an Xbox Series X or a docked Nintendo Switch.
The Xreal Beam is a neat but pricey add-on (Image credit: Future)
I just wish the Xreal Air 2 and Air 2 Pro came with more of these cables and adapters in their boxes – as either you have to opt for the relatively pricey adapters it sells as add-ons, or try and navigate the labyrinth of cheaper third-party options on Amazon which may or may not work. Including the cables in the box would not only make things simpler, it would help to make the Air 2 glasses feel like better value for money as you no longer need to spend extra on not-so-optional add-ons.
I also would prefer it if the Beam were more like the Rokid Station – effectively a portable Android TV device for the Rokid Max smart AR glasses. You can jerry-rig a setup that uses the Rokid Station and Xreal glasses, but you’ll need a pair of Bluetooth headphones for audio. If the Beam is going to stay a more rudimentary product then, much like I said for the additional connector cables, I’d like to see it included in the Xreal Air 2’s price.
Heck, even if it is given some upgrades I’d like to see it included in the Xreal Air 2’s price. It would make the $399 / £399 price tag a lot easier to swallow.
Should you buy the Xreal Air 2 glasses?
Buy them if…
Don't buy them if...
Also consider
How I tested the Xreal Air 2 Glasses
For this review, I was sent the Xreal Air 2 Pro model to test – and as I mentioned, it’s functionally identical to the regular Xreal Air 2 glasses except that its lenses have Electrochromic Dimming.
To put the displays through their paces, I watched a range of content across streaming services and YouTube. I played movies with bright, vibrant scenes to see how well colors were presented, watched shadowy scenes in films to test the glasses’ contrast capabilities, and played many music videos to get a feel for the speakers’ audio chops.
I also made sure to test the glasses with a range of compatible devices, including a smartphone, a laptop, the Xreal Beam, and games consoles.
Lastly, I swapped between these smart specs and a few others I have lying around from older reviews, including the original Xreal Air glasses, to see how they compare.
Logitech G Pro X Superlight 2 Lightspeed: Two-minute review
Though its name can be a mouthful, the Logitech G Pro X Superlight 2 Lightspeed makes everything else easy-breezy for users. Logitech G took one of the best gaming mice ever and improved on it in many ways, from its weight and charging port to its sensor, while keeping the bits that already made it a crowd-favorite.
Being one of the most lightweight gaming mice isn’t its only accolade – although shedding 3g off its predecessor’s weight is impressive in itself. It’s a speedy and long-lasting mouse too, and its focus on performance and longevity allows for a no-frills look that fits easily into any setup, whether or not you’re into flashy RGB.
As its name implies, this is a wireless gaming mouse that uses Logitech’s Lightspeed wireless technology for connectivity. That means that you won’t have to put up with cables snagging when gaming.
(Image credit: Future / Michelle Rae Uy)
If the Logitech G Pro X Superlight 2 Lightspeed looks very similar to the original, that's because Logitech G has largely kept the same simple, minimalist design. That's alright, in my opinion; it may be the era of maximalism, but that's not necessary here. It also keeps the same USB receiver garage at the bottom to keep that USB receiver safe, with the same round magnetic door that conveniently snaps into place, as well as the same five buttons, the same smooth-to-the-touch matte shell, and the same supportive form that makes it ideal for both claw and palm grippers.
That smooth finish may not be everyone's cup of tea, as some gamers need a bit of texture for proper grip – if only Logitech G had replicated the Razer Viper V3 HyperSpeed's grippy finish. However, stick-on grips that aren't too shabby as an alternative are included in the box. And all five buttons are within easy reach, even for someone like me with small hands, so you can rest assured you're gaming comfortably.
Image 1 of 2
(Image credit: Future / Michelle Rae Uy)
Image 2 of 2
(Image credit: Future / Michelle Rae Uy)
Just like the original, there's no Powercore module (the wireless charging puck) included, even though you can still swap out the magnetic garage door for one to enable wireless charging. If you already have Logitech's Powerplay wireless charging system, then you're all set. If not, you'll have to spend more for that convenience, which isn't great considering this mouse is already expensive.
There are some design improvements thrown in, however. The most welcome of them is the USB-C charging port that replaced the antiquated and frankly annoying microUSB port. And again, its weight has dropped from the original's 63g to just 60g. Finally, apart from the black and white color options, there's also a pink one for those trying to stray from neutral shades.
(Image credit: Future / Michelle Rae Uy)
The Logitech G Pro X Superlight 2 Lightspeed delivers faster and more precise performance than its predecessor. That's all thanks to its 2,000Hz polling rate and a new HERO 2 sensor that offers up to 32,000 DPI (up from 1,000Hz and 25,600 DPI).
Admittedly, those numbers, which you can set and adjust via the Logitech G Hub, are more than what most regular gamers need, but they do mean that this gaming mouse can more than keep up during fast-paced games and battles when you're being overwhelmed by enemies, making it future-proof. While I'm far from a competitive gamer, it proved more than capable when I played CS:GO and Doom Eternal.
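To make those spec figures a little more concrete, here's a quick back-of-the-envelope sketch (my own illustration; only the 2,000Hz and 32,000 DPI figures come from the specs above):

```python
# Rough spec arithmetic: polling rate -> time between position reports,
# DPI -> sensor counts per millimetre of travel.

def report_interval_ms(polling_hz: float) -> float:
    """Milliseconds between reports at a given polling rate."""
    return 1000.0 / polling_hz

def counts_per_mm(dpi: float) -> float:
    """Sensor counts per millimetre of travel (1 inch = 25.4mm)."""
    return dpi / 25.4

print(report_interval_ms(2000))     # 0.5ms between reports, vs 1.0ms at 1,000Hz
print(round(counts_per_mm(32000)))  # ~1260 counts per mm at the maximum DPI
```

In other words, the doubled polling rate halves the time between position updates, which is where the extra responsiveness comes from.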
(Image credit: Future / Michelle Rae Uy)
I do have a couple of minor quibbles, however. Sadly, the zero-additive PTFE mouse feet, while delivering impeccable maneuverability on some surfaces, don't glide easily on others. I found that although they're great on gaming mouse pads and mats, they feel fiddly on bare desks. On top of that, the lower arch of the mouse isn't as supportive for palm grippers; wrist fatigue is real after a couple of hours.
However, the mouse makes up for it in longevity. With up to 95 hours of battery life on a single charge, you're getting almost two weeks of use even at eight hours of gaming per day. That tracks, as I didn't have to recharge once during my two-week testing period.
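That two-week figure checks out arithmetically; a trivial sketch using the quoted 95-hour rating (my own arithmetic, not Logitech's):

```python
def days_of_use(battery_hours: float, hours_per_day: float) -> float:
    """How many days a full charge lasts at a given daily usage."""
    return battery_hours / hours_per_day

print(days_of_use(95, 8))  # 11.875 -> roughly twelve days of 8-hour sessions
```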
Logitech G Pro X Superlight 2 Lightspeed: Price & availability
How much does it cost? $159 / £149 / AU$299
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
All those improvements will cost you. The Logitech G Pro X Superlight 2 Lightspeed is slightly more expensive than its predecessor at $159 / £149 / AU$299. That’s around the same price as the Razer Deathadder V3 Pro, which has a base polling rate of 1,000Hz (upgradable to 4,000Hz with the Razer Hyperpolling wireless dongle), up to 30,000 DPI, and up to 90 hours battery life.
That price tag is admittedly a little steep for a gaming mouse, but if you’re looking for a fast-performing wireless mouse that lasts a while, it’s a great investment. However, if you can’t afford it, the HyperX Pulsefire Haste 2 Wireless offers 1,000Hz polling rate, up to 26,000 DPI, and an impressive 100-hour battery life for just $89.99 / £94.99 / AU$149.
Value: 4 / 5
Logitech G Pro X Superlight 2 Lightspeed: Specs
(Image credit: Future / Michelle Rae Uy)
Should you buy the Logitech G Pro X Superlight 2 Lightspeed?
Buy it if...
You need a fast and long-lasting wireless gaming mouse It delivers fast, accurate performance, making it ideal for competitive and fast-paced gaming.
You prefer a lightweight mouse It’s not the most lightweight wireless gaming mouse, but it is one of the lightest. If you want something light, this is a strong contender.
You hate charging your wireless peripherals This has up to 95 hours of battery life on a single charge, which means you won't have to charge it very often.
Don't buy it if...
You’re on a budget It is a pretty expensive investment, and there are cheaper alternatives available under $100 / £100.
You prefer a gaming mouse with more heft If you’re one of the many gamers who aren’t comfortable with lightweight mice, you should give this one a skip.
Logitech G Pro X Superlight 2 Lightspeed: Also consider
How I tested the Logitech G Pro X Superlight 2 Lightspeed
Tested the mouse for a couple of weeks
Used it for playing PC games as well as for work
I spent two weeks testing the Logitech G Pro X Superlight 2 Lightspeed, dedicating a few hours each night for gaming so I could put this gaming mouse through its paces. In the daytime, I used it as my main mouse for work.
To test it, I played a few games with it, from a couple of fast-paced titles to more leisurely-paced games, getting a feel for its buttons, ergonomics, and performance. I made sure to utilize the G Hub software to customize settings and gave it a full charge before I began testing so I could accurately assess its battery life.
I’ve been testing and reviewing PC gaming peripherals for about 10 years now. Not only do I have plenty of experience with them, but I know what makes the best ones tick and can intuitively tell you which ones are not worth your time and money.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Gamakay LK75 75% is a mechanical keyboard for the truly hardcore, allowing users to customize virtually every part of it. Thanks to the depth of its customization options alone, it's easily one of the best mechanical keyboards out there and could even be personalized into one of the best gaming keyboards or the best keyboard for programmers.
You can swap out the keycaps, replace the switches, and reprogram every key including the knob at the top right. The knob itself is pretty interesting, as it has its own LED screen that displays the time, date, and the OS the keyboard is connected to. You can also change up the RGB lighting through the knob display.
Reprogramming the keys requires the Gamakay software, which you can download from the official website. However, you wouldn’t know that, since the included manual doesn’t mention it at all, which is a bit baffling. The software is quite intricate, offering tons of ways to customize the keys, including function, lighting, and performance.
The knob is also customizable through the software. Its default function is volume control, though I found that this didn’t work. And even though the knob displays the time, it isn’t set correctly until you set it yourself, which is odd since it would make more sense for it to sync with the OS clock automatically on connection.
(Image credit: Future)
The Gamakay LK75’s PC plate and PCB are 'top mount' and, combined with the built-in PET pad, bottom silicone pad, PCB sandwich silicone pad, and spacebar foam, this offers increased stability and reduces both sound and general harshness when typing.
Handling this keyboard can be a bit intimidating at first for those not completely familiar with the intricacies of mechanical keyboards, especially as the Gamakay line of switches doesn’t follow the usual naming conventions, and the abundance of text on the keycaps themselves can be confusing. But at least one aspect is much easier than on other keyboards: changing the switches.
Included with the keyboard is a combo keycap and switch puller. The keycaps come off pretty smoothly, and you can swap them out for any other Gamakay keycaps to change up the aesthetic of the keyboard, though I rather like the orange caps myself. The switches are surprisingly simple to pull out as well, and the sockets accept not only the three-pin Gamakay Planet switches but any other three- or five-pin switches.
Image 1 of 3
(Image credit: Future)
Image 2 of 3
(Image credit: Future)
Image 3 of 3
(Image credit: Future)
The type of switch you install has a huge effect on the sound and feel, though each switch I tried still has a softer impact than those of other mechanical keyboards. The Gamakay Planet switches, the set I tried out, are Mercury (the clickiest linear), Venus (the clickiest tactile), Mars (the heaviest, with the strongest feedback), and Jupiter (the most balanced linear).
They all have the same travel distance of 3.30mm, with the Mercury and Venus switches sharing the same actuation force of 40g. You can feel it in how light and easy they are to type on. My personal favorite is the Venus switches for that reason – providing a nice clickiness and tactile feedback without requiring too much force to activate.
But even the heaviest ones, Jupiter and Mars, have an actuation force of 50g, compared to the 80g of Gateron Greens. There are plenty of other Gamakay switches to choose from, including the Silent switches, and if you’re yearning for something a bit more traditional, Gamakay also offers Gateron switches on its site.
Image 1 of 3
(Image credit: Future)
Image 2 of 3
(Image credit: Future)
Image 3 of 3
(Image credit: Future)
There are three methods of connectivity: wired via a USB-C port, 2.4GHz wireless, and Bluetooth. They’re activated via the FN key plus a number key, as outlined in the thin included manual. All three work well, with the wired connection offering the least latency. I also adore the tiny magnetic slot for storing the dongle, preventing it from being misplaced.
However, there was an odd issue when I tried connecting the keyboard to an all-in-one PC using all three methods - as in, it wouldn’t connect at all. But regular and gaming PCs seemed to work just fine. It's possible this was a one-off glitch, but it may be something to be wary of.
Gamakay LK75 75%: Price & availability
(Image credit: Future)
How much does it cost? $129.99 / £110 / AU$211
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
The Gamakay LK75 75% keyboard is available in the US, UK, and Australia for $129.99 / £110 / AU$211. Gamakay also ships to most other regions, which is even better for those outside the aforementioned three.
Pricing is pretty standard for high-end mechanical keyboards, meaning it’s expensive, though less so than others. At the time of writing, there’s also a sale that shaves off about $10. It easily competes with more notable rivals like the Drop ALT, SteelSeries Apex Pro TKL (2023), and Razer Huntsman V2 TKL while being much cheaper.
Gamakay LK75 75%: Specs
Should you buy the Gamakay LK75 75%?
Buy it if...
You want a great-quality mechanical keyboard It's a solid-quality mechanical keyboard that's heavy and well-built, with nice feeling switches and excellent features.
You want a fully customizable keyboard Every bit of this keyboard is customizable from the keycaps to the switches to the programmable keys themselves.
Don't buy it if...
You need a more budget-minded mechanical keyboard Though it's cheaper than other similar keyboards, its price point is still a hard pill to swallow.
You want a plug-and-play keyboard that works everywhere I did have some issues connecting the keyboard to certain devices, and the Gamakay software is a must-have, so this isn't an easy plug-and-play recommendation.
Gamakay LK75 75%: Also consider
How I tested the Gamakay LK75 75%
I spent about a week testing this keyboard
I tested it both for productivity work and gaming
I used it extensively in a home office environment
I tested the Gamakay LK75 75% keyboard in a home office environment, seeing how well it functioned in both productivity work and gaming. I also carried it around in various bags to test its portability.
The Gamakay LK75 75% is a mechanical keyboard that's meant for extensive use over years. I made sure to quality-test it to see if it held up to those standards, as well as to see how easy it is to pull the keycaps off and how easy it is to reprogram the RGB lighting.
I've tested a wide range of keyboards including mechanical ones, and understand how to properly rate and test them out to ensure that they reach a certain level of quality.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Intel Core i5-14600K is not the kind of processor you're really going to want to upgrade to, despite technically offering the best value of any processor I've tested.
First, the good. This is one of the best processor values you're going to find on the market, no matter what happens with the price of its predecessor. Currently, it has the best performance for its $319 price tag (about £255/AU$465), and AMD's competing Ryzen 5 7600X isn't all that close. If you're looking to get the most bang for your buck today, then the Intel Core i5-14600K is it.
In terms of performance, this isn't a bad chip at all; I'd even say it's a great one if you take its predecessor out of the running, which will inevitably happen as its last remaining stock gets bought up. It doesn't have the performance of the Intel Core i7-14700K, but that's a workhorse chip, not the kind that's meant to power the best computers for the home or the best budget gaming PCs as these chips start making their way into prebuilt systems in the next couple of months.
For a family computer or one meant for general, everyday use, this chip is more than capable of handling whatever you need it for. It can even handle gaming fairly well thanks to its strong single core performance. So, on paper at least, the Core i5-14600K is the best Intel processor for the mainstream user as far as performance goes.
The real problem with the i5-14600K is that its performance is tragically close to the Core i5-13600K's. And even though the MSRP of the Intel Core i5-13600K is technically higher than that of the Core i5-14600K, it's not going to remain that way for very long at all.
The real problem with the i5-14600K, and one that effectively sinks any reason to buy it, is that its performance is tragically close to the Core i5-13600K's.
As long as the i5-13600K is on sale, it will be the better value, and you really won't notice a difference between the two chips in terms of day-to-day performance.
That's because there's no difference between the specs of the 14600K vs 13600K, other than a slightly faster turbo clock speed for the 14600K's six performance cores.
While this does translate into some increased performance, it comes at the cost of higher power draw and temperature. During testing, this chip hit a maximum temperature of 101ºC, which is frankly astounding for an i5. And I was using one of the best CPU coolers around, the MSI MAG Coreliquid E360 AIO, which should be more than enough to keep the temperature in check to prevent throttling.
Image 1 of 13
(Image credit: Future / Infogram)
Image 2 of 13
(Image credit: Future / Infogram)
Image 3 of 13
(Image credit: Future / Infogram)
Image 4 of 13
(Image credit: Future / Infogram)
Image 5 of 13
(Image credit: Future / Infogram)
Image 6 of 13
(Image credit: Future / Infogram)
Image 7 of 13
(Image credit: Future / Infogram)
Image 8 of 13
(Image credit: Future / Infogram)
Image 9 of 13
(Image credit: Future / Infogram)
Image 10 of 13
(Image credit: Future / Infogram)
Image 11 of 13
(Image credit: Future / Infogram)
Image 12 of 13
(Image credit: Future / Infogram)
Image 13 of 13
(Image credit: Future / Infogram)
Looking at the chip's actual performance, the Core i5-14600K beats the AMD Ryzen 5 7600X and the Intel Core i5-13600K in single core performance, multi core performance, and with productivity workloads, on average. Other than its roughly 44% better average multi core performance against the Ryzen 5 7600X, the Core i5-14600K is within 3% to 4% of its competing chips.
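For reference, here's how those percentage gaps are computed (the scores below are made-up illustrative numbers, not my actual benchmark results):

```python
def pct_faster(a: float, b: float) -> float:
    """How much faster score `a` is than score `b`, as a percentage."""
    return (a / b - 1.0) * 100.0

# A hypothetical multi-core score of 24,000 vs 16,600 works out to roughly
# the ~44% gap cited against the Ryzen 5 7600X:
print(round(pct_faster(24000, 16600), 1))  # 44.6
```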
Image 1 of 7
(Image credit: Future / Infogram)
Image 2 of 7
(Image credit: Future / Infogram)
Image 3 of 7
(Image credit: Future / Infogram)
Image 4 of 7
(Image credit: Future / Infogram)
Image 5 of 7
(Image credit: Future / Infogram)
Image 6 of 7
(Image credit: Future / Infogram)
Image 7 of 7
(Image credit: Future / Infogram)
In creative workloads, the Core i5-14600K again manages to outperform the Ryzen 5 7600X by about 31% on average, but it's just 2.4% better than its predecessor, and none of these chips are especially great at creative content work. If you're messing around with family albums or cutting up TikTok videos, any one of these chips could do that fairly easily. For heavier-duty workloads like video encoding and 3D rendering, the Intel chips hold up better than the mainstream Ryzen 5, but these chips really aren't practical for that purpose.
Image 1 of 6
(Image credit: Future / Infogram)
Image 2 of 6
(Image credit: Future / Infogram)
Image 3 of 6
(Image credit: Future / Infogram)
Image 4 of 6
(Image credit: Future / Infogram)
Image 5 of 6
(Image credit: Future / Infogram)
Image 6 of 6
(Image credit: Future / Infogram)
On the gaming front, it's more of the same, though now at least the Ryzen 5 7600X is back in the mix. Overall, the Core i5-14600K beats its 13th-gen predecessor and AMD's rival chip by about 2.1% and 3.2% respectively.
Image 1 of 2
(Image credit: Future / Infogram)
Image 2 of 2
(Image credit: Future / Infogram)
All of this comes at the cost of higher power draw and hotter CPU temperatures, though, which isn't good, especially for getting so little in return. What you really have here is an overclocked i5-13600K, and you can do that yourself and save some money by buying the 13600K when it goes on sale, which it will.
(Image credit: Future / John Loeffler)
Intel Core i5-14600K: Price & availability
How much does it cost? US MSRP $319 (about £255/AU$465)
When is it out? October 17, 2023
Where can you get it? You can get it in the US, UK, and Australia
The Intel Core i5-14600K is available in the US, UK, and Australia as of October 17, 2023, for an MSRP of $319 (about £255/AU$465).
This is a slight $10 price drop from its predecessor, which is always a good thing, and it comes in about $20 (about £15/AU$30) more than the AMD Ryzen 5 7600X, so it's fairly middle of the pack price-wise.
In terms of actual value, as it goes to market, this chip has the highest performance for its price of any chip in any product tier, but only by a thin margin, and one that is sure to fall very quickly once the price on the 13600K drops by even a modest amount.
Intel Core i5-14600K: Specs
Intel Core i5-14600K: Verdict
Best performance for the price of any chip tested...
...but any price drop in the Core i5-13600K will put the 14600K in second place
Not really worth upgrading to with the Core i7-14700K costing just $90 more
Image 1 of 7
(Image credit: Future / Infogram)
Image 2 of 7
(Image credit: Future / Infogram)
Image 3 of 7
(Image credit: Future / Infogram)
Image 4 of 7
(Image credit: Future / Infogram)
Image 5 of 7
(Image credit: Future / Infogram)
Image 6 of 7
(Image credit: Future / Infogram)
Image 7 of 7
(Image credit: Future / Infogram)
Ultimately, the market served by this chip specifically is incredibly narrow, and like the rest of the Raptor Lake Refresh line-up, this is the last hurrah for the Intel LGA 1700 socket.
That means that if you buy a motherboard and CPU cooler just for the 14th generation, it's a one-time thing, since another generation isn't coming to this platform. It doesn't make sense to do that, so if you're upgrading from anything earlier than 12th-gen, it makes much more sense to wait for Meteor Lake to land in several months' time and possibly get something really innovative.
If you're on a 12th-gen chip and you can't wait for Meteor Lake next year, the smartest move is to buy the i7-14700K instead, which at least gives you i9-13900K-levels of performance for just $90 more than the i5-14600K.
Ultimately, this chip is best reserved for prebuilt systems like the best all-in-one computers at retailers like Best Buy, where you will use the computer for a reasonable amount of time, and then when it becomes obsolete, you'll go out and buy another computer rather than attempt to upgrade the one you've got.
In that case, buying a prebuilt PC with an Intel Core i5-14600K makes sense, and for that purpose, this will be a great processor. But if you're looking to swap out another Intel LGA 1700 chip for this one, there are much better options out there.
Should you buy the Intel Core i5-14600K?
Buy the Intel Core i5-14600K if...
Don't buy it if...
Also Consider
If my Intel Core i5-14600K review has you considering other options, here are two processors to consider...
How I tested the Intel Core i5-14600K
I spent nearly two weeks testing the Intel Core i5-14600K
I ran comparable benchmarks between this chip and rival midrange processors
I gamed with this chip extensively
Test System Specs
These are the specs for the test system used for this review:
I spent about two weeks testing the Intel Core i5-14600K and its competition, primarily for productivity work, gaming, and content creation.
I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops.
I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Intel Core i7-14700K is the workhorse CPU of Intel's 14th-generation launch line-up, and like any good workhorse, it's going to do the heavy lifting for this generation of processors. Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.
Of all the chips launched on October 17, 2023, the Core i7-14700K is the only one to get a substantive spec upgrade over its predecessor as well as a slight cut in price to just $409 (about £325/AU$595), which is $10 less than the Intel Core i7-13700K it replaces.
So what do you get for $10 less? Gen-on-gen, you don't get a whole lot of improvement (about 6% better performance overall compared to the 13700K), but that figure can be deceiving, since the Core i7-13700K was at the top of our best processor list for a reason.
With the 13700K's performance being within striking distance of the Intel Core i9-13900K, that 6% improvement for the 14700K effectively closes the gap, putting the 14700K within just 3% of the 13900K overall, and even allowing it to pull ahead in average gaming performance, losing out to only the AMD Ryzen 7 7800X3D.
Fortunately for Intel, the Core i7-14700K succeeds in keeping Raptor Lake Refresh from being completely forgettable.
Given its excellent mix of performance and price, the Intel Core i7-14700K could very well be the last Intel chip of the LGA 1700 epoch that anyone should consider buying, especially if you're coming from a 12th-gen chip.
With the Core i9-13900K outperforming the Intel Core i9-12900K by as much as 25% in some workloads, someone coming off an i9-12900K or lower will find it hard to believe that an i7 could perform this well, but that's where we're at. And with the i7-14700K coming in about 30% cheaper than the Intel Core i9-14900K, while still managing to come remarkably close in terms of its performance, the Intel Core i7-14700K is the Raptor Lake Refresh chip to buy if you're going to buy one at all.
(Image credit: Future / John Loeffler)
Intel Core i7-14700K: Price & availability
How much does it cost? US MSRP $409 (about £325/AU$595)
When is it out? October 17, 2023
Where can you get it? You can get it in the US, UK, and Australia
The Intel Core i7-14700K is available on October 17, 2023, with a US MSRP of $409 (about £325/AU$595), which is a slight decrease from its predecessor's MSRP of $419 (about £335/AU$610), and about 31% lower than the Intel Core i9-14900K and 32% lower than the AMD Ryzen 9 7950X.
It's also cheaper than the AMD Ryzen 7 7800X3D, and just $10 more expensive than the AMD Ryzen 7 7700X, putting it very competitively priced against processors in its class.
The comparisons against the Core i9 and Ryzen 9 are far more relevant, however, since these are the chips the Core i7-14700K is competing against in terms of performance, and in that regard, the Intel Core i7-14700K is arguably the best value among consumer processors currently on the market.
Price score: 4 / 5
Intel Core i7-14700K: Specs & features
Four additional E-Cores
Slightly faster clock speeds
Increased Cache
Discrete Wi-Fi 7 and Thunderbolt 5 support
The Intel Core i7-14700K is the only processor from Intel's Raptor Lake Refresh launch line-up to get a meaningful spec upgrade.
Rather than the eight performance and eight efficiency cores of the i7-13700K, the i7-14700K comes with eight performance cores and 12 efficiency cores, all running with a slightly higher turbo boost clock for extra performance. The i7-14700K also has something called Turbo Boost Max Technology 3.0, which is a mouthful but gives the best-performing P-core an extra bump up to 5.6GHz as long as the processor is within power and thermal limits.
The increased core count also adds 7MB of additional L2 cache for the efficiency cores to use, further improving their performance over the 13700K's, as well as four additional processing threads for improved multitasking.
It has the same 125W TDP and the same 253W Max Turbo Power rating as the 13700K, the latter being the upper limit on sustained (greater than one second) power draw. This ceiling can be breached, however: the cores can draw much more power in bursts of up to 10ms when necessary.
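Those two figures amount to a two-tier power limit. The sketch below is a deliberately simplified model of that behavior (my own illustration; real chips are governed by firmware PL1/PL2 settings and time windows, not this exact logic):

```python
# Simplified model of the 14700K's power limits, using the figures above.
PL1_W = 125.0  # TDP: power the chip can draw indefinitely
PL2_W = 253.0  # Max Turbo Power: short-term sustained limit

def classify_draw(watts: float, duration_s: float) -> str:
    """Categorize a power sample under a simplified two-tier limit."""
    if watts <= PL1_W:
        return "sustained"  # within TDP, fine indefinitely
    if watts <= PL2_W:
        return "turbo"      # allowed during turbo windows
    # Above PL2 is only tolerated in very short bursts (~10ms or less)
    return "burst" if duration_s <= 0.010 else "over limit"

print(classify_draw(120, 60))     # sustained
print(classify_draw(250, 5))      # turbo
print(classify_draw(300, 0.005))  # burst
```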
There is also support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as well as discrete Thunderbolt 5 wired connections, so there is a decent bit of future proofing in its specs.
Chipset & features score: 4 / 5
(Image credit: Future / John Loeffler)
Intel Core i7-14700K: Performance
Outstanding performance on par with the i9-13900K
Best gaming performance of any Intel processor
More power hungry than predecessor, so also runs hotter
The Intel Core i7-14700K is arguably the best performing midrange processor on the market, coming within striking distance of the Core i9-13900K and Ryzen 9 7950X across most workloads, including very strong multi core performance thanks to the addition of four extra efficiency cores.
Image 1 of 13
(Image credit: Future / Infogram)
Image 2 of 13
(Image credit: Future / Infogram)
Image 3 of 13
(Image credit: Future / Infogram)
Image 4 of 13
(Image credit: Future / Infogram)
Image 5 of 13
(Image credit: Future / Infogram)
Image 6 of 13
(Image credit: Future / Infogram)
Image 7 of 13
(Image credit: Future / Infogram)
Image 8 of 13
(Image credit: Future / Infogram)
Image 9 of 13
(Image credit: Future / Infogram)
Image 10 of 13
(Image credit: Future / Infogram)
Image 11 of 13
(Image credit: Future / Infogram)
Image 12 of 13
(Image credit: Future / Infogram)
Image 13 of 13
(Image credit: Future / Infogram)
The 14700K's strongest synthetic benchmarks are single core workloads, where it sits effectively level with the Core i9-13900K and often beats the Ryzen 9 7950X and 7950X3D handily.
This translates into better dedicated performance rather than multitasking, but even there the Core i7-14700K does an admirable job keeping pace with chips with much higher core counts.
Image 1 of 7
(Image credit: Future / Infogram)
Image 2 of 7
(Image credit: Future / Infogram)
Image 3 of 7
(Image credit: Future / Infogram)
Image 4 of 7
(Image credit: Future / Infogram)
Image 5 of 7
(Image credit: Future / Infogram)
Image 6 of 7
(Image credit: Future / Infogram)
Image 7 of 7
(Image credit: Future / Infogram)
In creative workloads, the 14700K also performs exceptionally well, beating out the 13900K in everything except 3D model rendering, a task rarely handed to a CPU anyway when even the best cheap graphics cards can process Blender or V-Ray 5 workloads many times faster than any CPU can.
Image 1 of 6
(Image credit: Future / Infogram)
Image 2 of 6
(Image credit: Future / Infogram)
Image 3 of 6
(Image credit: Future / Infogram)
Image 4 of 6
(Image credit: Future / Infogram)
Image 5 of 6
(Image credit: Future / Infogram)
Image 6 of 6
(Image credit: Future / Infogram)
In gaming, the Core i7-14700K scores a bit of an upset over its launch sibling, the i9-14900K, besting it overall, though it has to be said that it got some help from a ridiculously high average fps in Total War: Warhammer III's battle benchmark. In most cases, the i7-14700K came up short of the 13900K and 14900K, but not by much.
And while it might be tempting to write off Total War: Warhammer III as an outlier, one of the biggest issues with the post-Alder Lake Core i9s is that they are energy hogs that throttle under load quickly, pretty much by design.
In games like Total War: Warhammer III where there are a lot of tiny moving parts to keep track of, higher clock speeds don't necessarily help. When turbo clocks kick into high gear and cause throttling, the back-and-forth between throttled and not-throttled can be worse over the course of the benchmark than the cooler but consistent Core i7s, which don't have to constantly ramp up and ramp down.
So the 14700K isn't as much of an outlier as it looks, especially since the 13700K also excels at Total War: Warhammer III, and it too beats the two Core i9s. Total War: Warhammer III isn't the only game like this, and so there are going to be many instances where the cooler-headed 14700K steadily gets the work done while the hot-headed i9-13900K and 14900K sprint repeatedly, only to effectively tire themselves out for a bit before kicking back up to high gear.
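That sprint-then-stall pattern is easy to see in a toy model. The numbers below are invented purely for illustration (they aren't measured clocks), but they show how a chip that boosts hard and then throttles can average less work over a long benchmark than a cooler chip holding a steady clock:

```python
def work_done(ticks: int, boost: float, base: float,
              hot_after: int, cool_after: int) -> float:
    """Total work over `ticks` steps for a chip that overheats after
    `hot_after` boosted steps, then throttles for `cool_after` steps."""
    total, hot, counter = 0.0, False, 0
    for _ in range(ticks):
        if hot:
            total += base * 0.7   # throttled well below base clock
            counter += 1
            if counter >= cool_after:
                hot, counter = False, 0
        else:
            total += boost
            counter += 1
            if counter >= hot_after:  # overheats after a short sprint
                hot, counter = True, 0
    return total

sprinter = work_done(1000, boost=5.8, base=5.3, hot_after=3, cool_after=5)
steady = work_done(1000, boost=5.5, base=5.5, hot_after=1000, cool_after=0)
print(sprinter < steady)  # True: the steady chip wins over the full run
```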
Image 1 of 2
(Image credit: Future / Infogram)
Image 2 of 2
(Image credit: Future / Infogram)
The additional efficiency cores might not draw as much power as performance cores would, but the extra draw is still noticeable. The 14700K pulls nearly 30W more than the 13700K, though it is still a far cry from the Core i9-13900K's power usage.
This additional power also means that the Core i7-14700K runs much hotter than its predecessor, maxing out at 100°C and triggering occasional throttling. This is something the i7-13700K didn't experience at all during my testing, so you'll need to make sure your cooling solution is up to the task here.
Performance: 4.5 / 5
(Image credit: Future / John Loeffler)
Intel Core i7-14700K: Verdict
Fantastic single-core performance
Intel's best gaming processor, and second overall behind the Ryzen 7 7800X3D
Best value of any midrange processor
Image 1 of 7
(Image credit: Future / Infogram)
Image 2 of 7
(Image credit: Future / Infogram)
Image 3 of 7
(Image credit: Future / Infogram)
Image 4 of 7
(Image credit: Future / Infogram)
Image 5 of 7
(Image credit: Future / Infogram)
Image 6 of 7
(Image credit: Future / Infogram)
Image 7 of 7
(Image credit: Future / Infogram)
Ultimately, the Intel Core i7-14700K is the best processor in the Raptor Lake Refresh lineup, offering very competitive performance at a better price than its predecessor, and a far better one than comparable chips one tier higher in the stack.
It's not without fault, though. It's not that much better than the i7-13700K, so everything I'm saying about the i7-14700K might reasonably apply to its predecessor as well. And honestly, the i7-14700K doesn't have too high a bar to clear to stand out from its launch siblings, so its performance might only look as good as it does in comparison to the i9 and i5 standing behind it.
But the numbers don't lie, and the Intel Core i7-14700K displays flashes of brilliance that set it apart from its predecessor and vault it into competition with the top tier of CPUs, and that's quite an achievement independent of how the rest of Raptor Lake Refresh fares.
(Image credit: Future / John Loeffler)
Should you buy the Intel Core i7-14700K?
Buy the Intel Core i7-14700K if...
Don't buy it if...
Also Consider
If my Intel Core i7-14700K review has you considering other options, here are two processors to consider...
How I tested the Intel Core i7-14700K
I spent nearly two weeks testing the Intel Core i7-14700K
I ran comparable benchmarks between this chip and rival midrange processors
I gamed with this chip extensively
Test System Specs
These are the specs for the test system used for this review:
I spent about two weeks testing the Intel Core i7-14700K and its competition, primarily for productivity work, gaming, and content creation.
I used a standard battery of synthetic benchmarks that tested out the chip's single core, multi core, creative, and productivity performance, as well as built-in gaming benchmarks to measure its gaming chops.
I then ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch line-up and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Intel Core i9-14900K is a hard chip to justify, which is a weird thing to say about a processor that is arguably the best Intel has ever put out.
With very little fanfare to herald its arrival following the announcement of Intel Meteor Lake at Intel Innovation in September 2023 (and confirmation that Intel Meteor Lake is coming to desktop in 2024), Intel's 14th-generation flagship processor cannot help but draw parallels to the 11th-gen Rocket Lake chips that immediately preceded Intel Alder Lake.
The Core i9-11900K was something of a placeholder in the market until Intel could launch Alder Lake at the end of 2021. Those processors featured a new hybrid architecture and a more advanced 10nm process that helped propel Intel back to the top of our best processor list, despite strong competition from AMD.
With Intel Raptor Lake Refresh, we're back in placeholder territory, unfortunately. The performance gains here are all but non-existent, so we are essentially waiting on Meteor Lake while the i9-14900K absolutely guzzles electricity and runs hot enough to boil water under just about any serious workload with very little extra performance over the Intel Core i9-13900K to justify the upgrade.
The problem for the Core i9-14900K is that you can still get the i9-13900K.
It's not that the Core i9-14900K isn't a great processor; again, it's unquestionably the best Intel processor for the consumer market in terms of performance. It beats every other chip I tested in most categories, with the exception of some multitasking workflows and average gaming performance, in both of which it's a very close runner-up. On top of that, at $589, it's the same price as the current Intel flagship, the Intel Core i9-13900K (assuming the i9-14900K matches the i9-13900K's £699 / AU$929 sale price in the UK and Australia).
The problem for the Core i9-14900K is two-fold: you can still get the i9-13900K, and will be able to for a long while yet at a lower price, and the Intel Core i7-14700K offers performance so close to the 14th-gen flagship at a much lower price that the 14900K looks largely unnecessary by comparison. Essentially, if you've got an i7-13700K or i9-13900K, there's simply nothing for you here.
If you're on an 11th-gen chip or older, or you've got an AMD Ryzen processor and you're looking to switch, know that this chip will be the last to use the LGA 1700 socket, so when Meteor Lake-S comes out in 2024 (or even Lunar Lake-S, due out at the end of 2024 or early 2025), you won't be able to upgrade to that processor with an LGA 1700 motherboard. In other words, buying an LGA 1700 motherboard for this chip is strictly a one-shot deal.
The only people who might find this chip worth upgrading to are those currently using a 12th-gen processor who skipped the 13th gen entirely, or someone on a 13th-gen Core i5 who wants that extra bit of performance and doesn't mind dropping $589 on a chip they might be upgrading from again in a year's time, which isn't going to be a whole lot of people.
Unfortunately, at this price, it'll be better to save your money and wait for Meteor Lake or even Lunar Lake to drop next year and put the $589 you'd spend on this chip towards the new motherboard and CPU cooler you'll need once those chips are launched.
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Price & availability
How much does it cost? US MSRP $589 (about £470/AU$855)
When is it out? October 17, 2023
Where can you get it? You can get it in the US, UK, and Australia
The Intel Core i9-14900K is available as of October 17, 2023, for a US MSRP of $589 (about £470/AU$855), which is the same as the Intel Core i9-13900K it is replacing. We don't have confirmation on UK and Australia pricing yet, though I've asked Intel for clarification and will update this review if and when I hear back from the company. If the 14900K keeps the same UK and Australia pricing as the Core i9-13900K, however, it'll sell for £699/AU$929 in the UK and Australia respectively.
This does make the Core i9-14900K the better value against AMD's Ryzen 9 chips, especially given the level of performance on offer, but it's ultimately too close to the 13900K performance-wise for this price to mean much, as a cheaper 13900K will offer an even better value against AMD's Ryzen 9 lineup.
Price score: 3 / 5
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Specs & features
Faster clock speeds than i9-13900K
Some additional AI-related features
The Intel Core i9-14900K is the final flagship using Intel's current architecture, so it makes sense that there is very little in the way of innovation over the Intel Core i9-13900K.
Using the same 10nm Intel 7 process node as its predecessor, and with the same number of processor cores (8 P-cores/16 E-cores), threads (32), and cache (32MB total L2 cache plus an additional 36MB of L3 cache), the only real improvement the 14900K brings in terms of specs is its faster clock speeds.
All cores get a 0.2GHz increase to their base frequencies, while the P-core turbo boost clock increases to 5.6GHz and the E-core turbo clock bumps up to 4.4GHz from the 13900K's 5.4GHz P-Core turbo clock and 4.3GHz E-core turbo clock.
While those clock speeds are the official max turbo clocks for the two types of cores, the Core i9-14900K and Intel Core i7-14700K have something called Turbo Boost Max Technology 3.0, which increases the frequency of the best-performing core in the chip and gives it even more power within the power and thermal limits. That gets the Core i9-14900K up to 5.8GHz turbo clock on specific P-cores while active.
Additionally, an exclusive feature of the Core i9 is an additional Ludicrous-Speed-style boost called Intel Thermal Velocity Boost. This activates if there is still power and thermal headroom on a P-core that is already being boosted by the Turbo Boost Max Technology 3.0, and this can push the core as high as 6.0GHz, though these aren't typical operating conditions.
Both of these technologies are present in the 13900K as well, but the 14900K bumps up the maximum clock speeds of these modes slightly, and according to Intel, that 6.0GHz clock speed makes this the world's fastest processor. While that might technically be true, that 6.0GHz is so narrowly achieved that, in practical terms, the P-core boost clock is what you're going to see almost exclusively under load.
The Core i9-14900K has the same 125W TDP as the 13900K and the same 253W maximum turbo power as well, though power draw in bursts of less than 10ms can go far higher.
If this reads like a Redditor posting about their successful overclocking setup, then you pretty much get what this chip is about. If you're looking for something innovative about this chip, I'll say it again, you're going to have to wait for Meteor Lake.
The Core i9-14900K also has support for discrete Wi-Fi 7 and Bluetooth 5.4 connectivity, as does the rest of the 14th-gen lineup, as well as support for discrete Thunderbolt 5, both of which are still a long way down the road.
The only other thing to note is that there have been some AI-related inclusions that are going to be very specific to AI workloads that almost no one outside of industry and academia is going to be running. If you're hoping for AI-driven innovations for everyday consumers, let's say it once more, with feeling: You're going to have to wait for—
Chipset & features score: 3.5 / 5
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Performance
Best-in-class performance, but only by a hair
Gets beat by AMD Ryzen 7 7800X3D and i7-14700K in gaming performance
Runs even hotter than the i9-13900K
Take any elite athlete who's used to setting records in their sport: sometimes they break their previous record by a lot, and sometimes it's by milliseconds or fractions of an inch. It's less sexy, but it still counts, and that's really what we get here with the Intel Core i9-14900K.
On pretty much every test I ran on it, the Core i9-14900K edged out its predecessor by single digits, percentage-wise, which is a small enough difference that a background application can fart and cause just enough of a dip in performance that the 14900K ends up losing to the 13900K.
I ran these tests more times than I can count because I had to be sure that something wasn't secretly messing up my results, and they are what they are. The Core i9-14900K does indeed come out on top, but it really is a game of inches at this point.
Image 1 of 13
(Image credit: Future / Infogram)
Image 2 of 13
(Image credit: Future / Infogram)
Image 3 of 13
(Image credit: Future / Infogram)
Image 4 of 13
(Image credit: Future / Infogram)
Image 5 of 13
(Image credit: Future / Infogram)
Image 6 of 13
(Image credit: Future / Infogram)
Image 7 of 13
(Image credit: Future / Infogram)
Image 8 of 13
(Image credit: Future / Infogram)
Image 9 of 13
(Image credit: Future / Infogram)
Image 10 of 13
(Image credit: Future / Infogram)
Image 11 of 13
(Image credit: Future / Infogram)
Image 12 of 13
(Image credit: Future / Infogram)
Image 13 of 13
(Image credit: Future / Infogram)
Across all synthetic performance and productivity benchmarks, the Core i9-14900K comes out on top, with the notable exception of Geekbench 6.1's multi-core performance test, where the AMD Ryzen 9 7950X scores substantially higher, and the Passmark Performance Test's overall CPU score, which puts the AMD Ryzen 9 7950X and Ryzen 9 7950X3D significantly higher. Given that all 16 cores of the 7950X and 7950X3D are full-throttle performance cores, this result isn't surprising.
Other than that though, it's the 14900K all the way, with a 5.6% higher geometric average on single-core performance than the 13900K. For multi-core performance, the 14900K scores a 3.1% better geometric average, and in productivity workloads, it scores a 5.3% better geometric average than its predecessor.
Against the AMD Ryzen 9 7950X, the Core i9-14900K scores about 13% higher in single-core performance, about 1% lower in multi-core performance, and 5% better in productivity performance.
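For context, those overall figures are geometric means rather than simple arithmetic averages, which keeps any single outsized benchmark result from skewing the total. A minimal sketch of the calculation (the ratios below are made-up placeholders for illustration, not the review's actual data):

```python
import math

def geometric_mean(values):
    # nth root of the product, computed via logs for numerical stability
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical per-benchmark score ratios (14900K score / 13900K score)
ratios = [1.03, 1.05, 1.08, 1.07]

# Percentage advantage implied by the geometric mean of the ratios
pct_advantage = (geometric_mean(ratios) - 1) * 100
```

Because the geometric mean multiplies ratios instead of adding raw scores, a chip that doubles one benchmark but ties everywhere else doesn't look twice as fast overall.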
Image 1 of 7
(Image credit: Future / Infogram)
Image 2 of 7
(Image credit: Future / Infogram)
Image 3 of 7
(Image credit: Future / Infogram)
Image 4 of 7
(Image credit: Future / Infogram)
Image 5 of 7
(Image credit: Future / Infogram)
Image 6 of 7
(Image credit: Future / Infogram)
Image 7 of 7
(Image credit: Future / Infogram)
Creative benchmarks reveal something of a mixed bag for the Core i9-14900K. In all cases, it beats its predecessor, by anywhere from 2.6% to as much as 10.9%. Against the AMD Ryzen 9 7950X and 7950X3D, the Core i9-14900K consistently loses out in rendering workloads like Blender and V-Ray 5, but beats the two best AMD processors by just as much in photo and video editing. And since 3D rendering almost always leans heavily on the GPU rather than the CPU, AMD's advantage here is somewhat muted in practice.
Image 1 of 6
(Image credit: Future / Infogram)
Image 2 of 6
(Image credit: Future / Infogram)
Image 3 of 6
(Image credit: Future / Infogram)
Image 4 of 6
(Image credit: Future / Infogram)
Image 5 of 6
(Image credit: Future / Infogram)
Image 6 of 6
(Image credit: Future / Infogram)
Gaming is another area where Intel had traditionally done well thanks to its strong single-core performance over AMD, but all that flipped with the introduction of AMD's 3D V-Cache.
While the Intel Core i9-14900K barely moves the needle from its predecessor's performance, it hardly matters, since the AMD Ryzen 7 7800X3D ultimately scores an overall victory, and it's not very close. The Core i9-14900K actually ties for fourth place with the Intel Core i7-13700K, with the Core i7-14700K edging it out by about 4 fps on average.
Image 1 of 2
(Image credit: Future / Infogram)
Image 2 of 2
(Image credit: Future / Infogram)
Of course, all this performance requires power, and lots of it. The Core i9-14900K pretty much matched the maximum recorded power draw of the Core i9-13900K, with less than a watt's difference between the two: 351.097W versus 351.933W, respectively.
The Core i9-14900K still managed to find a way to run hotter than its predecessor, however; something I didn't really think was possible. But there it is, the 14900K maxing out at 105°C, three degrees hotter than the 13900K's max. It's the hottest I've ever seen a CPU run, and I'm genuinely shocked it was allowed to run so far past its official thermal limit without any overclocking on my part.
Performance: 3.5 / 5
(Image credit: Future / John Loeffler)
Intel Core i9-14900K: Verdict
The best chip for dedicated performance like video editing and productivity
There are better gaming processors out there for cheaper
The Intel Core i7-14700K offers a far better value
Image 1 of 7
(Image credit: Future / Infogram)
Image 2 of 7
(Image credit: Future / Infogram)
Image 3 of 7
(Image credit: Future / Infogram)
Image 4 of 7
(Image credit: Future / Infogram)
Image 5 of 7
(Image credit: Future / Infogram)
Image 6 of 7
(Image credit: Future / Infogram)
Image 7 of 7
(Image credit: Future / Infogram)
In the final assessment then, the Core i9-14900K does manage to win the day, topping the leaderboard by enough of a margin to be a clear winner, but close enough that it isn't the cleanest of wins.
Overall, single-core and productivity performance are its best categories; it falters slightly in creative workloads and comes up short enough in gaming that it's not the chip I would recommend as a gaming CPU.
Like all Core i9s before it, the 14900K is the worst value of Intel's 14th-gen launch lineup, but it's better than its predecessor for the time being (though that advantage won't last very long at all), and it does manage to be a better value proposition than the Ryzen 9 7950X and Ryzen 9 7950X3D, while matching the Ryzen 7 7800X3D, so all in all, not too bad for an enthusiast chip.
Still, the Intel Core i7-14700K is right there, and its superior balance of price and performance makes the Intel Core i9-14900K a harder chip to recommend than it should be.
Should you buy the Intel Core i9-14900K?
Buy the Intel Core i9-14900K if...
Don't buy it if...
Also Consider
If my Intel Core i9-14900K review has you considering other options, here are two processors to consider...
How I tested the Intel Core i9-14900K
I spent nearly two weeks testing the Intel Core i9-14900K
I ran comparable benchmarks between this chip and rival flagship processors
I gamed with this chip extensively
Test System Specs
These are the specs for the test system used for this review:
I spent about two weeks testing the Intel Core i9-14900K and its competition, using it mostly for productivity and content creation, with some gaming thrown in as well.
I used the standard battery of synthetic benchmarks I use for processor testing, and ran the same tests on rival chips from AMD as well as the other 14th-gen chips in the Raptor Lake Refresh launch lineup and 13th-generation Raptor Lake processors. For Intel chips, I used the same motherboard, RAM, SSD, and graphics card to ensure I was isolating just the CPU's performance across every chip. For AMD chips, I used a comparable AM5 motherboard so differences in the motherboard configuration and circuitry are mitigated to the largest extent possible.
I've been testing and reviewing computer hardware for years now, and with an extensive background in computer science, I know processors in and out, and I use that knowledge to ensure every chip is thoroughly tested.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
The Intel Arc A770 has had quite a journey since its release back on October 12, 2022, and fortunately, it has been a positive one for Intel despite a somewhat rocky start.
Right out the gate, I'll say that if you are looking for one of the best cheap graphics cards for 1440p gaming, this card definitely needs to be on your list. It offers great 1440p performance for most modern PC titles that most of us are going to be playing and it's priced very competitively against its rivals.
Where the card falters, much like with my Intel Arc A750 review earlier this year, is with older DirectX 9 and DirectX 10 titles, and this really does hurt its overall score in the end. Which is a shame, since for games released in the last five or six years, this card is going to surprise a lot of people who might have written it off even six months ago.
Intel's discrete graphics unit has been working overtime on its driver for this card, providing regular updates that continue to improve performance across the board, though some games benefit more than others.
Naturally, a lot of emphasis is going to be put on more recently released titles. And even though Intel has also been paying attention to shoring up support for older games as well, if you're someone with an extensive back catalog of DX9 and DX10 titles from the mid-2000s that you regularly return to, then this is not the best graphics card for your needs. Nvidia and AMD drivers carry a long legacy of support for older titles that Intel will honestly never be able to match.
But if what you're looking for is the best 1440p graphics card to play the best PC games of the modern era but you're not about to plop down half a grand on a new GPU, then the Intel Arc A770 is going to be a very solid pick with a lot more to offer than many will probably realize.
(Image credit: Future / John Loeffler)
Intel Arc A770: Price & availability
How much is it? US MSRP for 16GB card: $349 (about £280/AU$510); for 8GB card: $329 (about £265/AU$475)
When was it released? It went on sale on October 12, 2022
Where can you buy it? Available in the US, UK, and Australia
The Intel Arc A770 is available now in the US, UK, and Australia, with two variants: one with 16GB GDDR6 VRAM and an official US MSRP of $349 (about £280/AU$510), and one with 8GB GDDR6 VRAM and an official MSRP of $329 (about £265/AU$475).
Those are the launch MSRPs from October 2022, of course, and the cards have come down considerably in price in the year since their release; you can get either card for about 20% to 25% less than that. This is important, since the Nvidia GeForce RTX 4060 and AMD Radeon RX 7600 are very close to the 16GB Arc A770 in terms of current prices, and they offer distinct advantages that will tempt potential buyers away from Intel's card.
But those decisions aren't as cut and dried as you might think, and Intel's Arc A770 holds up very well against modern midrange offerings, despite really being a last-gen card. And currently, the 16GB variant is the only 1440p card you're going to find at this price, even among Nvidia and AMD's last-gen offerings like the RTX 3060 Ti and AMD Radeon RX 6750 XT. So for 1440p gamers on a very tight budget, this card fills a vital niche, and it's really the only card that does so.
Price score: 4/5
(Image credit: Future / John Loeffler)
Intel Arc A770: Design
Intel's Limited Edition reference card is gorgeous
Will fit most gaming PC cases easily
Intel Arc A770 Limited Edition Design Specs
Slot size: Dual slot
Length: 11.02 inches | 280mm
Height: 4.53 inches | 115mm
Cooling: Dual fan
Power connection: 1 x 8-pin and 1 x 6-pin
Video outputs: 3 x DisplayPort 2.0, 1 x HDMI 2.1
The Intel Arc A770 Limited Edition that I'm reviewing is Intel's reference model that is no longer being manufactured, but you can still find some stock online (though at what price is a whole other question).
Third-party partners include ASRock, Sparkle, and Gunnir. Interestingly, Acer also makes its own version of the A770 (the Acer Predator BiFrost Arc A770), the first time the company has dipped its toe into the discrete graphics card market.
All of these cards will obviously differ in terms of their shrouds, cooling solutions, and overall size, but as far as Intel's Limited Edition card goes, it's one of my favorite graphics cards ever in terms of aesthetics. If it were still easily available, I'd give this design five out of five, hands down, but most purchasers will have to opt for third-party cards which aren't nearly as good-looking, as far as I'm concerned, so I have to dock a point for that.
It's hard to convey from just the photos of the card, but the black finish on the plastic shroud of the card has a lovely textured feel to it. It's not quite velvety, but you know it's different the second you touch it, and it's something that really stands out from every other card I've reviewed.
(Image credit: Future / John Loeffler)
The silver trim on the card and the more subtle RGB lighting against a matte black shroud and fans bring a bit of class compared to the RGB graphics cards I typically see. The twin fans aren't especially loud (no more so than other dual-fan cards, at least), and the card feels thinner than most other similar cards I've reviewed and used, whether or not it actually is.
The power connector is an 8-pin and 6-pin combo, so you'll have a pair of cables dangling from the card which may or may not affect the aesthetic of your case, but at least you won't need to worry about a 12VHPWR or 12-pin adapter like you do with Nvidia's RTX 4000-series and 3000-series cards.
You're also getting three DisplayPort 2.0 outputs and an HDMI 2.1 output, which puts it in the same camp as Nvidia's recent GPUs, though it can't match AMD's recent move to DisplayPort 2.1, which enables faster 8K video output. As it stands, the Intel Arc A770 is limited to 8K@60Hz, just like Nvidia. Will you be doing much 8K gaming on a 16GB card? Absolutely not, but as more 8K monitors arrive next year, it'd be nice to have an 8K desktop running at 165Hz. That's a very speculative prospect at this point, though, so it's probably not anything anyone looking at the Arc A770 needs to be concerned about.
Design Score: 4 / 5
(Image credit: Future / John Loeffler)
Intel Arc A770: Specs & features
Good hardware AI cores for better XeSS upscaling
Fast memory for better 1440p performance
Intel's Xe HPG architecture inside the Arc A770 introduces a whole other way to arrange the various co-processors that make up a GPU, adding a third, not very easily comparable set of specs to the already head-scratching differences between Nvidia and AMD architectures.
Intel breaks up its architecture into "render slices", which contain 4 Xe Cores, which each contain 128 shaders, a ray tracing processor, and 16 matrix processors (which are directly comparable to Nvidia's vaunted tensor cores at least), which handle graphics upsampling and machine learning workflows. Both 8GB and 16GB versions of the A770 contain eight render slices for a total of 4096 shaders, 32 ray processors, and 512 matrix processors.
The ACM-G10 GPU in the A770 runs at 2,100MHz base frequency with a 2,400MHz boost frequency, with a slightly faster memory clock speed (2,184MHz) for the 16GB variant than the 8GB variant's 2,000MHz. This leads to an effective memory speed of 16 Gbps for the 8GB card and 17.5 Gbps for the 16GB.
With a 256-bit memory bus, this gives the Arc A770 a much wider lane for high-resolution textures to be processed through, reducing bottlenecks and enabling faster performance when gaming at 1440p and higher resolutions thanks to a 512 GB/s and 559.9 GB/s memory bandwidth for the 8GB and 16GB cards, respectively.
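Those bandwidth figures follow directly from the bus width and effective memory speed. A rough back-of-the-envelope check (assuming the standard GDDR6 behavior of 8 bits transferred per pin per memory clock cycle; the helper names are my own, purely for illustration):

```python
def gddr6_effective_gbps(memory_clock_mhz):
    # GDDR6 moves 8 bits per pin per memory clock cycle,
    # so effective per-pin rate in Gbps = clock in MHz * 8 / 1000
    return memory_clock_mhz * 8 / 1000

def bandwidth_gb_s(bus_width_bits, effective_gbps):
    # Total bandwidth = (bus width in bytes) * per-pin data rate
    return bus_width_bits / 8 * effective_gbps

# 16GB Arc A770: 2,184MHz memory clock on a 256-bit bus
rate_16gb = gddr6_effective_gbps(2184)    # ~17.5 Gbps
bw_16gb = bandwidth_gb_s(256, rate_16gb)  # ~559 GB/s

# 8GB Arc A770: 2,000MHz memory clock on the same 256-bit bus
bw_8gb = bandwidth_gb_s(256, gddr6_effective_gbps(2000))  # 512 GB/s
```

The arithmetic lands within a fraction of a GB/s of the quoted figures, which is exactly why the 16GB card's slightly faster memory clock translates into its bandwidth edge.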
All of this does require a good bit of power, though, and the Arc A770 has a TDP of 225W, which is higher than most 1440p cards on the market today.
(Image credit: Future / John Loeffler)
As for the features all this hardware enables, there's a lot to like here. The matrix cores are leveraged to great effect by Intel's XeSS graphics upscaling tech, found in a growing number of games, and this hardware advantage generally lets it outperform AMD's FSR 2.0, which is strictly a software-based upscaler.
XeSS does not have frame generation though, and the matrix processors in the Arc A770 are not nearly as mature as Nvidia's 3rd and 4th generation tensor cores found in the RTX 3000-series and RTX 4000-series, respectively.
The Arc A770 also has AV1 hardware-accelerated encoding support, meaning that streaming videos will look far better than those with only software encoding at the same bitrate, making this a compelling alternative for video creators who don't have the money to invest in one of Nvidia's 4000-series GPUs.
Specs & features: 3.5 / 5
(Image credit: Future / John Loeffler)
Intel Arc A770: Performance
Great 1440p performance
Intel XeSS even allows for some 4K gaming
DirectX 9 and DirectX 10 support lacking, so older games will run poorly
Resizable BAR is pretty much a must
At the time of this writing, Intel's Arc A770 has been on the market for about a year, and I have to admit, had I gotten the chance to review this card at launch, I would probably have been as unkind as many other reviewers were.
As it stands though, the Intel Arc A770 fixes many of the issues I found when I reviewed the A750, but some issues still hold this card back somewhat. For starters, if you don't enable Resizable BAR in your BIOS settings, don't expect this card to perform well at all. It's an easy enough fix, but one that is likely to be overlooked, so it's important to know that going in.
Image 1 of 15
(Image credit: Future / Infogram)
Image 2 of 15
(Image credit: Future / Infogram)
Image 3 of 15
(Image credit: Future / Infogram)
Image 4 of 15
(Image credit: Future / Infogram)
Image 5 of 15
(Image credit: Future / Infogram)
Image 6 of 15
(Image credit: Future / Infogram)
Image 7 of 15
(Image credit: Future / Infogram)
Image 8 of 15
(Image credit: Future / Infogram)
Image 9 of 15
(Image credit: Future / Infogram)
Image 10 of 15
(Image credit: Future / Infogram)
Image 11 of 15
(Image credit: Future / Infogram)
Image 12 of 15
(Image credit: Future / Infogram)
Image 13 of 15
(Image credit: Future / Infogram)
Image 14 of 15
(Image credit: Future / Infogram)
Image 15 of 15
(Image credit: Future / Infogram)
In synthetic benchmarks, the A770 performed fairly well against the current crop of graphics cards, despite its effectively being a last-gen card. It is particularly strong competition against the Nvidia RTX 4060 Ti across multiple workloads, and it even beats the 4060 Ti in a couple of tests.
Its Achilles' heel, though, is revealed in the PassMark 3D Graphics test. Whereas 3DMark tests DirectX 11 and DirectX 12 workloads, PassMark's test also runs DirectX 9 and DirectX 10 workloads, and here the Intel Arc A770 simply can't keep up with AMD and Nvidia.
Image 1 of 24
(Image credit: Future / Infogram)
Image 2 of 24
(Image credit: Future / Infogram)
Image 3 of 24
(Image credit: Future / Infogram)
Image 4 of 24
(Image credit: Future / Infogram)
Image 5 of 24
(Image credit: Future / Infogram)
Image 6 of 24
(Image credit: Future / Infogram)
Image 7 of 24
(Image credit: Future / Infogram)
Image 8 of 24
(Image credit: Future / Infogram)
Image 9 of 24
(Image credit: Future / Infogram)
Image 10 of 24
(Image credit: Future / Infogram)
Image 11 of 24
(Image credit: Future / Infogram)
Image 12 of 24
(Image credit: Future / Infogram)
Image 13 of 24
(Image credit: Future / Infogram)
Image 14 of 24
(Image credit: Future / Infogram)
Image 15 of 24
(Image credit: Future / Infogram)
Image 16 of 24
(Image credit: Future / Infogram)
Image 17 of 24
(Image credit: Future / Infogram)
Image 18 of 24
(Image credit: Future / Infogram)
Image 19 of 24
(Image credit: Future / Infogram)
Image 20 of 24
(Image credit: Future / Infogram)
Image 21 of 24
(Image credit: Future / Infogram)
Image 22 of 24
(Image credit: Future / Infogram)
Image 23 of 24
(Image credit: Future / Infogram)
Image 24 of 24
(Image credit: Future / Infogram)
In non-ray-traced and native-resolution gaming benchmarks, the Intel Arc A770 managed to put up some decent numbers against the competition. At 1080p, the Arc A770 manages an average of 103 fps with an average minimum fps of 54. At 1440p, it averages 78 fps, with an average minimum of 47, and even at 4K, the A770 manages an average of 46 fps, with an average minimum of 27 fps.
Image 1 of 12
(Image credit: Future / Infogram)
Turn on ray tracing, however, and these numbers understandably tank, as they do for just about every card below the RTX 4070 Ti and RX 7900 XT. Still, even here, the A770 manages an average of 41 fps (with an average minimum of 32 fps) at 1080p with ray tracing enabled, which is technically still playable performance. Once you move up to 1440p and 4K, however, your average title isn't going to be playable at native resolution with ray tracing enabled.
Image 1 of 9
(Image credit: Future / Infogram)
Enter Intel XeSS. When set to "Balanced", XeSS turns out to be a game changer for the A770, getting it an average framerate of 66 fps (with an average minimum of 46 fps) at 1080p, an average of 51 fps (with an average minimum of 38 fps) at 1440p, and an average 33 fps (average minimum 26 fps) at 4K with ray tracing maxed out.
While the 26 fps average minimum at 4K means it's really not playable at that resolution even with XeSS turned on, settings tweaks or more modest ray tracing could probably bring that up into the 30s, making some 4K games playable on this card with ray tracing turned on.
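To put the XeSS gain in perspective, here's a quick back-of-the-envelope calculation of the 1080p ray-traced uplift using the averages measured in this review (41 fps native, 66 fps with XeSS on Balanced). This is just a sketch of the arithmetic, not part of the benchmark suite itself:

```python
# Percentage uplift from XeSS "Balanced" at 1080p with ray tracing,
# using the averages measured in this review.
def uplift_pct(before_fps: float, after_fps: float) -> float:
    """Relative fps improvement, expressed as a percentage."""
    return (after_fps / before_fps - 1) * 100

print(f"{uplift_pct(41, 66):.0f}% average fps uplift")
# prints "61% average fps uplift"
```

A roughly 60% uplift for a "Balanced" preset is why upscaling is the difference between unplayable and playable ray tracing on this tier of card.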
That's something the RTX 4060 Ti can't manage, thanks to its smaller frame buffer (8GB of VRAM), and while the 16GB RTX 4060 Ti could theoretically perform better (I haven't tested the 16GB model, so I can't say for certain), it still has half the memory bus width of the A770, leaving much less bandwidth for larger texture files to pass through.
This creates an inescapable bottleneck that the RTX 4060 Ti's much larger L2 cache can't adequately compensate for, which takes it out of the running as a 4K card. In my testing, very few games managed to maintain playable frame rates on it at 4K even without ray tracing, unless I dropped the settings so low that it wasn't worth the effort. The A770 16GB, meanwhile, isn't technically a 4K card either, but it can still dabble at that resolution with the right settings tweaks and look reasonably good.
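The bus-width argument can be sanity-checked with a simple peak-bandwidth calculation. The bus widths and memory data rates below come from the cards' public spec sheets (256-bit and 17.5 Gbps GDDR6 for the A770, 128-bit and 18 Gbps for the RTX 4060 Ti), not from measurements made in this review, so treat this as an illustrative sketch:

```python
# Back-of-the-envelope theoretical peak memory bandwidth:
# bus width (bits) x per-pin data rate (Gbps) / 8 bits-per-byte = GB/s.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

# Figures assumed from public spec sheets, not measured here:
a770 = peak_bandwidth_gbs(256, 17.5)       # Arc A770 16GB
rtx4060ti = peak_bandwidth_gbs(128, 18.0)  # RTX 4060 Ti (8GB and 16GB)

print(f"A770: {a770:.0f} GB/s, RTX 4060 Ti: {rtx4060ti:.0f} GB/s")
# prints "A770: 560 GB/s, RTX 4060 Ti: 288 GB/s"
```

With roughly double the raw bandwidth on paper, it's easier to see why the A770's large frame buffer stays fed at 4K while the 4060 Ti leans on its L2 cache.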
Image 1 of 9
(Image credit: Future / John Loeffler)
All told, then, the Intel Arc A770 turns out to be a surprisingly good graphics card for modern gaming titles that can sometimes even hold its own against the Nvidia RTX 4060 Ti. It can't hold a candle to the RX 7700 XT or RTX 4070, but it was never meant to, and given that those cards cost substantially more than the Arc A770, this is entirely expected.
Its maximum observed power draw of 191.9W is pretty high for the kind of card the A770 is, but it's not the most egregious offender in that regard. All that power also meant keeping it cool was a struggle, with its maximum observed temperature hitting about 74°C.
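One rough way to frame that power draw is fps-per-watt, using this review's own numbers. Note the caveat baked into the code: the maximum power draw and the 1080p average fps weren't measured simultaneously, so this is a crude lower-bound sketch rather than a proper efficiency benchmark:

```python
# Crude efficiency estimate from this review's figures. Max power draw and
# average fps were not recorded at the same moment, so treat this as a
# rough lower bound on fps-per-watt, not a rigorous efficiency metric.
avg_fps_1080p = 103    # rasterized 1080p average from this review
max_power_w = 191.909  # maximum observed power draw

print(f"~{avg_fps_1080p / max_power_w:.2f} fps per watt at 1080p")
# prints "~0.54 fps per watt at 1080p"
```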
Among all the cards tested, the Intel Arc A770 landed near the bottom of the list alongside the RX 6700 XT, so the picture for this card might have been very different had it launched three years ago, when it would have competed exclusively with the RTX 3000-series and RX 6000-series. In the end, this card performs like a last-gen card, because it is one.
Despite that, it still manages to be a fantastic value on the market right now given its low MSRP and fairly solid performance, rivaling the RTX 4060 Ti on the numbers. In reality, though, with this card selling for significantly less than its MSRP, it is inarguably the best value among midrange cards right now, and it's not even close.
Performance score: 3.5 / 5
(Image credit: Future / John Loeffler)
Should you buy the Intel Arc A770?
Buy the Intel Arc A770 if...
Don't buy it if...
Also Consider
If my Intel Arc A770 review has you considering other options, here are two more graphics cards for you to consider.
How I tested the Intel Arc A770
I spent several days benchmarking the card, with an additional week using it as my primary GPU
I ran our standard battery of synthetic and gaming benchmarks
Test Bench
These are the specs for the test system used for this review:
CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench
I spent about two weeks with the Intel Arc A770 in total, with a little over half that time using it as my main GPU on my personal PC. I used it for gaming, content creation, and other general-purpose use with varying demands on the card.
I focused mostly on synthetic and gaming benchmarks since this card is overwhelmingly a gaming graphics card. Though it does have some video content creation potential, it's not enough to dethrone Nvidia's 4000-series GPUs, so it isn't a viable rival in that sense and wasn't tested as such.
I've been reviewing computer hardware for years now, with an extensive computer science background as well, so I know how graphics cards like this should perform at this tier.
We pride ourselves on our independence and our rigorous review-testing process, offering long-term attention to the products we review and making sure our reviews are updated and maintained, regardless of when a device was released. If you can still buy it, it's on our radar.
Lenovo has been on a roll in 2023 with plenty of affordable gaming laptop options, and the Lenovo Legion 5 Slim 14 is the latest of the bunch. With a sleek, stainless steel-looking finish and the logo machine-carved into the lid, it has a very simple yet distinctive aesthetic that stands out from the traditional black-colored crowd.
It’s on the slimmer side compared to some other laptops, but next to the Razer Blade 14 or the Origin EON 16SL, it’s harder to place this machine among the best thin and light gaming laptops, though it could easily net a spot among the best cheap gaming laptops. That said, thanks to its light weight and 14-inch display, it really is a portable machine that can easily fit into most bags without weighing them down.
As with most other Lenovo gaming laptops, the majority of the port selection is located in the back, which can be inconvenient for some as it requires a bit of reach. Thankfully the back ports are labeled with icons to make locating them easier.
The major benefit to using three sides for ports is a robust port selection that includes two USB Type-A ports, two USB Type-C ports, an HDMI port, an SD card reader, an audio combo jack, an e-shutter for the webcam, and a charging port. However, it’s disappointing to see an ethernet port missing from the bunch, which is bizarre considering that there was plenty of space to put it in the back.
Opening the laptop up, we have the standard Lenovo laptop keyboard and touchpad, which is certainly not a bad thing. The keys are well-sized and well-spaced with a satisfying snap while the touchpad is responsive and just as snappy as the keyboard. There’s a soft white backlight for late-night typing, a more subtle option compared to the glare of RGB.
The specs are solid with an AMD Ryzen 7 7840HS CPU, up to an Nvidia GeForce RTX 4060 GPU, 16GB RAM, 1TB of storage, and a lovely 14-inch WQXGA+ 120Hz IPS (2880 x 1800p) display. There’s a nice balance between what gives solid performance but which also keeps the pricing more budget-minded.
As a result, we have some quite competitive benchmark scores that nearly match what the best gaming laptops with a much higher pricetag put out, which has been something that Lenovo’s also mastered this generation. And general performance with the best PC games nets some truly impressive results.
Sound quality is also pretty solid, especially since the speaker is located above the keyboard. Audio is clear whether you’re streaming movies, listening to music, or gaming, and at high volumes the sound doesn’t lose too much fidelity. The webcam is 1080p but of average-at-best quality, requiring great lighting for a clearer image. It comes with a physical e-shutter, which is excellent for privacy and should be standard on any laptop. All in all, this is a very solid win for Lenovo, and for any gamer on a budget, for that matter.
Lenovo Legion 5 Slim 14: Price & availability
(Image credit: Future)
How much does it cost? Starting at $1,439.99 / £1,399.99 (including VAT) / AU$2,949
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
Pricing is quite good for the Lenovo Legion 5 Slim 14, starting out at $1,439.99 / £1,399.99 (including VAT) / AU$2,949 though, at the time of this writing, there’s a discount making it about $200 cheaper. My review unit is a bit pricier at $1,634.99 / £1,630 (including VAT) / AU$2,998 thanks to the RTX 4060 replacing the RTX 4050 in the base configuration. The most expensive configuration will run you $1,884.99 / £1,780 (including VAT) / AU$3,397, which is still lower than a lot of competing gaming laptops, thanks to keeping the RTX 4060.
The UK version has similar pricing to the US, though there’s an interesting difference in that you can opt for no operating system, which saves you £90 off the cheapest configuration. The Australian version, like the US one, doesn’t offer that option.
The Legion 5 Slim 14 compares best in price with the Origin EON 16SL starting at $1,949 / £1,763.64 (around AU$3,050) and one of Lenovo’s other offerings, the Lenovo Legion 5i (2022) starting at $1,099.99 / £1,293.49 / AU$2,349. The latter is a truly budget option while the former offers similar specs and pricing, really boiling down to which aesthetics you prefer.
Price score: 5 / 5
Lenovo Legion 5 Slim 14: Specs
(Image credit: Future)
The review unit I received comes with the following configuration: an AMD Ryzen 7 7840HS CPU, an Nvidia GeForce RTX 4060 GPU, 16GB RAM, 1TB of storage, and a 14-inch WQXGA+ 120Hz IPS (2880 x 1800) display.
The Lenovo Legion 5 Slim 14 doesn’t come in separate models; instead, buyers can configure the CPU, GPU, RAM, and storage to a degree. The US and Australian versions let you choose between 16GB and 32GB of RAM, while the UK model only offers 16GB. Another oddity with the Australian version is that if you choose the Ryzen 9 7940HS CPU, you can only pick the 16GB RAM option.
Specs score: 4.5 / 5
(Image credit: Future)
Lenovo Legion 5 Slim 14: Design
Solid port selection
Excellent display
Great keyboard and touchpad
Lenovo tends to use a stainless steel-type look for most of its gaming machines, with the manufacturer's logo machine-carved into the side. It gives the laptops a very distinctive and appealing aesthetic, which really works to stand out against the sea of boring black laptops that gamers so often get saddled with.
Though it has Slim in the name, it doesn’t look very thin compared to other laptops in that particular market, though the weight and 14-inch display size make up for it, as it’s quite manageable to carry around.
Port selection is solid, including two USB Type-A ports, two USB Type-C ports, an HDMI port, an SD card reader, an audio combo jack, an e-shutter for the webcam, and a charging port.
That’s pretty much every option except for an ethernet port, which makes little sense: this is a gaming laptop, where a stable internet connection is paramount for competitive play, and there’s plenty of space on the back to fit one.
Image 1 of 8
(Image credit: Future)
Its display is the crowning 14-inch jewel, with WQXGA+ 2.8K (2880 x 1800) resolution, a 120Hz refresh rate, up to 500 nits of brightness, 100% DCI-P3 color gamut coverage, and HDR support. The result is a screen that showcases any game with a lovely depth of color and brightness. The color gamut also means that creatives can use this laptop effectively.
The keyboard is the same reliable Lenovo one, meaning wide keys that have a nice snappiness and a white backlight that’s much easier on the eyes while still useful for late-night typing. The trackpad is also the same quality type, with an equally snappy feel and high responsiveness.
Unfortunately, the webcam is also more of the same, needing office-level lighting to make your image look good. It’s fine for conference calls, but grab one of the best webcams if you need to stream.
The sound quality is also very good thanks to the speaker located above the keyboard, letting you hear the various layers of instrumentals, vocals, and sound design. Ideal for gaming, for sure.
Design score: 4.5 / 5
(Image credit: Future)
Lenovo Legion 5 Slim 14: Performance
Solid all-around performance
Doesn't play nice with ray tracing
Lenovo Legion 5 Slim 14: Benchmarks
Here's how the Lenovo Legion 5 Slim 14 performed in our suite of benchmark tests:
3DMark Night Raid: 49,967; Fire Strike: 24,906; Time Spy: 10,540; Port Royal: 5,951
GeekBench 5: 1,951 (single-core); 11,595 (multi-core)
Cinebench: 16,671 (multi-core)
Total War: Warhammer III (1080p, Ultra): 87 fps; (1080p, Low): 207 fps
Cyberpunk 2077 (1080p, Ultra): 94 fps; (1080p, Low): 122 fps
Dirt 5 (1080p, Ultra): 63 fps; (1080p, Low): 95 fps
25GB File Copy: 19.3
Handbrake 1.6: 4:26
CrossMark Overall: 1,886; Productivity: 1,834; Creativity: 1,987; Responsiveness: 1,753
Web Surfing (Battery Informant): 7:46:44
PCMark 10 Home test: 7,871
TechRadar Movie Battery test: 4 hours and 33 minutes
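One way to read the gaming numbers above is to compare how much each title gains from dropping Ultra to Low: a big jump suggests the game is heavily GPU-bound at Ultra, while a small one suggests the CPU or engine is the limit. Here's a quick sketch using the frame rates from this review:

```python
# Ultra-to-Low scaling for the three games benchmarked in this review.
# A large percentage gain suggests the game is GPU-bound at Ultra settings;
# a small gain suggests another bottleneck (CPU, engine cap, etc.).
results = {
    "Total War: Warhammer III": (87, 207),  # (Ultra fps, Low fps) at 1080p
    "Cyberpunk 2077": (94, 122),
    "Dirt 5": (63, 95),
}

for game, (ultra, low) in results.items():
    gain = (low / ultra - 1) * 100
    print(f"{game}: {ultra} -> {low} fps (+{gain:.0f}%)")
```

Total War more than doubles its frame rate (+138%), while Cyberpunk 2077 gains only about 30%, which lines up with the latter being the worst offender once ray tracing and higher resolutions come into play.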
General performance is impressive, especially for its cheaper price point. Benchmark scores are comparable to more expensive gaming laptops, including in 3DMark, PCMark 10, Cinebench, and Geekbench. It shows that you don’t need tricked-out specs to deliver great performance, and that affordable laptops can offer a lot even to more hardcore and professional gamers.
Its results in non-gaming benchmarks like the 25GB File Copy, Handbrake, and Crossmark tests are quite good, pairing well with its high color gamut. Creatives can rest assured that they’ll be able to double this laptop as an editing and creative machine.
The AMD CPU gives the Lenovo Legion 5 Slim 14 an edge in terms of more CPU-heavy tasks, while the RTX 4060 runs even AAA games at high settings like a dream. If you need ray tracing and resolutions higher than 1080p, you need to prepare for those framerates to drop significantly, with the worst offender being Cyberpunk 2077.
Ventilation is solid during productivity work and normal gaming sessions, though not as excellent as I would have expected considering how much Lenovo brags about the cooling system. According to the manufacturer, it features phase-change thermal compounds, hybrid copper heat pipes, air intake and exhaust systems, and a 12V dual liquid crystal polymer fan system. Even so, the underside can still get quite hot.
Performance score: 4.5 / 5
Lenovo Legion 5 Slim 14: Battery
(Image credit: Future)
Works well with normal use
Not so much with video streaming
The battery life is very interesting, as it scored very well on the web surfing testing, nearly netting eight hours. I also found that it lasts for about seven hours when using it for daily productivity work.
However, on the TechRadar movie test, it managed only four and a half hours. Those are extremely inconsistent results at opposite ends of the spectrum, though still better than most other gaming laptops.
Battery score: 4 / 5
Should you buy the Lenovo Legion 5 Slim 14?
Buy it if...
You want an easy-to-carry laptop Though it's a little thick to be called "Slim," the 14-inch screen and uncumbersome weight still make it extremely easy to carry around.
Don't buy it if...
You want a better webcam The webcam in this is pretty average, especially if you plan on using it to livestream.
Lenovo Legion 5 Slim 14: Also consider
If my Lenovo Legion 5 Slim 14 review has you considering other options, here are two more laptops to consider...
How I tested the Lenovo Legion 5 Slim 14
I tested this laptop for about two weeks
I tested the gaming performance as well as productivity work
I used a variety of benchmark tests as well as high-end PC games to test this laptop.
To test out the Lenovo Legion 5 Slim 14 I used a full suite of benchmarks to rank both CPU and GPU performance, with more emphasis on the latter. I also tested out frame rate performance on max settings with a range of high-end PC games like Cyberpunk 2077, Dirt 5, Marvel’s Spider-Man Remastered, and more.
This laptop would primarily be used for gaming, specifically hardcore gaming. Due to its GPU and high color gamut, it can also be used for creative and editing projects, and its CPU means that productivity work is a breeze as well.
I’ve tested out many laptops, especially gaming ones, which gives me plenty of experience with properly benchmarking them. I also have extensive knowledge of testing out general performance such as framerate and graphics.