MSI Clutch GM41 Lightweight V2 review: a basic, lightweight gaming mouse that doesn’t quite live up to its potential
12:30 am | February 6, 2025

Author: admin | Category: Computers Computing Gadgets Mice Peripherals & Accessories | Tags: | Comments: Off

MSI Clutch GM41 Lightweight V2: Two-minute review

The GM41 Lightweight is a super-simple wired gaming mouse that appeals to serious gamers with its low weight and laser-like focus on gaming over extravagant features.

This even extends to its looks. It sports a sleek, low-slung design with little in the way of adornments, or anything to offset the total black colorway. A large RGB light on the body is the only element that brightens up proceedings, which can be customized by holding the DPI button and pressing other buttons or moving the scroll wheel to adjust aspects such as the brightness, mode, speed, and color.

The thin, long shape will likely suit those with long hands, although it might not be wide enough for some. As it’s flat and lower to the ground than many of the best gaming mice, your hand is barely raised from your desktop, which again, some may prefer and some may not. The mouse buttons are relatively flat too, meaning claw grippers might not find them to their liking.

Its texture is smooth all over, save for the rubberized sides that help with grip; these aren’t as sticky as on some other mice, which I prefer.

The DPI switch is located on the bottom, which isn’t the most convenient, although this does help to keep the GM41 Lightweight as minimal as possible, and prevents mispresses from occurring.

Close-up of MSI Clutch GM41 Lightweight V2 on desk

(Image credit: Future)

There’s no removing the braided cable, as it’s hardwired. It feels more durable than other braided cables, although it’s not as soft to the touch. And while it’s light, it’s not as light as the cables on some of its rivals.

In order to tweak the GM41 Lightweight further, the MSI Center software can be downloaded for free. However, compared to other peripheral software, it’s disappointingly basic.

There are only a handful of rebinding options, including other mouse buttons and a few multimedia functions, but there are no keyboard assignments or system-level functions available. The same is true of performance enhancements. While you can select the increments for DPI cycling, the only other adjustments are for the polling rate, angle snapping, and lift-off distance – the latter of which only features two settings with no distance unit given. Those who like to tinker with their debounce time or toggle motion sync will be left out here.

When it comes to actual gaming, the GM41 Lightweight acquits itself reasonably well. The lightness makes for easy maneuverability, while the mouse clicks are snappy, thanks to their lack of travel. However, this lack of travel also reduces feedback somewhat. This makes spamming clicks harder, as does the relative heaviness of the clicks themselves and the aforementioned flat shape of the buttons. On a more positive note, however, I also found the mouse buttons to be pleasingly resistant to slam clicks.

The scroll wheel is very fast yet still provides enough notching to make for controlled flicks when needed. The scroll click is a little heavy, making it hard to actuate at times, but thankfully it’s secure enough to prevent accidental mis-scrolls.

Underside of MSI Clutch GM41 Lightweight V2 on desk

(Image credit: Future)

The side buttons are very thin, but light enough to use with ease. They offer slightly more travel than you might expect, but they’re satisfyingly damped, making them more enjoyable to use than their slim profile suggests.

However, while the cable is light, I found it did create drag when making large swipes. The issue seems to stem from the lack of angle on the strain relief, as it did little to elevate the first portion of the cable from my desktop surface. Depending on the layout of your setup, this might not be an issue, but you may benefit from using the GM41 Lightweight with a mouse bungee.

Thankfully, the PTFE skates allow for smooth gliding on hard and soft surfaces, although they are quite thin, and there’s no spare set included in the box. But if you stick to mouse pads, then you should have no problem moving the GM41 Lightweight around.

If you’re looking for a basic gaming mouse with no extra buttons or features, then the GM41 Lightweight is a solid choice. Its main rivals boast similar prices, such as BenQ’s range of EC mice. However, there are others, such as the Cooler Master MM311 and the Logitech G305 Lightspeed, that are cheaper and offer brilliant gaming performance, all without requiring a cable.

MSI Clutch GM41 Lightweight V2: Price & availability

Close-up of mouse buttons on MSI Clutch GM41 Lightweight V2

(Image credit: Future)
  • $54 / £29 (about AU$85)
  • Available now
  • Black only

The GM41 Lightweight costs $54 / £29 (about AU$85) and is available in one colorway: black. It doesn’t come with any replaceable parts, such as grip tape or a spare set of skates.

Its price is in line with other 1K wired gaming mice. The BenQ Zowie EC2-C, which is our pick as the best mouse for CS:GO and CS2, costs about the same. However, that mouse is heavier at 73g, and that’s excluding the cable, but we still found its performance to be excellent.

There are wireless gaming mice for less. The Cooler Master MM311, for instance, is our budget champion and also features a 1K polling rate, although it has no rechargeable battery. Likewise, the Logitech G305 Lightspeed is only marginally cheaper than the GM41 Lightweight, and is our pick as the best wireless gaming mouse for those on a budget.

MSI Clutch GM41 Lightweight V2: Specs

Should you buy the MSI Clutch GM41 Lightweight V2?

Buy it if...

You want something simple
The stripped-back nature of the GM41 Lightweight means there’s nothing to distract or delay you from gaming – just plug and play.

You want something light
At 65g, the GM41 Lightweight is undeniably lean, which makes fast movements a breeze. There is some drag though, which might be improved with a mouse bungee.

Don't buy it if...

You want extra features
The GM41 Lightweight has no extra buttons besides the usual, and the software doesn’t offer much in the way of customization and tweaking.

You want the best performance
With a 1K polling rate and lack of advanced settings, the GM41 Lightweight might not offer enough precision and tweakability for elite gamers.

MSI Clutch GM41 Lightweight V2: Also consider

Cooler Master MM311
As budget mice go, you’d be hard pressed to do better than the MM311. It offers a 1K polling rate and great performance, yet undercuts many gaming mice on the market, wireless and wired. It doesn’t have a rechargeable battery, though, and at 77g it’s considerably heavier than the GM41 Lightweight, but it could be a better choice for those who prefer no trailing cables to deal with. Read our full Cooler Master MM311 review.

BenQ Zowie EC2-C
The BenQ Zowie EC2-C is an esports champ in our eyes, as it’s the best for shooters like Counter-Strike. It marries excellent performance with a comfortable design, and it’s also available in multiple size variants. However, like the GM41 Lightweight, it only has a 1K polling rate, so those after something more should look elsewhere. Read our BenQ Zowie EC2-C review.

How I tested the MSI Clutch GM41 Lightweight V2

  • Tested for several days
  • Played various games
  • 10+ years PC gaming experience

I tested the GM41 Lightweight for several days, during which time I used it for playing games, productivity, and general use.

In order to push the GM41 Lightweight to its limits, I played fast-paced shooters such as Counter-Strike 2, which is the ultimate test for any gaming mouse.

I have been PC gaming for over 10 years, and have used a large number of mice during that time. I have also reviewed many of them, from budget picks to high-end offerings, all with various shapes, sizes, weights, and feature sets.

Nvidia GeForce RTX 5080 review: nearly RTX 4090 performance for a whole lot less
5:00 pm | January 29, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Nvidia GeForce RTX 5080: Two-minute review

At first glance, the Nvidia GeForce RTX 5080 doesn't seem like that much of an upgrade from the Nvidia GeForce RTX 4080 it is replacing, but that's only part of the story with this graphics card.

Its performance, to be clear, is unquestionably solid, positioning it as the third-best graphics card on the market right now, by my testing, and its new PCIe 5.0 interface and GDDR7 VRAM further distances it from the RTX 4080 and RTX 4080 Super from the last generation. It also outpaces the best AMD graphics card, the AMD Radeon RX 7900 XTX, by a healthy margin, pretty much locking up the premium, enthusiast-grade GPUs in Nvidia's corner for at least another generation.

Most impressively, it does this all for the same price as the Nvidia GeForce RTX 4080 Super and RX 7900 XTX: $999 / £939 / AU$2,019. This is also a rare instance where a graphics card launch price actually recedes from the high watermark set by its predecessor, as the RTX 5080 climbs down from the inflated price of the RTX 4080 when it launched back in 2022 for $1,199 / £1,189 / AU$2,219.

Then, of course, there's the new design of the card, which features a slimmer dual-slot profile, making it easier to fit into your case (even if the card's length remains unchanged). The dual flow-through fan cooling solution does wonders for managing the extra heat generated by the extra 40W TDP, and while the 12VHPWR cable connector is still present, the 3-to-1 8-pin adapter is at least somewhat less ridiculous than the RTX 5090's 4-to-1 dongle.

The new card design also repositions the power connector itself to make it less cumbersome to plug a cable into the card, though it does pretty much rule out the 90-degree-angled cables that gained popularity with the high-end RTX 40 series cards.

Finally, everything is built off of TSMC's 4nm N4 process node, making it one of the most cutting-edge GPUs on the market in terms of its architecture. While AMD and Intel will follow suit with their own 4nm GPUs soon (AMD RDNA 4 also uses TSMC's 4nm process node and is due to launch in March), right now, Nvidia is the only game in town for this latest hardware.

None of that would matter if the card didn't perform, however, but gamers and enthusiasts can rest assured that even without DLSS 4, you're getting a respectable upgrade. It might not have the wow factor of the beefier RTX 5090, but for gaming, creating, and even AI workloads, the Nvidia GeForce RTX 5080 is a spectacular balance of performance, price, and innovation that you won't find anywhere else at this level.

Nvidia GeForce RTX 5080: Price & availability

An RTX 5080 sitting on its retail packaging

(Image credit: Future)
  • How much is it? MSRP is $999 / £939 / AU$2,019
  • When can you get it? The RTX 5080 goes on sale January 30, 2025
  • Where is it available? The RTX 5080 will be available in the US, UK, and Australia at launch
Where to buy the RTX 5080

Looking to pick up the RTX 5080? Check out our Where to Buy RTX 5080 live blog for updates to find stock in the US and UK.

The Nvidia GeForce RTX 5080 goes on sale on January 30, 2025, starting at $999 / £939 / AU$2,019 for the Founders Edition and select AIB partner cards, while overclocked (OC) and more feature-rich third-party cards will be priced higher.

This puts the Nvidia RTX 5080 about $200 / £250 / AU$200 cheaper than the launch price of the last-gen RTX 4080, while also matching the price of the RTX 4080 Super.

Both of those RTX 40 series GPUs should see some downward price pressure as a result of the RTX 5080's release, which might complicate the value proposition of choosing the RTX 5080 over the older cards.

The RTX 5080 is also launching at the same MSRP as the AMD Radeon RX 7900 XTX, which is AMD's top GPU right now. And with AMD confirming that it does not intend to launch an enthusiast-grade RDNA 4 GPU this generation, the RTX 5080's only real competition is from other Nvidia graphics cards like the RTX 4080 Super or RTX 5090.

This makes the RTX 5080 a great value proposition for those looking to buy a premium 4K graphics card, as its price-to-performance ratio is very strong.

  • Value: 4 / 5

Nvidia GeForce RTX 5080: Specs & features

A masculine hand holding an Nvidia GeForce RTX 5080 showing off the power connector

(Image credit: Future)
  • GDDR7 VRAM and PCIe 5.0
  • Still just 16GB VRAM
  • Slightly higher 360W TDP

While the Nvidia RTX 5080 doesn't push the spec envelope quite as far as the RTX 5090 does, its spec sheet is still impressive.

For starters, like the RTX 5090, the RTX 5080 uses the faster, next-gen PCIe 5.0 interface that allows for faster data processing and coordination with the CPU, which translates directly into higher performance.

You also have new GDDR7 VRAM in the RTX 5080, only the second card to have it after the RTX 5090, and it dramatically increases the memory bandwidth and speed of the RTX 5080 compared to the RTX 4080 and RTX 4080 Super. Those latter two cards both use slower GDDR6X memory, so even though all three cards have the same amount of memory (16GB) and memory bus-width (256-bit), the RTX 5080 has a >25% faster effective memory speed of 30Gbps, compared to the 23Gbps of the RTX 4080 Super and the 22.4Gbps on the base RTX 4080.
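As a back-of-the-envelope sanity check of those figures, effective memory bandwidth is simply the per-pin speed multiplied by the bus width. A minimal sketch using the speeds and bus widths quoted above:

```python
# Effective memory bandwidth (GB/s) = per-pin speed (Gbps) x bus width (bits) / 8.
def bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    return pin_speed_gbps * bus_width_bits / 8

cards = {
    "RTX 5080 (GDDR7, 30Gbps)": (30.0, 256),
    "RTX 4080 Super (GDDR6X, 23Gbps)": (23.0, 256),
    "RTX 4080 (GDDR6X, 22.4Gbps)": (22.4, 256),
}
for name, (speed, bus) in cards.items():
    print(f"{name}: {bandwidth_gb_s(speed, bus):.1f} GB/s")
# → 960.0, 736.0, and 716.8 GB/s respectively

# The RTX 5080's speed uplift over the RTX 4080 Super, matching the ">25%" claim:
print(f"uplift: {30.0 / 23.0 - 1:.1%}")  # → uplift: 30.4%
```

Same bus width, so the whole bandwidth gain comes from the faster GDDR7 pins.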

This is all on top of the Blackwell GPU inside the card, which is built on TSMC's 4nm process, compared to the Lovelace GPUs in the RTX 4080 and 4080 Super, which use TSMC's 5nm process. So even though the transistor count on the RTX 5080 is slightly lower than its predecessor's, the smaller transistors are faster and more efficient.

The RTX 5080 also has a higher SM count, 84, compared to the RTX 4080's 76 and the RTX 4080 Super's 80, meaning the RTX 5080 has the commensurate increase in shader cores, ray tracing cores, and Tensor cores. It also has a slightly faster boost clock (2,617MHz) than its predecessor and the 4080 Super variant.

Finally, there is a slight increase in the card's TDP, 360W compared to the RTX 4080 and RTX 4080 Super's 320W.

  • Specs & features: 4.5 / 5

Nvidia GeForce RTX 5080: Design

An Nvidia GeForce RTX 5080 leaning against its retail packaging with the RTX 5080 logo visible

(Image credit: Future)
  • Slimmer dual-slot form factor
  • Dual flow-through cooling system

The redesign of the Nvidia RTX 5080 is identical to that of the RTX 5090, featuring the same slimmed-down dual slot profile as Nvidia's flagship card.

If I were to guess, the redesign isn't as essential for the RTX 5080 as it is for the RTX 5090, which needed a way to bring better cooling to its much hotter 575W TDP; the RTX 5080 (and eventually the RTX 5070) just slotted into this new design by default.

That said, it's still a fantastic change, especially as it makes the RTX 5080 thinner and lighter than its predecessor.

The dual flow through cooling system on the Nvidia GeForce RTX 5080

(Image credit: Future)

The core of the redesign is the new dual flow-through cooling solution, which uses an innovative three-part PCB inside to open up a gap at the front of the card, allowing a second fan to blow cooler air over the heat sink fins drawing heat away from the GPU.

A view of the comparative slot width of the Nvidia GeForce RTX 5080 and RTX 4080

(Image credit: Future)

This means that you don't need as thick a heat sink to pull away heat, which allows the card itself to get the same thermal performance from a thinner form factor, moving from the triple-slot RTX 4080 design down to a dual-slot RTX 5080. In practice, this also allows for a slight increase in the card's TDP, giving the card a bit of a performance boost as well, just from implementing a dual flow-through design.

Given that fact, I would not be surprised if other card makers follow suit, and we start getting much slimmer graphics cards as a result.

A masculine hand holding an Nvidia GeForce RTX 5080 showing off the power connector

(Image credit: Future)

The only other design choice of note is the 90-degree turn of the 16-pin power port, which should make it easier to plug the 12VHPWR connector into the card. The RTX 4080 didn't suffer nearly the same kinds of issues with its power connectors as the RTX 4090 did, so this design choice really flows down from engineers trying to fix potential problems with the much more power-hungry RTX 5090. But if you're going to implement it for your flagship card, you might as well put it on all of the Founders Edition cards.

Unfortunately, this redesign means that if you invested in a 90-degree-angled 12VHPWR cable, it won't work on the RTX 5080 Founders Edition, though third-party partner cards will have a lot of different designs, so you should be able to find one that fits your cable situation.

  • Design: 4.5 / 5

Nvidia GeForce RTX 5080: Performance

An Nvidia GeForce RTX 5080 slotted and running on a test bench

(Image credit: Future)
  • Excellent all-around performance
  • Moderately more powerful than the RTX 4080 and RTX 4080 Super, but nearly as fast as the RTX 4090 in gaming
  • You'll need DLSS 4 to get the best results
A note on my data

The charts shown below are the most recent test data I have for the cards tested for this review and may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

A note on the RTX 4080 Super

In my testing for this review, the RTX 4080 Super scored consistently lower than it has in the past, which I believe is an issue with my card specifically that isn't reflective of its actual performance. I'm including the data from the RTX 4080 Super for transparency's sake, but I wouldn't take these numbers as-is. I'll be retesting the RTX 4080 Super soon, and will update my data with new scores once I've troubleshot the issue.

Performance is king, though, and all the redesign and spec bumps won't amount to much if the RTX 5080 doesn't deliver better performance as a result. Fortunately it does, though maybe not by as much as some enthusiasts would like.

Overall, the RTX 5080 manages to score about 13% better than the RTX 4080 and about 19% better than the AMD Radeon RX 7900 XTX, a result that will disappoint some who were hoping for something closer to 20% or better, especially after seeing the 20-25% uplift on the RTX 5090.

If we were just to go off those numbers, some might call them disappointing, regardless of all the other improvements to the RTX 5080 in terms of design and specs. All this needs to be put in a broader context though, because my perspective changed once I compared the RTX 5080 to the RTX 4090.

Overall, the RTX 5080 is within 12% of the overall performance of the RTX 4090, and within 9% of the RTX 4090's gaming performance, which is a hell of a thing and simply can't be ignored, even by enthusiasts.
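The "within N%" comparisons throughout this review are simple relative deltas, measured against the faster card's score. A minimal sketch of that arithmetic, with made-up scores rather than the review's raw data:

```python
# "Within N% of card B" = (B - A) / B, i.e. how far card A trails card B
# relative to B's own score. The scores below are illustrative only.
def pct_behind(a_score: float, b_score: float) -> float:
    return (b_score - a_score) / b_score

# e.g. a hypothetical overall score of 88 vs the faster card's 100:
print(f"{pct_behind(88.0, 100.0):.0%} behind")  # → 12% behind
```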

Starting with synthetic benchmarks, the RTX 5080 scores about 13% better than the RTX 4080 and RX 7900 XTX, consistently beating out the RTX 4080 and substantially beating the RX 7900 XTX in ray-traced workloads (though the RX 7900 XTX does pull down a slightly better average 1080p rasterization score, to its credit).

Compared to the RTX 4090, the RTX 5080 comes in at about 15% slower on average, with its worst performance coming at lower resolutions. At 4K, though, the RTX 5080 comes in just 7% slower than the last-gen flagship.

In terms of compute performance, the RTX 5080 trounces the RX 7900 XTX, as expected, by about 38%, with a more modest 9% improvement over the RTX 4080. Against the RTX 4090, however, the RTX 5080 comes within just 5% of the RTX 4090's Geekbench compute scores. If you're looking for a cheap AI card, the RTX 5080 is definitely going to be your jam.

On the creative side, PugetBench for Creators' Adobe Photoshop benchmark still isn't working for the RTX 5080, so I can't tell you much about its creative raster performance yet (though I will update these charts once that issue is fixed), but going off the 3D modeling and video editing scores, the RTX 5080 is an impressive GPU, as expected.

The entire 3D modeling industry is effectively built on Nvidia's CUDA, so against the RTX 5080, the RX 7900 XTX doesn't stand a chance as the 5080 more than doubles the RX 7900 XTX's Blender Benchmark performance. Gen-on-gen though, the RTX 5080 comes in with about 8% better performance.

Against the RTX 4090, the RTX 5080 comes within 15% of its performance, and for good measure, if you're rocking an RTX 3090 and you're curious about the RTX 5080, the RTX 5080 outperforms the RTX 3090 by about 75% in Blender Benchmark. If you're on an RTX 3090 and want to upgrade, you'll probably still be better off with an RTX 4090, but if you can't find one, the RTX 5080 is a great alternative.

In terms of video editing performance, the RTX 5080 doesn't do as well as its predecessor in PugetBench for Creators Adobe Premiere and effectively ties in my Handbrake 4K to 1080p encoding test. I expect that once the RTX 5080 launches, Puget Systems will be able to update its tools for the new RTX 50 series, so these scores will likely change, but for now, it is what it is, and you're not going to see much difference in your video editing workflows with this card over its predecessor.

An Nvidia GeForce RTX 5080 slotted into a motherboard

(Image credit: Future)

The RTX 5080 is Nvidia's premium "gaming" card, though, so its gaming performance is what's going to matter to the vast majority of buyers out there. For that, you won't be disappointed. Working just off DLSS 3 with no frame generation, the RTX 5080 will get you noticeably improved framerates gen-on-gen at 1440p and 4K, with substantially better minimum/1% framerates as well for smoother gameplay. Turn on DLSS 4 with Multi-Frame Generation and the RTX 5080 does even better, blowing well past the RTX 4090 in some titles.

DLSS 4 with Multi-Frame Generation is game developer-dependent, however, so even though this is the flagship gaming feature for this generation of Nvidia GPUs, not every game will feature it. For testing purposes, then, I stick to DLSS 3 without Frame Generation (and the AMD and Intel equivalents, where appropriate), since this allows for a more apples-to-apples comparison between cards.

At 1440p, the RTX 5080 gets about 13% better average fps and minimum/1% fps overall, with up to 18% better ray tracing performance. Turn on DLSS 3 to balanced and ray tracing to its highest settings and the RTX 5080 gets you about 9% better average fps than its predecessor, but a massive 58% higher minimum/1% fps, on average.

Compared to the RTX 4090, the RTX 5080's average 1440p fps comes within 7% of the RTX 4090's, and within 2% of its minimum/1% fps, on average. In native ray-tracing performance, the RTX 5080 slips to within 14% of the RTX 4090's average fps and within 11% of its minimum/1% performance. Turn on balanced upscaling, however, and everything changes, with the RTX 5080 coming within just 6% of the RTX 4090's ray-traced upscaled average fps and beating the RTX 4090's minimum/1% fps average by almost 40%.

Cranking things up to 4K, the RTX 5080's lead over the RTX 4080 grows a good bit. With no ray tracing or upscaling, the RTX 5080 gets about 20% faster average fps and minimum/1% fps than the RTX 4080, overall. Its native ray tracing performance is about the same, however, and its minimum/1% fps average actually falls behind the RTX 4080's, both with and without DLSS 3.

Against the RTX 4090, the RTX 5080 comes within 12% of its average fps and within 8% of its minimum/1% performance without ray tracing or upscaling. It falls behind considerably in native 4K ray tracing performance (which is to be expected, given the substantially higher RT core count for the RTX 4090), but when using DLSS 3, that ray tracing advantage is cut substantially and the RTX 5080 manages to come within 14% of the RTX 4090's average fps, and within 12% of its minimum/1% fps overall.

Taken together, the RTX 5080 makes some major strides toward RTX 4090 performance across the board, closing a little more than half of the performance gap between the RTX 4080 and RTX 4090.

The RTX 5080 beats its predecessor by just over 13% overall, and comes within 12% of the RTX 4090's overall performance, all while costing less than both RTX 40 series cards' launch MSRPs, making it an incredible value for a premium card to boot.

  • Performance: 4 / 5

Should you buy the Nvidia GeForce RTX 5080?

A masculine hand holding up an Nvidia GeForce RTX 5080 against a green background

(Image credit: Future)

Buy the Nvidia GeForce RTX 5080 if...

You want fantastic performance for the price
You're getting close to RTX 4090 performance for under a grand (or just over two, if you're in Australia) at MSRP.

You want to game at 4K
This card's 4K gaming performance is fantastic, coming within 12-14% of the RTX 4090's in a lot of games.

You're not willing to make the jump to an RTX 5090
The RTX 5090 is an absolute beast of a GPU, but even at its MSRP, it's double the price of the RTX 5080, so you're right to wonder if it's worth making the jump to the next tier up.

Don't buy it if...

You want the absolute best performance possible
The RTX 5080 comes within striking distance of the RTX 4090 in terms of performance, but it doesn't actually get there, much less reaching the vaunted heights of the RTX 5090.

You're looking for something more affordable
At this price, it's an approachable premium graphics card, but it's still a premium GPU, and the RTX 5070 Ti and RTX 5070 are just around the corner.

You only plan on playing at 1440p
While this card is great for 1440p gaming, it's frankly overkill for that resolution. You'll be better off with the RTX 5070 Ti if all you want is 1440p.

Also consider

Nvidia GeForce RTX 4090
With the release of the RTX 5090, the RTX 4090 should see its price come down quite a bit, and if scalpers drive up the price of the RTX 5080, the RTX 4090 might be a better bet.

Read the full Nvidia GeForce RTX 4090 review

Nvidia GeForce RTX 5090
Yes, it's double the price of the RTX 5080, and that's going to be a hard leap for a lot of folks, but if you want the best performance out there, this is it.

Read the full Nvidia GeForce RTX 5090 review

How I tested the Nvidia GeForce RTX 5080

  • I spent about a week and a half with the RTX 5080
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler:
Gigabyte Auros Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD:
Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week and a half testing the RTX 5080, using my updated suite of benchmarks like Black Myth Wukong, 3DMark Steel Nomad, and more.

I also used this card as my primary work GPU where I relied on it for photo editing and design work, while also testing out a number of games on it like Cyberpunk 2077, Black Myth Wukong, and others.

I've been testing graphics cards for TechRadar for a couple of years now, with more than two dozen GPU reviews under my belt. I've extensively tested and retested all of the graphics cards discussed in this review, so I'm intimately familiar with their performance. This gives me the best possible position to judge the merits of the RTX 5080, and whether it's the best graphics card for your needs and budget.

  • Originally reviewed January 2025
TopMate C12 Laptop Cooling Pad review: this laptop cooler sports a bold look – but could be much colder
6:13 pm | January 27, 2025

Author: admin | Category: Computers Computing Gadgets | Tags: | Comments: Off

TopMate C12 Laptop Cooling Pad review

The TopMate C12 Laptop Cooling Pad is a mid-range cooling pad for high-performance gaming and creative laptops. Not only does it rock three 110mm fans and three 70mm fans, but it offers six different fan speeds, with the smaller fans capable of hitting 2400rpm. As such, I was expecting to see some seriously frosty cooling here.

Unfortunately, in our standard 3DMark stress test run on our Acer Predator Helios 300 with an Nvidia GeForce RTX 3080, I found the TopMate C12 more Chicago than Siberia in terms of chilliness. Our baseline test of the laptop alone saw it rise from 20.2ºC to 52.1ºC, an increase of 31.9ºC, while the TopMate on full fan power curbed its heating from 21.2ºC to 44ºC. While this reduced 22.8ºC temperature rise is a definite improvement, it’s nowhere near as impressive as the Liangstar Laptop Cooling Pad, for example, which reduced this to just 15.3ºC.
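The temperature deltas above are straightforward subtraction; a minimal sketch reproducing that arithmetic from the readings quoted in this test:

```python
# Temperature rise = final reading minus starting reading, using
# the figures from the 3DMark stress test above.
def temp_rise(start_c: float, end_c: float) -> float:
    return round(end_c - start_c, 1)

baseline = temp_rise(20.2, 52.1)   # laptop alone
with_pad = temp_rise(21.2, 44.0)   # TopMate C12 at full fan power

print(f"baseline rise: {baseline}ºC, with pad: {with_pad}ºC")
print(f"improvement: {round(baseline - with_pad, 1)}ºC")  # → improvement: 9.1ºC
```

So the pad shaves about 9ºC off the rise, against the roughly 16.6ºC the Liangstar managed.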

On the plus side, the TopMate C12 Laptop Cooling Pad runs about as quiet as you could realistically expect. Ten minutes into our stress test, I used a sound level meter to measure the combined noise of the cooling pad and laptop – it registered 58.5dB from three inches away and 45dB from my head height at 21 inches away. That’s bang in line with any of the best laptop cooling pads I’ve tested, and better than the 60dB produced by the uncooled laptop.

With eight adjustable heights, you can angle your laptop anywhere from 6.5 to 50 degrees, which really enabled me to position things at the most ergonomic angle for my height. Chunky flip-up rests keep the laptop firmly in position; however, while these were fine for a bulky gaming laptop, I did find they dug into my wrists when typing on an Ultrabook, so bear this in mind if your laptop is on the slimmer side. In terms of additional flourishes, the RGB lighting is well designed and offers 10 different settings, so you have options if psychedelic gaming rainbows aren’t your thing.

All in all, the TopMate C12 Laptop Cooling Pad offers perfectly decent performance and a solid, aesthetically pleasing build. This may convince you that it’s as good a choice as any other, but there’s one other factor I’d urge you to consider before making your purchase: its price. At $29.99 / £29.77 / AU$49.77, it costs more than some of the other cooling pads I’ve reviewed while offering marginally weaker performance. If you really want something that delivers great bang for your buck, I’d argue that the $19.99 / £19.99 Tecknet N5 Laptop Cooling Pad or $19.99 / £19.93 / AU$47.95 Liangstar Laptop Cooling Pad offer a better deal.

A closeup of the TopMate C12 Laptop Cooling Pad showing its USB ports and RGB lighting.

(Image credit: Future)

TopMate C12 Laptop Cooling Pad review: price & availability

  • Released January 19, 2022
  • MSRP of $29.99 / £29.77 / AU$49.77

First launched on January 19, 2022, the TopMate C12 Laptop Cooling Pad is now available for $29.99 / £29.77 / AU$49.77. It’s worth keeping your eyes peeled for better prices, though: it has occasionally been reduced to £25.30 / AU$42.30 – while this is a modest drop, it does help it compete better with some of the best cooling pads for value.

But, for the most part, it’s easy to find similarly powerful cooling pads that cost a bit less – both the Tecknet N5 Laptop Cooling Pad and the Liangstar Laptop Cooling Pad performed better in our benchmarking tests, while also costing less at $19.99 / £19.99 and $19.99 / £19.93 / AU$47.95 respectively.

A closeup of the TopMate C12 Laptop Cooling Pad showing its controls and RGB lighting.

(Image credit: Future)

Should I buy the TopMate C12 Laptop Cooling Pad?

Buy it if…

You want stable, ergonomic design
Not only is this cooling pad adjustable and comfortable to use, but it’s also rock-solid. Thanks to its flip-up rests, your laptop shouldn’t slip or slide about, even during frantic Counter-Strike matches.

You want competitively quiet running
The TopMate runs as quietly as the best laptop cooling pads we’ve tested. Even at max cooling, it should kick out less noise than your laptop’s fans straining on their own.

Don’t buy it if…

You want the best cooling
Despite packing an array of six fans, some of which top out at 2400rpm, I didn’t find the TopMate capable of cooling as effectively as some comparable pads we tested. So if you only care about how many degrees you can shave off, you should probably look elsewhere.

You want a bargain
Don’t get me wrong: this is still an affordable cooling pad, especially compared to the $150 / £125 / AU$150 you'll pay for some products. Nevertheless, this pad is still more expensive than some others we’ve tested, and falls short of their cooling power. So you can definitely get better cooling for your cash.

The TopMate C12 Laptop Cooling Pad showing RGB lighting, featuring the Acer Predator Helios 300.

(Image credit: Future)

TopMate C12 Laptop Cooling Pad review: also consider

Liangstar Laptop Cooling Pad
If you’re after the cheat codes for affordable yet arctic cooling, this is my personal tip. The Liangstar Laptop Cooling Pad is reasonably priced, costing just $19.99 / £23.59 / AU$65.66, and yet during tests it kept our laptop from rising any more than 15.3 degrees – that’s a full seven degrees cooler than the TopMate. It’s well worth a look.

How I tested the TopMate C12 Laptop Cooling Pad

  • Tested it over several days
  • Ran a stress test and measured the temperature difference with a thermal camera
  • Measured fan noise 10 minutes into the test using a sound level meter

Testing the TopMate C12 Laptop Cooling Pad, I ran it through all of the standardized benchmarks we use for laptop cooling pads. First, I checked the hottest point of our Acer Predator Helios 300 with an Nvidia GeForce RTX 3080 testing laptop, ran a 3DMark stress test for 15 minutes with the cooling pad set to max speed, and then re-checked the temperature.

I also tested how noisy it was with a sound level meter. Ten minutes into the test, I measured sound levels from three inches away, as well as from head height to get the absolute and subjective volumes of the pad’s fans combined with the gaming laptop’s cooling system. I then compared this to benchmarks of the noise generated during a stress test by the laptop’s fans alone.

I also used the laptop cooling pad while gaming and conducting everyday office tasks to test its overall design, sturdiness and ergonomics. For this, I drew on my 30 years’ experience as a gamer and laptop user, not to mention my 10 years’ experience covering tech and gadgets.

Razer Basilisk V3 Pro 35K review: a large, feature-rich gaming mouse that doesn’t quite best its rivals
3:35 pm | January 24, 2025

Author: admin | Category: Computers Computing Gadgets Mice Peripherals & Accessories | Tags: | Comments: Off

Razer Basilisk V3 Pro 35K: Two-minute review

The Razer Basilisk V3 Pro 35K is a large gaming mouse with plenty of clever features and multiple connectivity options, making it a versatile pointer suitable for various setups and use cases.

It adopts the familiar design popularized by the Logitech G502, with its long sloping front and protruding thumb slot. It’s fairly smart and understated, especially in its black variant, and the RGB lighting around the scroll wheel, logo and bottom edge is subtle and tasteful.

The materials feel premium and are among the best that gaming mice have to offer. The plastic chassis is lightly textured and the sides are finished with high-quality rubberized grips. These offer enough security without becoming sticky or collecting dust and dirt over time, as grips on other mice are prone to.

The buttons are of a similar quality. The mouse clicks feel robust and well damped, while the side buttons are snappy with a pleasingly deep travel, making them easy to use. The same is true of the DPI button on the top too.

The scroll wheel has a side-tilt function, which works well since the raised height of the wheel itself makes it easy to tilt left and right. Vertical scrolling is well notched; although the action isn’t as tight as you’ll find on other gaming mice, it suffices. This is perhaps a small compromise given that the wheel has a free-spin ability, activated by pressing the button above it.

Side view of Razer Basilisk V3 Pro 35K

(Image credit: Future)

What’s more, in Razer’s Synapse peripheral software, you can set the scroll wheel to activate free spin automatically, triggered when you flick it more vigorously. This works well for the most part, although there can be a slight delay as the lock disengages after recognizing a hard flick. However, it’s still viable enough for practical use.

The mouse also features a sniper button, which is well positioned for easy access and clicks in towards the user, again facilitating easy presses. It too feels well damped and built to withstand the rigors of intense gaming.

Weighing in at 112g, the Basilisk V3 Pro 35K is on the heavier side relative to pro-focused gaming mice. Despite this, it’s still quite maneuverable, and the weight feels more evenly distributed than other heavy mice, meaning it feels lighter than it actually is; I’ve tested lighter mice that are tougher to move around than this.

The PTFE skates are quite thin, but on padded surfaces it’s silky smooth and I had no issues with gliding. It does tend to scratch a little on hard ones, though, and unfortunately, there’s no spare set included in the box.

What is included, though, is a USB cable for wired play. It’s quite flexible but more weighty than other gaming mouse cables. As a result, I experienced a small element of drag when using it, but it was manageable enough to carry on playing.

Close-up of underneath of Razer Basilisk V3 Pro 35K

(Image credit: Future)

The Basilisk V3 Pro 35K is compatible with Synapse V4. While the layout is clear and easy to navigate, there are a few issues worth noting. For instance, when using the mouse in wired mode, the tab for it sometimes disappeared momentarily. Also, the battery readouts were initially sporadic, dropping suddenly at times and failing to indicate charging when a wired connection was active, merely displaying a 100% level even though this clearly wasn’t the case. However, these are minor bugs that’ll hopefully be quashed with future software and firmware updates.

When it does work as intended, however, Synapse offers plenty of customization options to sink your teeth into. Standard rebinding options are present, allowing you to map the buttons to other buttons, keyboard keys, and a generous selection of Windows shortcuts. There’s also the Hypershift function, which allows you to map a secondary layer of binds to all buttons, save for the one button you designate as the Hypershift button itself.

Disappointingly, though, there are only a few performance tweaks, such as customizable DPI cycling increments and polling rate changes that top out at 1K, which might not suffice for elite players (they’ll have to purchase the HyperPolling Wireless Dongle for that privilege). However, there is something called Asymmetric Cut-Off, which allows you to set the lift-off and landing distances independently, with plenty of adjustments on the slider for both – not something you see in much other peripheral software.

In use, the Basilisk V3 Pro 35K performs admirably. Gliding is smooth and the Razer Optical Mouse Switches Gen-3 are responsive, although the clicks are a little on the heavy side, which doesn’t make them the best for spamming. That aforementioned heavy weight can compromise swiping speeds too, so those who like low DPI settings might have a hard time here.

Also, the HyperSpeed Wireless Dongle, which is supposed to reduce latency, didn’t appear to make much difference over the standard 2.4GHz dongle during my tests. However, I did experience a few dropouts when using the latter, although my setup could’ve been the culprit, as objects may have been blocking the receiver.

The multiple connectivity modes work fine for the most part, although there is a slight delay when switching between the 2.4GHz and Bluetooth modes across two devices, and a press is required first to wake up the Basilisk V3 Pro 35K to the new device.

Battery life is a claimed 120 hours in HyperSpeed wireless mode, or 210 hours in Bluetooth, but during my tests, which involved switching between the two modes regularly, I was getting closer to sub-100 hours, which is still an admirable performance.

The Basilisk V3 Pro 35K acquits itself well enough, but it doesn’t really achieve anything that spectacular. At this price, it faces some stiff competition, chiefly from Razer’s own DeathAdder and the Logitech G502 X Plus. For performance and features, these two probably edge ahead of the Basilisk V3 Pro 35K, so while it’s perfectly capable, it’s probably not the strongest feature-filled mouse out there.

Razer Basilisk V3 Pro 35K: Price & availability

Close-up of mouse buttons on Razer Basilisk V3 Pro 35K

(Image credit: Future)
  • $159.99 / £159.99 / AU$279.95
  • Available in black and white
  • Rivals similarly priced

The Razer Basilisk V3 Pro 35K costs $159.99 / £159.99 / AU$279.95 and is available now in both black and white colorways. The HyperSpeed Wireless Dongle is included, as is a USB cable for wired play and charging.

It’s the same price as the Razer DeathAdder V3 Pro, which is considerably lighter in weight (63g), but also lighter on features, since its pared-back design is focused on gaming prowess above all else. For instance, it comes with Razer’s HyperPolling Wireless Dongle, which boosts the maximum polling rate to 8K for greater precision and smoothness.

Its nearest rival from another brand is the Logitech G502 X Plus, the best wireless gaming mouse for features. Both have a similar price and shape; however, the G502 does feature two more buttons next to the left click, which makes it slightly more versatile.

Razer Basilisk V3 Pro 35K: Specs

Should you buy the Razer Basilisk V3 Pro 35K?

Buy it if...

You prize build quality
True to many of Razer’s peripherals, the Basilisk V3 Pro 35K is engineered to a very high standard, with the chassis and buttons feeling exceptional to hold and press.

You want more buttons and features
A four-way scroll wheel with Smart reel, a sniper button, and three connectivity modes mean there are few situations the Basilisk V3 Pro 35K can’t handle.

Don't buy it if...

You have small hands
The Basilisk V3 Pro 35K is quite long and reasonably wide, so those with small hands might have a hard time getting to grips with it… literally.

You’re an elite player
The heavy weight and a native lack of 8K polling will likely mean it won’t be fast or precise enough in competitive play.

Razer Basilisk V3 Pro 35K: Also consider

Razer DeathAdder V3 Pro
If gaming performance is your main concern, you can’t do much better than the DeathAdder V3 Pro. It’s very light for a wireless mouse, and packs in an 8K polling rate to boot, all for the same price as the Basilisk V3 Pro 35K. However, it doesn’t have as many buttons, and there’s no Bluetooth connectivity either, so this isn’t a mouse for those who want wide-reaching versatility.

Logitech G502 X Plus
One of the most popular gaming mice around, the G502 X Plus can be had for a similar price to the Basilisk V3 Pro 35K, and it’s similarly feature-packed. However, it just edges ahead thanks to its two extra buttons, giving you more functionality. Its gaming performance is nothing to scoff at, either.

How I tested the Razer Basilisk V3 Pro 35K

  • Tested for over a week
  • Used for gaming and productivity
  • Over a decade of PC gaming experience

I tested the Basilisk V3 Pro 35K for over a week, during which time I used it for gaming and productivity purposes.

I played the FPS titles Counter-Strike 2 and S.T.A.L.K.E.R 2: Heart of Chornobyl in order to test the speed and accuracy of the Basilisk V3 Pro 35K. I also tested it on multiple systems and used every connectivity method, as well as the HyperSpeed Wireless Dongle.

I have been PC gaming for over 10 years and have experienced a number of mice. I have also reviewed various gaming mice, spanning a range of shapes, sizes, and price points.

Nvidia GeForce RTX 5090: the supercar of graphics cards
5:00 pm | January 23, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Nvidia GeForce RTX 5090: Two-minute review

The Nvidia GeForce RTX 5090 is a difficult GPU to approach as a professional reviewer because it is the rare consumer product that is so powerful, and so good at what it does, you have to really examine if it is actually a useful product for people to buy.

Right out the gate, let me just lay it out for you: depending on the workload, this GPU can get you up to 50% better performance versus the GeForce RTX 4090, and that's not even factoring in multi-frame generation when it comes to gaming, though on average the performance is still a respectable improvement of roughly 21% overall.

Simply put, whatever it is you're looking to use it for, whether gaming, creative work, or AI research and development, this is the best graphics card for the job if all you care about is pure performance.

Things get a bit more complicated if you want to bring energy efficiency into the equation. But if we're being honest, if you're considering buying the Nvidia RTX 5090, you don't care about energy efficiency. This simply isn't that kind of card, and so as much as I want to make energy efficiency an issue in this review, I really can't. It's not intended to be efficient, and those who want this card do not care about how much energy this thing is pulling down—in fact, for many, the enormous TDP on this card is part of its appeal.

Likewise, I can't really argue too much with the card's price, which comes in at $1,999 / £1,939 / AU$4,039 for the Founders Edition, and which will likely be much higher for AIB partner cards (and that's before the inevitable scalping begins). I could rage, rage against the inflation of the price of premium GPUs all I want, but honestly, Nvidia wouldn't charge this much for this card if there wasn't a line out the door and around the block full of enthusiasts who are more than willing to pay that kind of money for this thing on day one.

Do they get their money's worth? For the most part, yes, especially if they're not a gamer but a creative professional or AI researcher. If you're in the latter camp, you're going to be very excited about this card.

If you're a gamer, you'll still get impressive gen-on-gen performance improvements over the celebrated RTX 4090, and the Nvidia RTX 5090 is really the first consumer graphics card I've tested that can get you consistent, high-framerate 8K gameplay even before factoring in Multi-Frame Generation. That marks the RTX 5090 as something of an inflection point of things to come, much like the Nvidia RTX 2080 did back in 2018 with its first-of-its-kind hardware ray tracing.

Is it worth it though?

That, ultimately, is up to the enthusiast buyer who is looking to invest in this card. At this point, you probably already know whether or not you want it, and many will likely be reading this review to validate those decisions that have already been made.

In that, rest easy. Even without the bells and whistles of DLSS 4, this card is a hearty upgrade to the RTX 4090, and considering that the actual price of the RTX 4090 has hovered around $2,000 for the better part of two years despite its $1,599 MSRP, if the RTX 5090 sticks close to its launch price, it's well worth the investment. If it gets scalped to hell and sells for much more above that, you'll need to consider your purchase much more carefully to make sure you're getting the most for your money. Make sure to check out our where to buy an RTX 5090 guide to help you find stock when it goes on sale.

Nvidia GeForce RTX 5090: Price & availability

  • How much is it? MSRP is $1,999 / £1,939 / AU$4,039
  • When can you get it? The RTX 5090 goes on sale January 30, 2025
  • Where is it available? The RTX 5090 will be available in the US, UK, and Australia at launch
Where to buy the RTX 5090

Looking to pick up the RTX 5090? Check out our Where to buy RTX 5090 live blog for updates to find stock in the US and UK

The Nvidia GeForce RTX 5090 goes on sale on January 30, 2025, starting at $1,999 / £1,939 / AU$4,039 for the Nvidia Founders Edition and select AIB partner cards. Overclocked (OC) and other similarly tweaked cards and designs will obviously run higher.

It's worth noting that the RTX 5090 is 25% more expensive than the $1,599 launch price of the RTX 4090, but in reality, we can expect the RTX 5090 to sell for much higher than its MSRP in the months ahead, so we're really looking at an asking price closer to the $2,499.99 MSRP of the Turing-era Nvidia Titan RTX (if you're lucky).

Of course, if you're in the market for the Nvidia RTX 5090, you're probably not squabbling too much about the price of the card. You're already expecting to pay the premium, especially the first adopter premium, that comes with this release.

That said, this is still a ridiculously expensive graphics card for anyone other than an AI startup with VC backing, so it's worth asking yourself before you confirm that purchase if this card is truly the right card for your system and setup.

  • Value: 3 / 5

Nvidia GeForce RTX 5090: Specs & features

The Nvidia GeForce RTX 5090's power connection port

(Image credit: Future / John Loeffler)
  • First GPU with GDDR7 VRAM and PCIe 5.0
  • Slightly slower clocks
  • Obscene 575W TDP

There are a lot of new architectural changes in the Nvidia RTX 50 series GPUs that are worth diving into, especially the move to a transformer AI model for its upscaling, but let's start with the new specs for the RTX 5090.

First and foremost, the flagship Blackwell GPU is the first consumer graphics card to feature next-gen GDDR7 video memory, and it is substantially faster than GDDR6 and GDDR6X (a roughly 33% increase in Gbps over the RTX 4090). Add in the much wider 512-bit memory interface and you have a total memory bandwidth of 1,790GB/s.
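That bandwidth figure follows directly from the bus width and per-pin data rate. A back-of-the-envelope sketch, assuming 28 Gbps GDDR7 for the RTX 5090 and 21 Gbps GDDR6X for the RTX 4090 (per-pin rates consistent with the roughly 33% uplift mentioned above):

```python
# Rough check on the quoted memory bandwidth. Per-pin data rates
# (28 Gbps GDDR7, 21 Gbps GDDR6X) are assumptions based on the
# ~33% per-pin uplift described in the review.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (pins * bits per second) / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

rtx_5090 = bandwidth_gb_s(512, 28)  # 1792.0 GB/s, i.e. the quoted ~1,790GB/s
rtx_4090 = bandwidth_gb_s(384, 21)  # 1008.0 GB/s

# Total bandwidth gain is far larger than 33%, since the bus is wider too.
print(rtx_5090, rtx_4090, round(rtx_5090 / rtx_4090 - 1, 2))  # 1792.0 1008.0 0.78
```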

This, even more than the increased VRAM pool of 32GB vs 24GB for the RTX 4090, makes this GPU the first really capable 8K graphics card on the market. 8K textures have an enormous footprint in memory, so moving them through the rendering pipelines to generate playable framerates isn’t really possible with anything less than this card has.

Yes, you can, maybe, get playable 8K gaming with some RTX 40 or AMD Radeon RX 7000 series cards if you use aggressive upscaling, but you won't really be getting 8K visuals that'll be worth the effort. In reality, the RTX 5090 is what you want if you want to play 8K, but good luck finding an 8K monitor at this point. Those are still years away from really going mainstream (though there are a growing number of 8K TVs).

If you're settling in at 4K though, you're in for a treat, since all that bandwidth means faster 4K texture processing, so you can get very fast native 4K gaming with this card without having to fall back on upscaling tech to get you to 60fps or higher.

The GeForce RTX logo on the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

The clock speeds on the RTX 5090 are slightly slower, which makes sense given the card’s other major top-line specs: a gargantuan TDP of 575W and a PCIe 5.0 x16 interface. For the TDP, this thermal challenge, according to Nvidia, required major reengineering of the PCB inside the card, which I’ll get to in a bit.

The PCIe 5.0 x16 interface, meanwhile, is the first of its kind in a consumer GPU, though you can expect AMD and Intel to quickly follow suit. This matters because a number of newer motherboards have PCIe 5.0 lanes ready to go, but most people have been using those for PCIe 5.0 m.2 SSDs.

If your motherboard has 20 PCIe 5.0 lanes, the RTX 5090 will take up 16 of those, leaving just four for your SSD. If you have one PCIe 5.0 x4 SSD, you should be fine, but I've seen motherboard configurations that have two or three PCIe 5.0 x4 m.2 slots, so if you've got one of those and you've loaded them up with PCIe 5.0 SSDs, you're likely to see those SSDs drop down to the slower PCIe 4.0 speeds. I don't think it'll be that big of a deal, but it's worth considering if you've invested a lot into your SSD storage.
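To make the lane budgeting concrete, here’s a quick sketch of that 20-lane scenario. The lane counts are the illustrative figures from above, not a universal motherboard spec:

```python
# Illustrative PCIe 5.0 lane budget for the motherboard scenario above.
# 20 lanes is the example figure from the review, not a guaranteed spec.

TOTAL_PCIE5_LANES = 20
GPU_LANES = 16                    # RTX 5090 occupies a full x16 slot
remaining = TOTAL_PCIE5_LANES - GPU_LANES

ssds = 2                          # two PCIe 5.0 x4 m.2 drives installed
lanes_needed = ssds * 4

# With only 4 Gen 5 lanes left, the second SSD falls back to PCIe 4.0 speeds.
print(remaining, lanes_needed > remaining)  # 4 True
```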

As for the other specs, they're more or less similar to what you'd find in the RTX 4090, just more of it. The new Blackwell GB202 GPU in the RTX 5090 is built on a TSMC 4nm process, compared to the RTX 4090's TSMC 5nm AD102 GPU. The SM design is the same, so 128 CUDA cores, one ray tracing core, and four tensor cores per SM. At 170 SMs, you've got 21,760 CUDA cores, 170 RT cores, and 680 Tensor cores for the RTX 5090, compared to the RTX 4090's 128 SMs (so 16,384 CUDA, 128 RT, and 512 Tensor cores).
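Those core counts all fall out of the per-SM split. A quick sanity check, multiplying out the figures quoted above:

```python
# Deriving the core counts from the SM totals quoted above. The split of
# 128 CUDA / 1 RT / 4 Tensor cores per SM is from the review; this just
# multiplies it out as a sanity check.

def cores_per_gpu(sm_count: int) -> dict:
    return {
        "cuda": sm_count * 128,  # 128 CUDA cores per SM
        "rt": sm_count * 1,      # 1 ray tracing core per SM
        "tensor": sm_count * 4,  # 4 tensor cores per SM
    }

print(cores_per_gpu(170))  # RTX 5090: {'cuda': 21760, 'rt': 170, 'tensor': 680}
print(cores_per_gpu(128))  # RTX 4090: {'cuda': 16384, 'rt': 128, 'tensor': 512}
```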

  • Specs & features: 4.5 / 5

Nvidia GeForce RTX 5090: Design

The Nvidia GeForce RTX 5090 sitting on its packaging

(Image credit: Future / John Loeffler)
  • Slim, dual-slot form factor
  • Better cooling

So there's a significant change to this generation of Nvidia Founders Edition RTX flagship cards in terms of design, and it's not insubstantial.

Holding the RTX 5090 Founders Edition in your hand, you'll immediately notice two things: first, you can comfortably hold it in one hand thanks to it being a dual-slot card rather than a triple-slot, and second, it's significantly lighter than the RTX 4090.

A big part of this is how Nvidia designed the PCB inside the card. Traditionally, graphics cards have been built with a single PCB that extends from the inner edge of the PC case, down through the PCIe slot, and far enough back to accommodate all of the modules needed for the card. On top of this PCB, you'll have a heatsink with piping from the GPU die itself through a couple of dozen aluminum fins to dissipate heat, with some kind of fan or blower system to push or pull cooler air through the heated fins to carry away the heat from the GPU.

The problem with this setup is that if you have a monolithic PCB, you can only really extend the heatsinks and fans off of the PCB to help cool it since a fan blowing air directly into a plastic wall doesn't do much to help move hot air out of the graphics card.

A split view of the Nvidia GeForce RTX 5090's dual fan passthrough design

(Image credit: Future / John Loeffler)

Nvidia has a genuinely novel innovation on this account, and that's ditching the monolithic PCB that's been a mainstay of graphics cards for 30 years. Instead, the RTX 5090 (and presumably subsequent RTX 50-series GPUs to come), splits the PCB into three parts: the video output interface at the 'front' of the card facing out from the case, the PCIe interface segment of the card, and the main body of the PCB that houses the GPU itself as well as the VRAM modules and other necessary electronics.

This segmented design allows a gap in the front of the card below the fan, so rather than a fan blowing air into an obstruction, it can fully pass over the fins of the GPU's heatsink, substantially improving the thermals.

As a result, Nvidia is able to shrink the width of the card down considerably, moving from a 2.4-inch width to a 1.9-inch width, or a roughly 20% reduction on paper. That said, it feels substantially smaller than its predecessor, and it's definitely a card that won't completely overwhelm your PC case the way the RTX 4090 does.

The 4 8-pin to 16-pin 12VHPWR adapter included with the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

That said, the obscene power consumption required by this card means that the 8-pin adapter included in the RTX 5090 package is a comical 4-to-1 dongle that pretty much no PSU in anyone's PC case can really accommodate.

Most modular PSUs give you three PCIe 8-pin power connectors at most, so let’s just be honest about this setup. You’re going to need a new ATX 3.0 PSU with at least 1000W to run this card (its officially recommended PSU is 950W, but just round up, you’re going to need it), so make sure you factor that into your budget if you pick this card up.

Otherwise, the look and feel of the card isn’t that different from previous generations, except the front plate of the GPU where the RTX 5090 branding would have gone is now missing, replaced by a finned shroud to allow air to pass through. The RTX 5090 stamp is instead printed on the center panel, similar to how it was done on the Nvidia GeForce RTX 3070 Founders Edition.

As a final touch, the white back-lit GeForce RTX logo and the X strips on the front of the card, when powered, add a nice RGB-lite touch that doesn’t look too gaudy, although RGB fans out there might think it looks rather plain.

  • Design: 4.5 / 5

Nvidia GeForce RTX 5090: Performance

An Nvidia GeForce RTX 5090 slotted into a test bench

(Image credit: Future)
  • Most powerful GPU on the consumer market
  • Substantially faster than RTX 4090
  • Playable 8K gaming
A note on my data

The charts shown below are the most recent test data I have for the cards tested for this review and may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

So how does the Nvidia GeForce RTX 5090 stack up against its predecessor, as well as the best 4K graphics cards on the market more broadly?

Very damn well, it turns out, managing to improve performance over the RTX 4090 in some workloads by 50% or more, while leaving everything else pretty much in the dust.

Though when looked at from 30,000 feet, the overall performance gains are respectable gen-on-gen but aren't the kind of earth-shattering gains the RTX 4090 made over the Nvidia GeForce RTX 3090.

Starting with synthetic workloads, the RTX 5090 scores anywhere from 48.6% faster to about 6.7% slower than the RTX 4090 in various 3DMark tests, depending on the workload. The only poor performance for the RTX 5090 was in 3DMark Night Raid, a test where both cards so completely overwhelm the test that the difference here could be down to CPU bottlenecking or other issues that aren’t easily identifiable. On every other 3DMark test, though, the RTX 5090 scores 5.6% better or higher, more often than not by 20-35%. In the most recently released test, Steel Nomad, the RTX 5090 is nearly 50% faster than the RTX 4090.

On the compute side of things, the RTX 5090 is up to 34.3% faster in the Geekbench 6 OpenCL compute test and 53.9% faster in Vulkan, making it an absolute monster for AI researchers to leverage.

On the creative side, the RTX 5090 is substantially faster in 3D rendering, scoring between 35% and 49.3% faster in my Blender Benchmark 4.30 tests. There's very little difference between the two cards when it comes to video editing though, as they essentially tie in PugetBench for Creators' Adobe Premiere test and in Handbrake 1.7 4K to 1080p encoding.

The latter two results might be down to CPU bottlenecking, as even the RTX 4090 pushes right up against the performance ceiling set by the CPU in a lot of cases.

When it comes to gaming, the RTX 5090 is substantially faster than the RTX 4090, especially at 4K. In non-upscaled 1440p gaming, you're looking at a roughly 18% better average frame rate and a 22.6% better minimum/1% framerate for the RTX 5090. With DLSS 3 upscaling (but no frame generation), you're looking at 23.3% better average and 23% better minimum/1% framerates overall with the RTX 5090 vs the RTX 4090.

With ray tracing turned on without upscaling, you’re getting 26.3% better average framerates and about 23% better minimum/1% framerates, and with upscaling turned on to balanced (again, no frame generation), you’re looking at about 14% better average fps and about 13% better minimum/1% fps for the RTX 5090 against the RTX 4090.

At 4K, however, the faster memory and wider memory bus really make a difference. Without upscaling and ray tracing turned off, you're getting upwards of 200 fps at 4K for the RTX 5090 on average, compared to the RTX 4090's 154 average fps, a nearly 30% increase. The average minimum/1% fps for the RTX 5090 is about 28% faster than the RTX 4090, as well. With DLSS 3 set to balanced, you're looking at a roughly 22% better average framerate overall compared to the RTX 4090, with an 18% better minimum/1% framerate on average as well.
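All of the gen-on-gen percentages in this section are simple relative uplift, (new - old) / old. A quick sketch using the 4K raster averages above:

```python
# How the gen-on-gen percentages in this review are computed:
# relative uplift of the new card's framerate over the old card's.

def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage improvement of new_fps over old_fps, rounded to one decimal."""
    return round((new_fps - old_fps) / old_fps * 100, 1)

# RTX 5090 vs RTX 4090, average 4K fps without upscaling or ray tracing.
print(uplift_pct(200, 154))  # 29.9, i.e. the "nearly 30% increase" quoted above
```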

With ray tracing and no upscaling, the difference is even more pronounced with the RTX 5090 getting just over 34% faster average framerates compared to the RTX 4090 (with a more modest 7% faster average minimum/1% fps). Turn on balanced DLSS 3 with full ray tracing and you're looking at about 22% faster average fps overall for the RTX 5090, but an incredible 66.2% jump in average minimum/1% fps compared to the RTX 4090 at 4K.

Again, none of this even factors in single frame generation, which can already substantially increase framerates in some games (though with the introduction of some input latency). Once Multi-Frame Generation rolls out at launch, you can expect to see these framerates for the RTX 5090 run substantially higher. Pair that with Nvidia Reflex 2 to help mitigate the input latency issues frame generation can introduce, and the playable performance of the RTX 5090 will only get better with time, and it's starting from a substantial lead right out of the gate.

In the end, the overall baseline performance of the RTX 5090 comes in about 21% better than the RTX 4090, which is what you're really looking for when it comes to a gen-on-gen improvement.

That said, you have to ask whether the performance improvement you do get is worth the enormous increase in power consumption. That 575W TDP isn't a joke. I maxed out at 556W of power at 100% utilization, and I hit 100% fairly often in my testing and while gaming.

The dual flow-through fan design also does a great job of cooling the GPU, but at the expense of turning the card into a space heater. That 575W of heat needs to go somewhere, and that somewhere is inside your PC case. Make sure you have adequate airflow to vent all that hot air, otherwise everything in your case is going to slowly cook.

As far as performance-per-price, this card does slightly better than the RTX 4090 on value for the money, but that's never been a buying factor for this kind of card anyway. You want this card for its performance, plain and simple, and in that regard, it's the best there is.

  • Performance: 5 / 5

Should you buy the Nvidia GeForce RTX 5090?

A masculine hand holding an RTX 5090

(Image credit: Future)

Buy the Nvidia GeForce RTX 5090 if...

You want the best performance possible
From gaming to 3D modeling to AI compute, the RTX 5090 serves up best-in-class performance.

You want to game at 8K
Of all the graphics cards I've tested, the RTX 5090 is so far the only GPU that can realistically game at 8K without compromising on graphics settings.

You really want to flex
This card comes with a lot of bragging rights if you're into the PC gaming scene.

Don't buy it if...

You care about efficiency
At 575W, this card might as well come with a smokestack and a warning from your utility provider about the additional cost of running it.

You're in any way budget-conscious
This card starts off more expensive than most gaming PCs and will only become more so once scalpers get their hands on them. And that's not even factoring in AIB partner cards with extra features that add to the cost.

You have a small form-factor PC
There's been some talk about the new Nvidia GPUs being SSF-friendly, but even though this card is thinner than the RTX 4090, it's just as long, so it'll be hard to fit it into a lot of smaller cases.

Also consider

Nvidia GeForce RTX 4090
I mean, honestly, this is the only other card you can compare the RTX 5090 to in terms of performance, so if you're looking for an alternative to the RTX 5090, the RTX 4090 is pretty much it.

Read the full Nvidia GeForce RTX 4090 review

How I tested the Nvidia GeForce RTX 5090

  • I spent about a week and a half with the RTX 5090
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week and a half testing the Nvidia GeForce RTX 5090, both running synthetic tests as well as using it in my day-to-day PC for both work and gaming.

I used my updated testing suite, which uses industry standard benchmark tools like 3DMark, Geekbench, Pugetbench for Creators, and various built-in gaming benchmarks. I used the same testbench setup listed to the right for the purposes of testing this card, as well as all of the other cards I tested for comparison purposes.

I've tested and retested dozens of graphics cards for the 20+ graphics card reviews I've written for TechRadar over the last few years, and so I know the ins and outs of these PC components. That's why you can trust my review process to help you make the right buying decision for your next GPU, whether it's the RTX 5090 or any of the other graphics cards I review.

  • Originally reviewed January 2025
Logitech G309 Lightspeed review: a wireless gaming mouse that offers just enough to keep gamers interested
12:54 pm |

Author: admin | Category: Computers Computing Gadgets Mice Peripherals & Accessories | Comments: Off

Logitech G309 Lightspeed review

The Logitech G309 Lightspeed is a mid-range wireless gaming mouse, aimed towards the more casual gamer who doesn’t need the elite features or performance of more premium offerings.

In terms of appearance, the G309 Lightspeed keeps things simple. With no gaming imagery or designs present (there’s not even any RGB lighting), it looks more akin to a productivity mouse than anything else. Only a small Logitech G logo offers any sort of contrast to the monochrome colorway.

The overall shape is sleek, as the bulbous rear thins out towards the front. It’s not as wide or as long as the G502 Lightspeed, one of the best gaming mice around. But the mouse buttons do feel long, and their relatively flat profile means your fingers are more horizontal than you might expect; claw-grippers, therefore, might not find the G309 Lightspeed to their liking.

Build quality is also good. The plastic shell feels reasonably thin but sturdy nonetheless, and thankfully the large lid for the battery compartment feels secure yet easy enough to open when needed. The mouse buttons are snappy and light, but with enough solidity to inspire confidence.

The scroll wheel is notched tightly enough to prevent misfires, yet loose enough for speedy performance. The scroll wheel button puts up too much resistance, though, and is awkward to press at times. The side buttons, however, are light and responsive yet solid.

The PTFE skates are quite thin, making the G309 Lightspeed a mouse I would recommend using on padded surfaces only. There are no replacement skates included in the box either, as there are with some other gaming mice.

Top of Logitech G309 Lightspeed

(Image credit: Future)

Weighing in at 86g, the G309 Lightspeed does make itself felt in the hand. The weight is mostly concentrated in the center-rear, which helps it feel balanced, but those who set a low DPI may struggle to perform big swipes. This isn’t helped by the lack of grip: the sides aren’t indented to allow your thumb and fingers to get a proper hold of the body, and the smooth-textured plastic material fails to offer much traction. However, grip tape is included for both the sides and the mouse buttons, which does help to improve the situation.

If you use the G309 Lightspeed with Logitech’s Powerplay wireless charging mat, the weight drops to 68g as you shed the AA battery. This makes the G309 Lightspeed much more amenable to large sweeps. However, Powerplay is an optional extra that adds to the overall price of the mouse.

Logitech claims the Lightspeed Hybrid switches feel mechanical despite being optical. I have to say that these claims hold water. They are light enough for ultra-fast clicking, yet provide enough feedback to feel what’s going on, all of which makes the G309 Lightspeed very satisfying to game with.

The G309 Lightspeed is also smooth and precise when gliding and aiming, although the 1K maximum polling rate might not be enough for those after the absolute best FPS performance. Even so, the G309 Lightspeed still performs well on this front.

The Lightspeed Wireless connection also gave me no trouble when gaming. Switching between two devices, one connected via Bluetooth and the other via the Lightspeed USB dongle, was also quick and hassle-free.

Side buttons of Logitech G309 Lightspeed

(Image credit: Future)

Via Logitech’s G Hub software, various tweaks and customizations to the G309 Lightspeed are possible. There are profiles available for a whole host of popular games, and DPI presets for various genres and use cases, including productivity, first-person, MMORPG, simulation and strategy. You can also tweak the sensitivity of each of the five DPI increments yourself, from 100 up to 25,600.

Along with the DPI settings, you can also reassign the mouse buttons. Options include rebinding buttons to other buttons and keys, as well as common shortcuts and system-level functions, such as cycling audio devices, copying and pasting, launching apps, and controlling media playback. There are also configurable actions specific to Discord, Streamlabs, Overwolf and OBS.

You can change the poll rate too, as well as the switch-type from hybrid to mechanical. However, the G309 Lightspeed lacks some of the more advanced tweaks seen on other gaming mice. For instance, there’s no motion sync toggle, or lift-off and debounce time adjustments.

Battery life is quoted as being up to 300 hours when using the Lightspeed wireless connection, or 600 when using Bluetooth. Although I wasn’t able to spend this much time with the G309 Lightspeed, I can say that over almost a week’s worth of use via both Bluetooth and the Lightspeed wireless USB dongle, the battery life dipped by around 5%, so Logitech’s claims do seem reasonably accurate.
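Those figures hang together as a rough consistency check, assuming approximately linear battery drain (the implied hours of use are my back-of-the-envelope estimate, not a measured figure):

```python
claimed_hours = 300      # Logitech's quoted figure for the Lightspeed connection
observed_drop_pct = 5    # battery drop seen over almost a week of mixed use

# Assuming linear drain, a 5% drop corresponds to this many hours of use
implied_hours = claimed_hours * observed_drop_pct / 100
print(f"{implied_hours:.0f} hours")  # 15 hours, i.e. roughly 2 hours a day over a week
```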

At this price point, the G309 Lightspeed represents good value, considering its performance and specs. The multiple wireless connectivity options, as well as the integration with Logitech’s G Hub software, are welcome features and work as intended. However, the 1K poll rate may not be enough for some, and there are other gaming mice out there, such as the Cooler Master MM311 and Logitech’s own G305, that may prove to be better value depending on what you want from a gaming mouse.

Logitech G309 Lightspeed: Price & availability

Underneath of Logitech G309 Lightspeed

(Image credit: Future)
  • $79 / £79 / AU$149
  • Available now
  • Cheaper alternatives available

The G309 Lightspeed costs $79 / £79 / AU$149 and is available now. It comes in two colorways, black and white. There’s also a Kamisato Ayaka Special Edition available in certain territories, such as the US.

Despite the AA battery, the G309 Lightspeed can be used with Logitech’s Powerplay wireless charging mat, with the brand currently offering a 30% saving on it in a bundle deal.

The G309 Lightspeed sits somewhere towards the lower end of Logitech’s Lightspeed wireless range of gaming mice. The G502 and G903 sit above it in terms of spec, with their upgraded hardware and additional features. The G502 is only marginally more expensive, though, and it gains an inbuilt battery and many additional buttons.

Meanwhile, the G305 is even cheaper, although this only has one onboard memory profile slot, as opposed to five on the G309 Lightspeed. There’s also no Bluetooth connectivity, but it does have an inbuilt battery and a USB-C port.

If you’re really on a budget, the Cooler Master MM311 is even cheaper and about the best wireless gaming mouse around in terms of value for money. Like the G309 Lightspeed, the MM311 also has a 1K polling rate and requires a AA battery.

Logitech G309 Lightspeed: Specs

Should you buy the Logitech G309 Lightspeed?

Buy it if...

You want good gaming performance
For the price and spec, the G309 Lightspeed will provide many players with enough precision and snap.

You want good software
G Hub, for the most part, is an easy-to-use and versatile tool, offering numerous customization options for the G309 Lightspeed, although more advanced tweaks aren’t possible.

Don't buy it if...

You want an elite performer
With a weight of 86g / 68g and a maximum polling rate of 1KHz, the G309 Lightspeed might not cut it for pro-level players.

You want to play and charge
There’s no USB port or inbuilt battery, so you can’t just plug in a cable and continue gaming. There’s an optional charging mat available, but it’ll cost you.

Logitech G309 Lightspeed review: Also consider

Cooler Master MM311
Our pick as the best gaming mouse for those on a budget, the MM311 undercuts just about every other wireless gaming mouse on the market, yet still offers top-tier quality. It also requires an AA battery like the G309 Lightspeed, but it’s lighter at 77g.

Read our full Cooler Master MM311 review.

Logitech G305 Lightspeed
For less money than the G309 Lightspeed, you could opt for the G305 Lightspeed. It has an inbuilt battery and a USB-C port, both of which are absent from the G309. It also has the same 1K polling rate, although it does miss out on Bluetooth connectivity and only has one profile slot on its onboard memory. If you want to keep things as simple as possible, though, this could be a contender.

Read our full Logitech G305 Lightspeed review.

How I tested the Logitech G309 Lightspeed

  • Tested for about a week
  • Used for gaming and productivity
  • 10+ years gaming experience

I tested the G309 Lightspeed for about a week. During that time, I used the G309 Lightspeed for gaming, as well as for productivity and general use.

I played games that put the G309 Lightspeed through its paces, including Counter-Strike 2 and I Am Your Beast. I also used as many features and made as many tweaks as possible via the G Hub software, in order to test its usability and effectiveness.

I have over 10 years of PC gaming experience, and during that time I have used multiple gaming mice, including those made by Logitech. I have also reviewed numerous gaming mice with varying specs and price points, from budget offerings to esport-grade devices.

Read more about how we test

First reviewed: January 2025

GravaStar Mercury M1 Pro: a wireless gaming mouse with a brash design that unfortunately hampers performance
12:00 pm |

Author: admin | Category: Computers Computing Gadgets Mice Peripherals & Accessories | Tags: | Comments: Off

GravaStar Mercury M1 Pro: two-minute review

The GravaStar Mercury M1 Pro is made for pro-level players with its advanced features and looks aimed squarely at the gaming market. But its design will be divisive, to say the least.

The GravaStar Mercury M1 Pro's industrial spiderweb body with its faux scuffs and scratches (on the Battle Worn Edition) would’ve looked dated 20 years ago. The dull gray color of this variant only adds to the dourness. The center of the mouse features a large RGB light, which certainly makes it stand out even more. However, I struggle to imagine anyone who would find the Mercury M1 Pro to their taste.

You might think the pitted design would help keep the weight down, but the metal frame and center mass of the Mercury M1 Pro make themselves felt; at 88g, it’s certainly heavier than the best gaming mice around.

Its long shape and webbing also make for bad ergonomics. The metallic material doesn’t offer the best grip, and the mouse buttons are quite short and feel like a stretch to reach – those with smaller hands may struggle in particular with this aspect.

The concave side walls also fail to offer much support or grip when lifting off before swipes. Optional grip tape is included for the sides and the mouse buttons, but this didn’t improve matters for me, and the padding material feels cheap and too slippery to be of much use.

On top of this, the mouse buttons feel too weighty to allow for quickfire clicks despite their cheap-feeling plastic construction, and simply don’t feel comfortable under the fingertips. However, the scroll wheel and side buttons do feel solid and secure enough to use.

The thin skates seemed quite scratchy at first, even on padded surfaces, until I realized they were covered by imperceptibly thin peel-away sheets - so make sure to take these off before using the Mercury M1 Pro. Afterward, the scratchiness was gone and gliding felt a lot smoother. Replacement skates are also included, but these don’t offer any additional padding.

GravaStar Mercury M1 Pro on desk with USB cable and dongle

(Image credit: Future)

Via the GravaStar software, you can make various customizations and tweaks. All the standard options are present, such as the ability to rebind five of the six buttons on the Mercury M1 Pro (the main left button cannot be altered).

These include some useful system-level shortcuts and functions, including custom keystroke combinations (with or without modifier keys), media playback controls, and both vertical and horizontal scrolling. However, the selection is a little sparse compared to those offered by some other peripheral software.

Other rebinds include a fire mode, which lets you assign a button to rapid-fire left click up to three times (or infinitely until the button is released) in intervals of between 10 and 255 (I presume milliseconds, as the software doesn’t actually specify the unit). There’s also a DPI lock mode to set the sensitivity of the Mercury M1 Pro to a fixed amount rather than cycling through the six available increments (although these increments can be set yourself in the software too).

There’s also a macro recording feature, and – of course – multiple patterns and color options to choose from for the RGB lighting.

Catering to the elite, there are also tweaks for the lift-off distance (between 1 and 2mm) and debounce times (between 4ms and 8ms). You can also toggle motion sync, ripple control, and angle snapping on or off.

However, what won’t suffice for many pro-level gamers is the 4K polling rate. This is achieved using the other USB dongle included in the box of the Battle Worn Edition, which is much larger than the default bundled 1K dongle and looks like something that might birth a xenomorph (though why you'd want that on your desk, I have no idea). Many high-end gaming mice offer an 8K polling rate, which makes for a more noticeable improvement in precision and smoothness. Here, though, I struggled to discern a performance increase between 1K and 4K, which raises the question: if you’re going to include a separate dongle to allow for higher polling rates, then why stop short of 8K?
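For context on why the 1K-to-4K jump can be hard to feel: polling rate only sets how often the mouse reports its position, and the per-report interval is already tiny at 1K. A quick sketch of the intervals involved:

```python
# Interval between position reports for common mouse polling rates
for hz in (125, 1000, 4000, 8000):
    print(f"{hz:>5} Hz -> {1000 / hz:.3f} ms between reports")

# Going from 1K to 4K only shaves 0.75 ms off each report interval,
# which helps explain why the difference can be hard to perceive.
```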

When it comes to actually playing games, the Mercury M1 Pro fails to impress in this regard either. The aforementioned ergonomic issues make gliding and clicking awkward; there’s just not enough snappiness or ease of movement to make it viable for pro-level play in competitive online games. It’s precise enough for more casual sessions, but the feel in the hand doesn’t make the Mercury M1 Pro much fun to use.

As for battery life, GravaStar's website merely states that it’s “prolonged”, whatever that means. During my tests, in which I used a mixture of power and connectivity modes, it only dropped by 5% after a day's worth of use, which is admittedly solid performance.

The various connectivity methods worked well, although switching between the 1K and 4K dongles requires re-pairing every time, which adds some inconvenience. Still, the process is relatively quick and easy, and you likely won’t be switching much between these modes on the same machine anyway.

It’s hard to find much to commend in the Mercury M1 Pro, especially since it commands a high-end price. For the same money, you could have a Cherry XTRFY M68 PRO, which offers a higher 8K polling rate and much better design and performance, though it lacks any tweaking software. Various Razer offerings, such as the DeathAdder V3 Pro, roundly beat it in every regard.

Side view of GravaStar Mercury M1 Pro

(Image credit: Future)

GravaStar Mercury M1 Pro: Price & availability

Underneath of GravaStar Mercury M1 Pro with USB dongle

(Image credit: Future)
  • $129 / £101 / AU$198
  • Battle Worn Edition includes 4K dongle
  • Top end of the market

The Mercury M1 Pro costs $129 / £101 / AU$198 and is available now. It comes in two colorways: Gunmetal Gray and Silver Mist, the latter of which is the Battle Worn Edition. This features the aforementioned faux wearing and comes with the 4K dongle, hence its price hike over the Gunmetal Gray variant, which costs $99 / £78 / AU$152.

It’s cheaper than some of the best wireless gaming mouse options in our view, including the Razer DeathAdder V3 Pro. However, that mouse can achieve an 8K polling rate (albeit requiring an additional adapter to do so). It also integrates with Synapse, Razer’s peripheral software that offers many advanced customizations.

For about the same price as the Mercury M1 Pro, you could also get the Cherry XTRFY M68 Pro. Again, this mouse has an 8K polling rate, which is supported right out of the box. At 55g, it’s also a lot lighter than the Mercury M1 Pro. However, it doesn’t have any peripheral software, so all adjustments must be made via some rather cumbersome button combinations.

GravaStar Mercury M1 Pro: Specs

Should you buy the GravaStar Mercury M1 Pro?

Buy it if...

You want something brash
There’s no denying you’ll be making a statement with the Mercury M1 Pro, as its bright lights and doom-metal stylings stand out.

You want good software
The accompanying software is easy to use, runs well, and offers the advanced tweaks and customizations pro-level gamers crave.

Don't buy it if...

You want good looks
Of course, beauty is in the eye of the beholder, but the Mercury M1 Pro is hard to love, especially the dreary Battle Worn Edition.

You want good ergonomics
That design and long profile makes it hard to use the Mercury M1 Pro comfortably, especially in the heat of simulated battle.

You want the best performance
Offering a 4K polling rate but not 8K is somewhat baffling, and its hefty weight isn’t ideal for speed.

GravaStar Mercury M1 Pro: Also consider

Cherry XTRFY M68 Pro
For the same price as the Battle Worn Edition of the Mercury M1 Pro, you could have the M68 Pro instead. This offers an 8K polling rate as opposed to the 4K maximum of the Mercury M1 Pro, as well as better performance and a superior design.

Razer DeathAdder V3 Pro
It’s more expensive than the Mercury M1 Pro, but as one of the best gaming mice on the market, the DeathAdder V3 Pro has everything a pro gamer needs: a light weight, excellent performance, great customization options, and an 8K polling rate (achievable with the optional HyperPolling dongle). Read our full Razer DeathAdder V3 Pro review.

How I tested the GravaStar Mercury M1 Pro

  • Tested for several days
  • Used for gaming and productivity
  • 10+ years PC gaming experience

I tested the Mercury M1 Pro for several days, and used it for gaming, productivity, and general use.

I played titles such as Counter-Strike 2 and I Am Your Beast – fast-paced shooters that are the perfect testing grounds for gaming mice. I also used as many features present in the GravaStar software as possible, and used all connectivity modes, including the 4K dongle.

I have been PC gaming for over 10 years and during that time, I have used a number of gaming mice. I have also reviewed a wide variety of products in this segment, spanning various sizes, feature sets, and prices, from big-name brands and lesser-known manufacturers.

Read more about how we test.

First reviewed: November 2024

Cherry XTRFY M68 Pro review: a gaming mouse with odd proportions but stellar performance
12:29 pm | January 22, 2025

Author: admin | Category: Computers Computing Gadgets Mice Peripherals & Accessories | Tags: , | Comments: Off

Cherry XTRFY M68 Pro: two-minute review

At first glance, the Cherry XTRFY M68 Pro is rather restrained in its appearance as gaming mice go. Its straightforward design is rather understated, and the white model I've reviewed here is only offset by black side buttons and accents on the scroll wheel. The logo is inconspicuous and there’s no RGB lighting either. So far, so conventional.

However, the design of the Cherry XTRFY M68 Pro's front end is less so. The mouse buttons are raised steeply at the back, and curve down sharply before cutting short, which means the contact point is biased towards your fingertips, more so than I’ve experienced with many other gaming mice.

The M68 Pro weighs just 55g, and there aren’t too many full-size wireless gaming mice lighter than this. It undercuts the Razer DeathAdder V3 Pro, our pick as the best wireless gaming mouse, by eight grams. The Turtle Beach Burst 2 Air is one of the rare few that’s even leaner, tipping the scales at just 47g.

The two primary buttons on the M68 Pro feel solid yet easy to fully depress, and the same is true of the side buttons. The scroll wheel is notched well, making for smooth yet controlled spins, and the rubber layer offers plenty of grip, helping to make clicks feel secure. Overall, the M68 Pro seems to be a well-constructed gaming mouse.

The skates are quite thin, so you do feel hard desktop surfaces a little when maneuvering. Despite this, the M68 Pro doesn’t scratch or bottom out, as other gaming mice with insufficient padding do. This is impressive, especially considering there are only two skates on the top and bottom (although the included spare set features a piece for the sensor too). This is still a mouse I would recommend only using with a mouse pad, though, in order to get the best out of it.

The USB port is also located on the right rear side of the M68 Pro, a placement that will no doubt prove contentious, bucking the typical front-end trend (perhaps the snub-nose design leaves no internal room for a front placement).

However, it does have the advantage of eliminating drag, although the included braided cable for wired play is light enough to make this no concern regardless. The port is also deeply recessed, so there’s no fear of accidental disconnection. The one major proviso, though, is that this arrangement assumes you’re right-handed and have your desktop on the right; if it’s on your left, then the cable is liable to get in the way.

Pressing the bottom button, which toggles between various settings and modes, can be awkward, as it sits flush with the underside surface. This is especially the case when pressing in combination with the two side buttons to toggle between the sensor modes, and I struggled to activate it consistently.

Front view of Cherry XTRFY M68 Pro on table

(Image credit: Future)

In fact, altering most of the settings on the M68 Pro is very awkward, since many require various odd button combinations. Worst of all, the buttons still register normal input when you're trying to activate the combinations, which means you have to be very careful about what window you’re on and where the cursor is.

These combinations are a necessary compromise given the lack of buttons and software, but keeping them active when holding them down is a serious misstep. Another gripe I have is that selected parameters are only indicated via different colors displayed on the small side LED, so you’ll likely have to keep referring back to the manual until they’re ingrained in your memory.

Once you get your head around these button combinations, though, there’s a wide selection of adjustments present on the M68 Pro. Despite having no software, there are eight onboard CPI settings to choose from, offering enough scope to dial in levels precise enough for most gamers. There are also four debounce time increments, ranging from 2ms to 12ms, which again should be more than enough to meet individual preferences.

There are only two lift-off distances (1mm and 2mm) to choose from, although this isn’t out of the ordinary, and while there are six polling rates to choose from (or four when wired, topping out at 1K), I can’t see many people wanting more than three for low (125Hz), mid (1K), and high (8K) values. There’s also an option to toggle motion sync on and off.

Underneath of Cherry XTRFY M68 Pro on table

(Image credit: Future)

Once you start gaming, though, the M68 Pro starts to shine. That aforementioned fingertip emphasis encouraged by sloping mouse buttons makes for a more tactile experience. It meant I could get a better grip when lifting off before swipes, and somehow gave me a greater sense of connection to the on-screen action, especially when aiming. The different sensor modes are also effective, with Pro Gaming mode being remarkably snappy and precise, perfect for FPS titles.

Cherry claims the M68 Pro’s battery can last up to 90 hours on a single charge. Although I wasn’t able to get an exact measurement, I can say that after several days of testing, the LED was still green, indicating a charge level of between 75% and 100%.

To show the battery level, you have to hold down both side buttons and the left mouse button for three seconds, another inconvenient method if you’re still connected to your machine, since again the buttons will still operate. There are four colors representing the 25% increments, and since there’s no software, there’s no way to get a more accurate assessment than this.

All things considered, the M68 Pro is a high-caliber wireless gaming mouse with a few design quirks that may please some but deter others. It’s not what you’d call cheap, but it does beat perhaps its closest rival, the Razer DeathAdder V3 Pro, when it comes to pricing.

The DeathAdder can be tweaked with Razer’s in-depth Synapse software, though, which is more convenient. And if you can live without an 8K polling rate and an inbuilt battery, the Cooler Master MM311 is a viable alternative with exceptional value, given it’s still wireless.

Cherry XTRFY M68 Pro with accessories on table

(Image credit: Future)

Cherry XTRFY M68 Pro: price & availability

  • $129 / £139 / AU$214
  • Black, white, and Team Vitality editions
  • Mid-range value

The M68 Pro costs $129 / £139 / AU$214. There’s one white colorway and two black variants: one with gray accents and another with yellow that also sports the logo of esports organization Team Vitality in place of Cherry XTRFY’s.

It’s cheaper than our pick for the best wireless gaming mouse, the Razer DeathAdder V3 Pro, yet still competes with it spec-for-spec, as both have an 8K polling rate and the M68 Pro is even lighter.

However, there are more budget-friendly wireless options around, such as the Cooler Master MM311, which is the best gaming mouse for those on a budget – it’s massively cheaper than the M68 Pro and many other wireless gaming mice for that matter. However, its polling rate tops out at 1K, and it requires a single AA battery. Still, if you’re not after eSport-level performance, this may be a better alternative.

Cherry XTRFY M68 Pro: specs

Should I buy the Cherry XTRFY M68 Pro?

Buy it if...

You want elite performance
The 8K polling rate will delight many pro-level players, and the overall performance when gaming is terrific.

You want something you can really grip
The short, raking front end makes the M68 Pro a joy to hold in my opinion - although it may not be to every gamer’s taste.

Don't buy it if...

You want multiple connectivity modes
There’s only one way to connect wirelessly, and using the mouse in wired mode may not suit everyone’s setup, considering the USB-C port is on the right.

You’re on a budget
Although it’s slightly cheaper than some big-name rivals, including Razer, there are alternatives offering better value if you don’t need that ultra-high polling rate.

Cherry XTRFY M68 Pro: also consider

Cooler Master MM311
If you can live without that 8K polling rate – which more casual gamers can – then the MM311 is an excellent budget choice. In our review, we found it was still capable enough thanks to its great performance. It doesn’t have an inbuilt battery, requiring an AA battery to power, but that’s a minor inconvenience considering its price tag. Read our full Cooler Master MM311 review.

Razer DeathAdder V3 Pro
Another state-of-the-art gaming mouse with an 8K polling rate, the DeathAdder V3 Pro is only marginally more expensive than the M68 Pro, yet it has the advantage of integrating with Razer’s excellent Synapse software. It’s hard to find fault with this gaming mouse, which is why it currently claims the top spot as the best wireless gaming mouse overall in our view. Read our Razer DeathAdder V3 Pro review.

How I tested the Cherry XTRFY M68 Pro

  • Tested for several days
  • Played various games
  • 10+ years PC gaming experience

I tested the M68 Pro for several days. During this time, I used it for gaming, working, and casual tasks.

I played titles such as Counter-Strike 2, Metal Gear Solid 3: Snake Eater - Master Collection Version, and Fear the Spotlight, in order to cover a variety of genres.

I have been PC gaming for over 10 years, and have experienced many mice during that time. I have also reviewed numerous gaming mice, all with various connectivity options, sizes, polling rates, and features.

First reviewed: October 2024

Read more about how we test

Lenovo Legion Go S
4:00 am |

Author: admin | Category: Computers Computing Gadgets Gaming Computers | Tags: , , | Comments: Off

Lenovo Legion Go S: Two-minute review

In our 2023 review of the Lenovo Legion Go, we described it as a "PC handheld built for PC gamers." Its stunning 8.8-inch QHD+ display with a 144Hz refresh rate, combined with AMD’s Ryzen Z1 Extreme CPU and RDNA 3 graphics, delivered exceptional gaming performance. At CES 2025, Lenovo introduced the smaller-profile Lenovo Legion Go S, shaking up the design of its original Legion Go handheld while introducing the one feature that so many PC gaming handheld fans have been clamoring for: SteamOS.

The new Legion Go S is built around a stunning 8-inch WUXGA LCD display, boasting a 1920 x 1200 resolution, a 120Hz refresh rate, and VRR support. While the controller layout remains mostly unchanged, this version opts for non-detachable controllers, drops a few buttons below the left D-pad, and features a smaller touchpad.

Image 1 of 3

The top ports of the Lenovo Legion Go S

(Image credit: Future)
Image 2 of 3

The back of a Lenovo Legion Go S

(Image credit: Future)
Image 3 of 3

The bottom of a Lenovo Legion Go S showing the microsd card slot

(Image credit: Future)

Additional highlights include two USB4 ports, a headphone jack, and a microSD slot. The Legion Go S is also available in two color options, which vary based on the operating system buyers select. Both come with either the AMD Ryzen Z1 Extreme or the Ryzen Z2 Go, a chip co-developed with Lenovo as a Legion Go S exclusive.

The grip of the Lenovo Legion Go S

(Image credit: Future)

The Nebula Violet version ships with SteamOS, making the Legion Go S the world’s first officially licensed PC gaming handheld powered by Valve’s popular operating system, which debuted on the Steam Deck.

A Lenovo Legion Go S held in one hand

(Image credit: Future)

During my time with the handheld at CES, I tried a few games on it, including Portal 2 and Teenage Mutant Ninja Turtles: Splintered Fate. Beyond delivering some pretty fantastic performance in less graphically intensive games, SteamOS itself worked incredibly well throughout.

The SteamOS interface on the Lenovo Legion Go S

(Image credit: Future)

Navigating the menus and selecting games felt as snappy as on the Steam Deck, if not snappier, thanks to the newer hardware inside. The buttons and triggers felt as good as those on the bigger Lenovo Legion Go, too.

The Windows 11 OS interface on the Lenovo Legion Go S

(Image credit: Future)

Many have complained about Microsoft’s lack of care for the rising handheld gaming PC market. While the SteamOS version felt like an evolved Steam Deck, the Windows 11 model in Glacier White didn’t deliver that same feeling.

The first game I tried on that model was Forza Horizon 5, still one of the best open-world racing games available despite launching back in 2021. It ran quite well at mid-to-high settings.

There were additional games available on the handheld to try, including Spyro Reignited Trilogy and Indiana Jones and the Great Circle, though the latter wouldn’t run during our hands-on.

Though Windows 11 makes it easy to use Xbox Game Pass and other storefronts such as Steam, the Epic Games Store, and GOG, Microsoft’s OS continues to hold this configuration back, as it does with other handhelds.

Lenovo Legion Go S: Price & availability

A Lenovo Legion Go S on a desk

(Image credit: Future)

There will be multiple price points for the Lenovo Legion Go S, based on spec configuration and the OS you choose.

The base SteamOS model will launch in May, priced at $499, featuring the AMD Ryzen Z1 Extreme chip, 16GB of RAM, and a 512GB SSD. For $100 more, buyers can get a configuration with the AMD Ryzen Z2 Go processor and a 1TB SSD.

The Windows 11 version, featuring the Ryzen Z2 Go chip, 32GB of RAM, and a 1TB SSD, drops this month for $729, with a cheaper $599 configuration (Z2 Go, 16GB of RAM, and 1TB of storage) expected to launch in May.

Lenovo Legion Go S: Specs

Lenovo Legion Go S: Final thoughts

The back of the glacier white lenovo legion go s

(Image credit: Future)

The Lenovo Legion Go S makes some clever strides by offering two distinct versions that cater to different gaming preferences.

So far, the SteamOS version stands out as a seamless and polished handheld experience, integrating with Valve’s Steam platform to deliver smooth performance, responsive controls, and an intuitive interface. The Legion Go S truly feels like a refined evolution of the aging Steam Deck.

Meanwhile, the Windows 11 model provides flexibility for broader gaming options, but struggles with the same software limitations seen in other Windows-based handhelds, such as clunky menu navigation and inconsistent touchscreen responses.

Considering the Windows 11 version is coming out of the gate first, it’ll be interesting to see how far it can be pushed with some of the more graphically demanding games currently on the market once we get it in hand for a proper review. No matter which system you go with, though, the Lenovo Legion Go S could very well be the PC gaming handheld to beat in 2025.
