Nvidia GeForce RTX 5080 review: nearly RTX 4090 performance for a whole lot less
5:00 pm | January 29, 2025


Nvidia GeForce RTX 5080: Two-minute review

At first glance, the Nvidia GeForce RTX 5080 doesn't seem like that much of an upgrade from the Nvidia GeForce RTX 4080 it is replacing, but that's only part of the story with this graphics card.

Its performance, to be clear, is unquestionably solid, positioning it as the third-best graphics card on the market right now, by my testing, and its new PCIe 5.0 interface and GDDR7 VRAM further distance it from the RTX 4080 and RTX 4080 Super from the last generation. It also outpaces the best AMD graphics card, the AMD Radeon RX 7900 XTX, by a healthy margin, pretty much locking up the premium, enthusiast-grade GPU segment in Nvidia's corner for at least another generation.

Most impressively, it does all this for the same price as the Nvidia GeForce RTX 4080 Super and RX 7900 XTX: $999 / £939 / AU$2,019. This is also a rare instance where a graphics card's launch price actually recedes from the high watermark set by its predecessor, as the RTX 5080 climbs down from the inflated $1,199 / £1,189 / AU$2,219 the RTX 4080 launched at back in 2022.

Then, of course, there's the new design of the card, which features a slimmer dual-slot profile, making it easier to fit into your case (even if the card's length remains unchanged). The dual flow-through fan cooling solution does wonders for managing the extra heat generated by the 40W-higher TDP, and while the 12VHPWR cable connector is still present, the 3-to-1 8-pin adapter is at least somewhat less ridiculous than the RTX 5090's 4-to-1 dongle.

The new card design also repositions the power connector itself to make it less cumbersome to plug a cable into the card, though it pretty much rules out the 90-degree-angled cables that gained popularity with the high-end RTX 40 series cards.

Finally, the GPU is built on TSMC's 4nm N4 process node, making it one of the most cutting-edge chips on the market in terms of its architecture. While AMD and Intel will follow suit with their own 4nm GPUs soon (AMD RDNA 4 also uses TSMC's 4nm process node and is due to launch in March), right now, Nvidia is the only game in town for this latest hardware.

None of that would matter if the card didn't perform, however, but gamers and enthusiasts can rest assured that even without DLSS 4, you're getting a respectable upgrade. It might not have the wow factor of the beefier RTX 5090, but for gaming, creating, and even AI workloads, the Nvidia GeForce RTX 5080 is a spectacular balance of performance, price, and innovation that you won't find anywhere else at this level.

Nvidia GeForce RTX 5080: Price & availability

An RTX 5080 sitting on its retail packaging

(Image credit: Future)
  • How much is it? MSRP is $999 / £939 / AU$2,019
  • When can you get it? The RTX 5080 goes on sale January 30, 2025
  • Where is it available? The RTX 5080 will be available in the US, UK, and Australia at launch
Where to buy the RTX 5080

Looking to pick up the RTX 5080? Check out our Where to buy RTX 5080 live blog for updates to find stock in the US and UK.

The Nvidia GeForce RTX 5080 goes on sale on January 30, 2025, starting at $999 / £939 / AU$2,019 for the Founders Edition and select AIB partner cards, while overclocked (OC) and more feature-rich third-party cards will be priced higher.

This puts the Nvidia RTX 5080 about $200 / £200 / AU$200 cheaper than the launch price of the last-gen RTX 4080, while also matching the price of the RTX 4080 Super.

Both of those RTX 40 series GPUs should see some downward price pressure as a result of the RTX 5080's release, which might complicate the value proposition of the RTX 5080 compared to its last-gen counterparts.

The RTX 5080 is also launching at the same MSRP as the AMD Radeon RX 7900 XTX, which is AMD's top GPU right now. And with AMD confirming that it does not intend to launch an enthusiast-grade RDNA 4 GPU this generation, the RTX 5080's only real competition is from other Nvidia graphics cards like the RTX 4080 Super or RTX 5090.

This makes the RTX 5080 a great value proposition for those looking to buy a premium 4K graphics card, as its price-to-performance ratio is very strong.

  • Value: 4 / 5

Nvidia GeForce RTX 5080: Specs & features

A masculine hand holding an Nvidia GeForce RTX 5080 showing off the power connector

(Image credit: Future)
  • GDDR7 VRAM and PCIe 5.0
  • Still just 16GB VRAM
  • Slightly higher 360W TDP

While the Nvidia RTX 5080 doesn't push the spec envelope quite as far as the RTX 5090 does, its spec sheet is still impressive.

For starters, like the RTX 5090, the RTX 5080 uses the faster, next-gen PCIe 5.0 interface that allows for faster data processing and coordination with the CPU, which translates directly into higher performance.

You also have new GDDR7 VRAM in the RTX 5080, only the second card to have it after the RTX 5090, and it dramatically increases the memory bandwidth and speed of the RTX 5080 compared to the RTX 4080 and RTX 4080 Super. Those latter two cards both use slower GDDR6X memory, so even though all three cards have the same amount of memory (16GB) and memory bus-width (256-bit), the RTX 5080 has a >25% faster effective memory speed of 30Gbps, compared to the 23Gbps of the RTX 4080 Super and the 22.4Gbps on the base RTX 4080.
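Since all three cards share a 256-bit bus, the bandwidth advantage comes entirely from the per-pin memory speed. A quick sketch of the arithmetic, using the effective speeds quoted above (the helper name is just for illustration):

```python
def memory_bandwidth_gbps(bus_width_bits: int, effective_speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin speed (Gbps) / 8 bits per byte."""
    return bus_width_bits * effective_speed_gbps / 8

rtx_5080 = memory_bandwidth_gbps(256, 30.0)        # GDDR7
rtx_4080_super = memory_bandwidth_gbps(256, 23.0)  # GDDR6X
rtx_4080 = memory_bandwidth_gbps(256, 22.4)        # GDDR6X

print(rtx_5080)        # prints 960.0 (GB/s)
print(rtx_4080_super)  # prints 736.0 (GB/s)
print(f"{(rtx_5080 / rtx_4080 - 1) * 100:.0f}%")  # prints 34% (vs the base RTX 4080)
```

That works out to 960GB/s for the RTX 5080 against 736GB/s for the RTX 4080 Super, all on the same bus width.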

This is all on top of the Blackwell GPU inside the card, which is built on TSMC's 4nm process, compared to the Lovelace GPUs in the RTX 4080 and 4080 Super, which use TSMC's 5nm process. So even though the transistor count on the RTX 5080 is slightly lower than its predecessor's, the smaller transistors are faster and more efficient.

The RTX 5080 also has a higher SM count, 84, compared to the RTX 4080's 76 and the RTX 4080 Super's 80, meaning the RTX 5080 has the commensurate increase in shader cores, ray tracing cores, and Tensor cores. It also has a slightly faster boost clock (2,617MHz) than its predecessor and the 4080 Super variant.
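Because the core counts scale directly with the SM count (each SM in these designs carries 128 CUDA cores, one RT core, and four Tensor cores, the same per-SM layout described in the RTX 5090 review further down), the totals are easy to derive. An illustrative sketch:

```python
# Core totals scale linearly with SM count: each SM packs
# 128 CUDA cores, 1 RT core, and 4 Tensor cores.
def cores_from_sms(sm_count: int) -> dict:
    return {
        "cuda": sm_count * 128,
        "rt": sm_count * 1,
        "tensor": sm_count * 4,
    }

print(cores_from_sms(84))  # RTX 5080: {'cuda': 10752, 'rt': 84, 'tensor': 336}
print(cores_from_sms(80))  # RTX 4080 Super: {'cuda': 10240, 'rt': 80, 'tensor': 320}
print(cores_from_sms(76))  # RTX 4080: {'cuda': 9728, 'rt': 76, 'tensor': 304}
```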

Finally, there is a slight increase in the card's TDP, 360W compared to the RTX 4080 and RTX 4080 Super's 320W.

  • Specs & features: 4.5 / 5

Nvidia GeForce RTX 5080: Design

An Nvidia GeForce RTX 5080 leaning against its retail packaging with the RTX 5080 logo visible

(Image credit: Future)
  • Slimmer dual-slot form factor
  • Dual flow-through cooling system

The redesign of the Nvidia RTX 5080 is identical to that of the RTX 5090, featuring the same slimmed-down dual slot profile as Nvidia's flagship card.

If I were to guess, I'd say the redesign isn't as essential for the RTX 5080 as it is for the RTX 5090, which needed a way to bring better cooling to its much hotter 575W TDP, and the RTX 5080 (and eventually the RTX 5070) simply slotted into this new design by default.

That said, it's still a fantastic change, especially as it makes the RTX 5080 thinner and lighter than its predecessor.

The dual flow through cooling system on the Nvidia GeForce RTX 5080

(Image credit: Future)

The core of the redesign is the new dual flow-through cooling solution, which uses an innovative three-part PCB inside to open up a gap at the front of the card, allowing a second fan to blow cooler air over the heat sink fins drawing heat away from the GPU.

A view of the comparative slot width of the Nvidia GeForce RTX 5080 and RTX 4080

(Image credit: Future)

This means that you don't need as thick of a heat sink to pull away heat, which allows the card itself to get the same thermal performance from a thinner form factor, moving from the triple-slot RTX 4080 design down to a dual-slot RTX 5080. In practice, this also allows for a slight increase in the card's TDP, giving the card a bit of a performance boost as well, just from implementing a dual flow-through design.

Given that fact, I would not be surprised if other card makers follow suit, and we start getting much slimmer graphics cards as a result.

A masculine hand holding an Nvidia GeForce RTX 5080 showing off the power connector

(Image credit: Future)

The only other design choice of note is the 90-degree turn of the 16-pin power port, which should make it easier to plug the 12VHPWR connector into the card. The RTX 4080 didn't suffer nearly the same kinds of issues with its power connectors as the RTX 4090 did, so this design choice really flows down from engineers trying to fix potential problems with the much more power-hungry RTX 5090. But if you're going to implement it for your flagship card, you might as well put it on all of the Founders Edition cards.

Unfortunately, this redesign means that if you invested in a 90-degree-angled 12VHPWR cable, it won't work on the RTX 5080 Founders Edition, though third-party partner cards will have a lot of different designs, so you should be able to find one that fits your cable situation.

  • Design: 4.5 / 5

Nvidia GeForce RTX 5080: Performance

An Nvidia GeForce RTX 5080 slotted and running on a test bench

(Image credit: Future)
  • Excellent all-around performance
  • Moderately more powerful than the RTX 4080 and RTX 4080 Super, but nearly as fast as the RTX 4090 in gaming
  • You'll need DLSS 4 to get the best results
A note on my data

The charts shown below are the most recent test data I have for the cards tested for this review and may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

A note on the RTX 4080 Super

In my testing for this review, the RTX 4080 Super scored consistently lower than it has in the past, which I believe is an issue with my card specifically that isn't reflective of its actual performance. I'm including the data from the RTX 4080 Super for transparency's sake, but I wouldn't take these numbers as-is. I'll be retesting the RTX 4080 Super soon, and will update my data with new scores once I've troubleshot the issue.

Performance is king, though, and all the redesign and spec bumps won't amount to much if the RTX 5080 doesn't deliver better performance as a result. Fortunately, it does, though maybe not by as much as some enthusiasts would like.

Overall, the RTX 5080 manages to score about 13% better than the RTX 4080 and about 19% better than the AMD Radeon RX 7900 XTX. That result will disappoint some who were hoping for something closer to 20% or better, especially after seeing the 20-25% uplift on the RTX 5090.

If we were just to go off those numbers, some might call them disappointing, regardless of all the other improvements to the RTX 5080 in terms of design and specs. All this needs to be put in a broader context though, because my perspective changed once I compared the RTX 5080 to the RTX 4090.

Overall, the RTX 5080 is within 12% of the overall performance of the RTX 4090, and within 9% of the RTX 4090's gaming performance, which is a hell of a thing and simply can't be ignored, even by enthusiasts.
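For clarity on how the percentages throughout this section are computed: they're simple relative uplifts (or deficits). A minimal sketch with hypothetical frame rates, not my actual benchmark figures:

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage change of new_fps relative to old_fps."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical numbers for illustration only
print(round(uplift_pct(113.0, 100.0)))  # prints 13, i.e. "13% better"
print(round(uplift_pct(88.0, 100.0)))   # prints -12, i.e. "within 12% of" the faster card
```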

Starting with synthetic benchmarks, the card scores about 13% better than the RTX 4080 and RX 7900 XTX, with the RTX 5080 consistently beating out the RTX 4080 and substantially beating the RX 7900 XTX in ray-traced workloads (though the RX 7900 XTX does pull down a slightly better average 1080p rasterization score, to its credit).

Compared to the RTX 4090, the RTX 5080 comes in at about 15% slower on average, with its worst performance coming at lower resolutions. At 4K, though, the RTX 5080 comes in just 7% slower than the last-gen flagship.

In terms of compute performance, the RTX 5080 trounces the RX 7900 XTX, as expected, by about 38%, with a more modest 9% improvement over the RTX 4080. Against the RTX 4090, however, the RTX 5080 comes within just 5% of the RTX 4090's Geekbench compute scores. If you're looking for a cheap AI card, the RTX 5080 is definitely going to be your jam.

On the creative side, the PugetBench for Creators Adobe Photoshop benchmark still isn't working for the RTX 5080, so I can't tell you much about its creative raster performance yet (though I will update these charts once that issue is fixed), but going off the 3D modeling and video editing scores, the RTX 5080 is an impressive GPU, as expected.

The entire 3D modeling industry is effectively built on Nvidia's CUDA, so against the RTX 5080, the RX 7900 XTX doesn't stand a chance as the 5080 more than doubles the RX 7900 XTX's Blender Benchmark performance. Gen-on-gen though, the RTX 5080 comes in with about 8% better performance.

Against the RTX 4090, the RTX 5080 comes within 15% of its performance, and for good measure, if you're rocking an RTX 3090 and you're curious about the RTX 5080, it outperforms the RTX 3090 by about 75% in Blender Benchmark. If you're on an RTX 3090 and want to upgrade, you'll probably still be better off with an RTX 4090, but if you can't find one, the RTX 5080 is a great alternative.

In terms of video editing performance, the RTX 5080 doesn't do as well as its predecessor in PugetBench for Creators Adobe Premiere and effectively ties in my Handbrake 4K to 1080p encoding test. I expect that once the RTX 5080 launches, Puget Systems will be able to update its tools for the new RTX 50 series, so these scores will likely change, but for now, it is what it is, and you're not going to see much difference in your video editing workflows with this card over its predecessor.

An Nvidia GeForce RTX 5080 slotted into a motherboard

(Image credit: Future)

The RTX 5080 is Nvidia's premium "gaming" card, though, so its gaming performance is what's going to matter to the vast majority of buyers out there. For that, you won't be disappointed. Working just off DLSS 3 with no frame generation, the RTX 5080 will get you noticeably improved framerates gen-on-gen at 1440p and 4K, with substantially better minimum/1% framerates as well for smoother gameplay. Turn on DLSS 4 with Multi-Frame Generation and the RTX 5080 does even better, blowing well past the RTX 4090 in some titles.

DLSS 4 with Multi-Frame Generation is game developer-dependent, however, so even though this is the flagship gaming feature for this generation of Nvidia GPUs, not every game will feature it. For testing purposes, then, I stick to DLSS 3 without Frame Generation (and the AMD and Intel equivalents, where appropriate), since this allows for a more apples-to-apples comparison between cards.

At 1440p, the RTX 5080 gets about 13% better average fps and minimum/1% fps overall, with up to 18% better ray tracing performance. Turn on DLSS 3 to balanced and ray tracing to its highest settings and the RTX 5080 gets you about 9% better average fps than its predecessor, but a massive 58% higher minimum/1% fps, on average.

Compared to the RTX 4090, the RTX 5080's average 1440p fps comes within 7% of the RTX 4090's, and within 2% of its minimum/1% fps, on average. In native ray-tracing performance, the RTX 5080 slips to within 14% of the RTX 4090's average fps and within 11% of its minimum/1% performance. Turn on balanced upscaling, however, and everything changes, with the RTX 5080 coming within just 6% of the RTX 4090's ray-traced upscaled average fps and beating the RTX 4090's minimum/1% fps average by almost 40%.

Cranking things up to 4K, the RTX 5080's lead over the RTX 4080 grows a good bit. With no ray tracing or upscaling, the RTX 5080 gets about 20% faster average fps and minimum/1% fps than the RTX 4080, overall. Its native ray tracing performance is about the same, however, and its minimum/1% fps average actually falls behind the RTX 4080's, both with and without DLSS 3.

Against the RTX 4090, the RTX 5080 comes within 12% of its average fps and within 8% of its minimum/1% performance without ray tracing or upscaling. It falls behind considerably in native 4K ray tracing performance (which is to be expected, given the substantially higher RT core count for the RTX 4090), but when using DLSS 3, that ray tracing advantage is cut substantially and the RTX 5080 manages to come within 14% of the RTX 4090's average fps, and within 12% of its minimum/1% fps overall.

Taken together, the RTX 5080 makes some major strides toward RTX 4090 performance across the board, getting a little more than halfway across the performance gap between the RTX 4080 and RTX 4090.

The RTX 5080 beats its predecessor by just over 13% overall and comes within 12% of the RTX 4090's overall performance, all while costing less than both RTX 40 series cards' launch MSRPs, making it an incredible value for a premium card to boot.

  • Performance: 4 / 5

Should you buy the Nvidia GeForce RTX 5080?

A masculine hand holding up an Nvidia GeForce RTX 5080 against a green background

(Image credit: Future)

Buy the Nvidia GeForce RTX 5080 if...

You want fantastic performance for the price
You're getting close to RTX 4090 performance for under a grand (or just over two, if you're in Australia) at MSRP.

You want to game at 4K
This card's 4K gaming performance is fantastic, coming within 12-14% of the RTX 4090's in a lot of games.

You're not willing to make the jump to an RTX 5090
The RTX 5090 is an absolute beast of a GPU, but even at its MSRP, it's double the price of the RTX 5080, so you're right to wonder if it's worth making the jump to the next tier up.

Don't buy it if...

You want the absolute best performance possible
The RTX 5080 comes within striking distance of the RTX 4090 in terms of performance, but it doesn't actually get there, much less reaching the vaunted heights of the RTX 5090.

You're looking for something more affordable
At this price, it's an approachable premium graphics card, but it's still a premium GPU, and the RTX 5070 Ti and RTX 5070 are just around the corner.

You only plan on playing at 1440p
While this card is great for 1440p gaming, it's frankly overkill for that resolution. You'll be better off with the RTX 5070 Ti if all you want is 1440p.

Also consider

Nvidia GeForce RTX 4090
With the release of the RTX 5090, the RTX 4090 should see its price come down quite a bit, and if scalpers drive up the price of the RTX 5080, the RTX 4090 might be a better bet.

Read the full Nvidia GeForce RTX 4090 review

Nvidia GeForce RTX 5090
Yes, it's double the price of the RTX 5080, and that's going to be a hard leap for a lot of folks, but if you want the best performance out there, this is it.

Read the full Nvidia GeForce RTX 5090 review

How I tested the Nvidia GeForce RTX 5080

  • I spent about a week and a half with the RTX 5080
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week and a half testing the RTX 5080, using my updated suite of benchmarks like Black Myth: Wukong, 3DMark Steel Nomad, and more.

I also used this card as my primary work GPU where I relied on it for photo editing and design work, while also testing out a number of games on it like Cyberpunk 2077, Black Myth Wukong, and others.

I've been testing graphics cards for TechRadar for a couple of years now, with more than two dozen GPU reviews under my belt. I've extensively tested and retested all of the graphics cards discussed in this review, so I'm intimately familiar with their performance. This gives me the best possible position to judge the merits of the RTX 5080, and whether it's the best graphics card for your needs and budget.

  • Originally reviewed January 2025
Nvidia GeForce RTX 5090: the supercar of graphics cards
5:00 pm | January 23, 2025


Nvidia GeForce RTX 5090: Two-minute review

The Nvidia GeForce RTX 5090 is a difficult GPU to approach as a professional reviewer because it is the rare consumer product that is so powerful, and so good at what it does, you have to really examine if it is actually a useful product for people to buy.

Right out the gate, let me just lay it out for you: depending on the workload, this GPU can get you up to 50% better performance versus the GeForce RTX 4090, and that's not even factoring in multi-frame generation when it comes to gaming, though on average the performance is still a respectable improvement of roughly 21% overall.

Simply put, whatever it is you're looking to use it for, whether gaming, creative work, or AI research and development, this is the best graphics card for the job if all you care about is pure performance.

Things get a bit more complicated if you want to bring energy efficiency into the equation. But if we're being honest, if you're considering buying the Nvidia RTX 5090, you don't care about energy efficiency. This simply isn't that kind of card, and so as much as I want to make energy efficiency an issue in this review, I really can't. It's not intended to be efficient, and those who want this card do not care about how much energy this thing is pulling down—in fact, for many, the enormous TDP on this card is part of its appeal.

Likewise, I can't really argue too much with the card's price, which comes in at $1,999 / £1,939 / AU$4,039 for the Founders Edition, and which will likely be much higher for AIB partner cards (and that's before the inevitable scalping begins). I could rage, rage against the inflation of the price of premium GPUs all I want, but honestly, Nvidia wouldn't charge this much for this card if there wasn't a line out the door and around the block full of enthusiasts who are more than willing to pay that kind of money for this thing on day one.

Do they get their money's worth? For the most part, yes, especially if they're not a gamer but a creative professional or AI researcher. If you're in the latter camp, you're going to be very excited about this card.

If you're a gamer, you'll still get impressive gen-on-gen performance improvements over the celebrated RTX 4090, and the Nvidia RTX 5090 is really the first consumer graphics card I've tested that can get you consistent, high-framerate 8K gameplay even before factoring in Multi-Frame Generation. That marks the RTX 5090 as something of an inflection point of things to come, much like the Nvidia RTX 2080 did back in 2018 with its first-of-its-kind hardware ray tracing.

Is it worth it though?

That, ultimately, is up to the enthusiast buyer who is looking to invest in this card. At this point, you probably already know whether or not you want it, and many will likely be reading this review to validate those decisions that have already been made.

In that, rest easy. Even without the bells and whistles of DLSS 4, this card is a hearty upgrade to the RTX 4090, and considering that the actual price of the RTX 4090 has hovered around $2,000 for the better part of two years despite its $1,599 MSRP, if the RTX 5090 sticks close to its launch price, it's well worth the investment. If it gets scalped to hell and sells for much more above that, you'll need to consider your purchase much more carefully to make sure you're getting the most for your money. Make sure to check out our where to buy an RTX 5090 guide to help you find stock when it goes on sale.

Nvidia GeForce RTX 5090: Price & availability

  • How much is it? MSRP is $1,999 / £1,939 / AU$4,039
  • When can you get it? The RTX 5090 goes on sale January 30, 2025
  • Where is it available? The RTX 5090 will be available in the US, UK, and Australia at launch
Where to buy the RTX 5090

Looking to pick up the RTX 5090? Check out our Where to buy RTX 5090 live blog for updates to find stock in the US and UK

The Nvidia GeForce RTX 5090 goes on sale on January 30, 2025, starting at $1,999 / £1,939 / AU$4,039 for the Nvidia Founders Edition and select AIB partner cards. Overclocked (OC) and other similarly tweaked cards and designs will obviously run higher.

It's worth noting that the RTX 5090 is 25% more expensive than the $1,599 launch price of the RTX 4090, but in reality, we can expect the RTX 5090 to sell for much higher than its MSRP in the months ahead, so we're really looking at an asking price closer to the $2,499.99 MSRP of the Turing-era Nvidia Titan RTX (if you're lucky).

Of course, if you're in the market for the Nvidia RTX 5090, you're probably not squabbling too much about the price of the card. You're already expecting to pay the premium, especially the first adopter premium, that comes with this release.

That said, this is still a ridiculously expensive graphics card for anyone other than an AI startup with VC backing, so it's worth asking yourself before you confirm that purchase if this card is truly the right card for your system and setup.

  • Value: 3 / 5

Nvidia GeForce RTX 5090: Specs & features

The Nvidia GeForce RTX 5090's power connection port

(Image credit: Future / John Loeffler)
  • First GPU with GDDR7 VRAM and PCIe 5.0
  • Slightly slower clocks
  • Obscene 575W TDP

There are a lot of new architectural changes in the Nvidia RTX 50 series GPUs that are worth diving into, especially the move to a transformer AI model for its upscaling, but let's start with the new specs for the RTX 5090.

First and foremost, the flagship Blackwell GPU is the first consumer graphics card to feature next-gen GDDR7 video memory, and it is substantially faster than GDDR6 and GDDR6X (a roughly 33% increase in Gbps over the RTX 4090). Add in the much wider 512-bit memory interface and you have a total memory bandwidth of 1,790GB/s.
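Those two figures multiply out neatly: the quoted 1,790GB/s follows from the 512-bit bus and a per-pin speed of roughly 28Gbps (the per-pin figure is my inference from the quoted bandwidth and the ~33% claim, not something stated here):

```python
def bandwidth_gbs(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin speed (Gbps) / 8."""
    return bus_width_bits * speed_gbps / 8

rtx_5090 = bandwidth_gbs(512, 28.0)  # GDDR7, assumed ~28 Gbps per pin
rtx_4090 = bandwidth_gbs(384, 21.0)  # GDDR6X at 21 Gbps

print(rtx_5090)  # prints 1792.0, i.e. the ~1,790GB/s quoted above
print(rtx_4090)  # prints 1008.0
print(f"{(28.0 / 21.0 - 1) * 100:.0f}%")  # prints 33%, the per-pin speed increase
```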

This, more than even the increased VRAM pool of 32GB (versus 24GB for the RTX 4090), makes this GPU the first really capable 8K graphics card on the market. 8K textures have an enormous footprint in memory, so moving them through the rendering pipeline to generate playable framerates isn't really possible with anything less than what this card offers.

Yes, you can, maybe, get playable 8K gaming with some RTX 40 or AMD Radeon RX 7000 series cards if you use aggressive upscaling, but you won't really be getting 8K visuals that'll be worth the effort. In reality, the RTX 5090 is what you want if you want to play 8K, but good luck finding an 8K monitor at this point. Those are still years away from really going mainstream (though there are a growing number of 8K TVs).

If you're settling in at 4K though, you're in for a treat, since all that bandwidth means faster 4K texture processing, so you can get very fast native 4K gaming with this card without having to fall back on upscaling tech to get you to 60fps or higher.

The GeForce RTX logo on the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

The clock speeds on the RTX 5090 are slightly slower, which makes sense given the other major top-line specs: a gargantuan 575W TDP and a PCIe 5.0 x16 interface. According to Nvidia, that thermal challenge required a major reengineering of the PCB inside the card, which I'll get to in a bit.

The PCIe 5.0 x16 interface, meanwhile, is the first of its kind in a consumer GPU, though you can expect AMD and Intel to quickly follow suit. This matters because a number of newer motherboards have PCIe 5.0 lanes ready to go, but most people have been using those for PCIe 5.0 M.2 SSDs.

If your motherboard has 20 PCIe 5.0 lanes, the RTX 5090 will take up 16 of those, leaving just four for your SSD. If you have one PCIe 5.0 x4 SSD, you should be fine, but I've seen motherboard configurations that have two or three PCIe 5.0 x4 m.2 slots, so if you've got one of those and you've loaded them up with PCIe 5.0 SSDs, you're likely to see those SSDs drop down to the slower PCIe 4.0 speeds. I don't think it'll be that big of a deal, but it's worth considering if you've invested a lot into your SSD storage.

As for the other specs, they're more or less similar to what you'd find in the RTX 4090, just more of it. The new Blackwell GB202 GPU in the RTX 5090 is built on a TSMC 4nm process, compared to the RTX 4090's TSMC 5nm AD102 GPU. The SM design is the same, so 128 CUDA cores, one ray tracing core, and four tensor cores per SM. At 170 SMs, you've got 21,760 CUDA cores, 170 RT cores, and 680 Tensor cores for the RTX 5090, compared to the RTX 4090's 128 SMs (so 16,384 CUDA, 128 RT, and 512 Tensor cores).
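The per-SM multipliers above make those totals easy to verify:

```python
# Per-SM layout (shared by Ada and Blackwell): 128 CUDA cores,
# 1 RT core, and 4 Tensor cores per SM.
def totals(sms: int) -> tuple:
    return (sms * 128, sms * 1, sms * 4)

print(totals(170))  # RTX 5090: (21760, 170, 680)
print(totals(128))  # RTX 4090: (16384, 128, 512)
```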

  • Specs & features: 4.5 / 5

Nvidia GeForce RTX 5090: Design

The Nvidia GeForce RTX 5090 sitting on its packaging

(Image credit: Future / John Loeffler)
  • Slim, dual-slot form factor
  • Better cooling

There's a substantial design change to this generation of Nvidia's Founders Edition RTX flagship cards.

Holding the RTX 5090 Founders Edition in your hand, you'll immediately notice two things: first, you can comfortably hold it in one hand thanks to it being a dual-slot card rather than a triple-slot, and second, it's significantly lighter than the RTX 4090.

A big part of this is how Nvidia designed the PCB inside the card. Traditionally, graphics cards have been built with a single PCB that extends from the inner edge of the PC case, down through the PCIe slot, and far enough back to accommodate all of the modules needed for the card. On top of this PCB, you'll have a heatsink with piping from the GPU die itself through a couple of dozen aluminum fins to dissipate heat, with some kind of fan or blower system to push or pull cooler air through the heated fins to carry away the heat from the GPU.

The problem with this setup is that if you have a monolithic PCB, you can only really extend the heatsinks and fans off of the PCB to help cool it since a fan blowing air directly into a plastic wall doesn't do much to help move hot air out of the graphics card.

A split view of the Nvidia GeForce RTX 5090's dual fan passthrough design

(Image credit: Future / John Loeffler)

Nvidia's genuinely novel innovation here is ditching the monolithic PCB that's been a mainstay of graphics cards for 30 years. Instead, the RTX 5090 (and presumably subsequent RTX 50-series GPUs) splits the PCB into three parts: the video output interface at the 'front' of the card facing out from the case, the PCIe interface segment, and the main body of the PCB that houses the GPU itself as well as the VRAM modules and other necessary electronics.

This segmented design allows a gap in the front of the card below the fan, so rather than a fan blowing air into an obstruction, it can fully pass over the fins of the GPU's heatsink, substantially improving the thermals.

As a result, Nvidia was able to shrink the width of the card considerably, from 2.4 inches down to 1.9 inches, a roughly 20% reduction on paper. In hand, it feels substantially smaller than its predecessor, and it's definitely a card that won't completely overwhelm your PC case the way the RTX 4090 does.
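For the curious, the "roughly 20%" figure comes straight from the quoted widths:

```python
# Width reduction of the RTX 5090 Founders Edition vs the RTX 4090.
old_width_in, new_width_in = 2.4, 1.9
reduction = (old_width_in - new_width_in) / old_width_in
print(f"{reduction:.1%}")  # 20.8%
```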

The 4 8-pin to 16-pin 12VHPWR adapter included with the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

That said, the obscene power consumption required by this card means that the 8-pin adapter included in the RTX 5090 package is a comical 4-to-1 dongle that pretty much no PSU in anyone's PC case can really accommodate.

Most modular PSUs give you three PCIe 8-pin power connectors at most, so let's be honest about this setup: you're going to need a new ATX 3.0 PSU of at least 1000W to run this card (its officially recommended PSU is 950W, but just round up, you're going to need it), so make sure you factor that into your budget if you pick this card up.
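As a rough sanity check on that PSU advice, here's a hedged sizing sketch; the non-GPU draw and headroom figures are my own assumptions, not Nvidia's numbers:

```python
# Hypothetical PSU sizing sketch. rest_of_system_w and the 20% headroom
# are assumptions for a high-end build, not official figures.
gpu_tdp_w = 575                 # RTX 5090 TDP from the spec sheet
rest_of_system_w = 275          # assumed: CPU, RAM, SSDs, fans
headroom = 1.2                  # ~20% margin for transient power spikes

recommended = (gpu_tdp_w + rest_of_system_w) * headroom
print(f"Suggested PSU: {round(recommended)}W")  # Suggested PSU: 1020W
```

Under those assumptions you land right around the 1000W mark, which is why rounding up from the official 950W recommendation is the safer call.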

Otherwise, the look and feel of the card isn't that different from previous generations, except that the front plate where the RTX 5090 branding would have gone is now missing, replaced by a finned shroud to let air pass through. The RTX 5090 stamp is instead printed on the center panel, similar to how it was done on the Nvidia GeForce RTX 3070 Founders Edition.

As a final touch, the white backlit GeForce RTX logo and the X strips on the front of the card add a nice RGB-lite touch when powered that doesn't look too gaudy, though hardcore RGB fans might find it rather plain.

  • Design: 4.5 / 5

Nvidia GeForce RTX 5090: Performance

An Nvidia GeForce RTX 5090 slotted into a test bench

(Image credit: Future)
  • Most powerful GPU on the consumer market
  • Substantially faster than RTX 4090
  • Playable 8K gaming
A note on my data

The charts shown below are the most recent test data I have for the cards tested for this review and may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

So how does the Nvidia GeForce RTX 5090 stack up against its predecessor, as well as the best 4K graphics cards on the market more broadly?

Very damn well, it turns out, managing to improve performance over the RTX 4090 in some workloads by 50% or more, while leaving everything else pretty much in the dust.

Looked at from 30,000 feet, though, the overall gen-on-gen performance gains are respectable but aren't the kind of earth-shattering leap the RTX 4090 made over the Nvidia GeForce RTX 3090.

Starting with synthetic workloads, the RTX 5090 scores anywhere from 48.6% faster to about 6.7% slower than the RTX 4090 in various 3DMark tests, depending on the workload. The only poor result for the RTX 5090 was in 3DMark Night Raid, a test both cards so completely overwhelm that the difference could come down to CPU bottlenecking or other issues that aren't easily identifiable. On every other 3DMark test, though, the RTX 5090 scores at least 5.6% better, more often than not by 20-35%. In the most recently released test, Steel Nomad, the RTX 5090 is nearly 50% faster than the RTX 4090.

On the compute side of things, the RTX 5090 is up to 34.3% faster in the Geekbench 6 OpenCL compute test and 53.9% faster in Vulkan, making it an absolute monster for AI researchers to leverage.

On the creative side, the RTX 5090 is substantially faster in 3D rendering, scoring between 35% and 49.3% faster in my Blender Benchmark 4.30 tests. There's very little difference between the two cards when it comes to video editing though, as they essentially tie in PugetBench for Creators' Adobe Premiere test and in Handbrake 1.7 4K to 1080p encoding.

The latter two results might be down to CPU bottlenecking, as even the RTX 4090 pushes right up against the performance ceiling set by the CPU in a lot of cases.

When it comes to gaming, the RTX 5090 is substantially faster than the RTX 4090, especially at 4K. In non-upscaled 1440p gaming, you're looking at a roughly 18% better average frame rate and a 22.6% better minimum/1% framerate for the RTX 5090. With DLSS 3 upscaling (but no frame generation), you're looking at 23.3% better average and 23% better minimum/1% framerates overall with the RTX 5090 vs the RTX 4090.

With ray tracing turned on and no upscaling, you're getting 26.3% better average framerates and about 23% better minimum/1% framerates; with upscaling set to balanced (again, no frame generation), you're looking at about 14% better average fps and about 13% better minimum/1% fps for the RTX 5090 against the RTX 4090.

At 4K, however, the faster memory and wider memory bus really make a difference. With upscaling and ray tracing turned off, you're getting upwards of 200 fps at 4K on average for the RTX 5090, compared to the RTX 4090's 154 fps, a nearly 30% increase. The average minimum/1% fps for the RTX 5090 is about 28% faster than the RTX 4090's as well. With DLSS 3 set to balanced, you're looking at a roughly 22% better average framerate overall compared to the RTX 4090, with an 18% better minimum/1% framerate on average as well.
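For clarity on how these figures are derived, every gen-on-gen percentage in this section measures the improvement relative to the older card's result, as in this sketch using the 4K averages quoted above:

```python
# Gen-on-gen improvement is measured relative to the older card's result.
def pct_gain(new, old):
    return (new - old) / old * 100

rtx5090_avg_fps, rtx4090_avg_fps = 200, 154  # 4K, no upscaling or ray tracing
print(f"{pct_gain(rtx5090_avg_fps, rtx4090_avg_fps):.1f}% faster")  # 29.9% faster
```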

With ray tracing and no upscaling, the difference is even more pronounced with the RTX 5090 getting just over 34% faster average framerates compared to the RTX 4090 (with a more modest 7% faster average minimum/1% fps). Turn on balanced DLSS 3 with full ray tracing and you're looking at about 22% faster average fps overall for the RTX 5090, but an incredible 66.2% jump in average minimum/1% fps compared to the RTX 4090 at 4K.

Again, none of this even factors in single frame generation, which can already substantially increase framerates in some games (though with the introduction of some input latency). Once Multi-Frame Generation rolls out at launch, you can expect to see these framerates for the RTX 5090 run substantially higher. Pair that with Nvidia Reflex 2 to help mitigate the input latency issues frame generation can introduce, and the playable performance of the RTX 5090 will only get better with time, and it's starting from a substantial lead right out of the gate.

In the end, the overall baseline performance of the RTX 5090 comes in about 21% better than the RTX 4090, which is what you're really looking for when it comes to a gen-on-gen improvement.

That said, you have to ask whether the performance improvement you do get is worth the enormous increase in power consumption. That 575W TDP isn't a joke. I maxed out at 556W of power at 100% utilization, and I hit 100% fairly often in my testing and while gaming.

The dual flow-through fan design also does a great job of cooling the GPU, but at the expense of turning the card into a space heater. That 575W of heat needs to go somewhere, and that somewhere is inside your PC case. Make sure you have adequate airflow to vent all that hot air, otherwise everything in your case is going to slowly cook.

As far as performance-per-price goes, this card does slightly better than the RTX 4090 on value for the money, but that's never been a buying factor for this kind of card anyway. You want this card for its performance, plain and simple, and in that regard, it's the best there is.

  • Performance: 5 / 5

Should you buy the Nvidia GeForce RTX 5090?

A masculine hand holding an RTX 5090

(Image credit: Future)

Buy the Nvidia GeForce RTX 5090 if...

You want the best performance possible
From gaming to 3D modeling to AI compute, the RTX 5090 serves up best-in-class performance.

You want to game at 8K
Of all the graphics cards I've tested, the RTX 5090 is so far the only GPU that can realistically game at 8K without compromising on graphics settings.

You really want to flex
This card comes with a lot of bragging rights if you're into the PC gaming scene.

Don't buy it if...

You care about efficiency
At 575W, this card might as well come with a smokestack and a warning from your utility provider about the additional cost of running it.

You're in any way budget-conscious
This card starts off more expensive than most gaming PCs and will only become more so once scalpers get their hands on them. And that's not even factoring in AIB partner cards with extra features that add to the cost.

You have a small form-factor PC
There's been some talk about the new Nvidia GPUs being SFF-friendly, but even though this card is thinner than the RTX 4090, it's just as long, so it'll be hard to fit into a lot of smaller cases.

Also consider

Nvidia GeForce RTX 4090
I mean, honestly, this is the only other card you can compare the RTX 5090 to in terms of performance, so if you're looking for an alternative to the RTX 5090, the RTX 4090 is pretty much it.

Read the full Nvidia GeForce RTX 4090 review

How I tested the Nvidia GeForce RTX 5090

  • I spent about a week and a half with the RTX 5090
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week and a half testing the Nvidia GeForce RTX 5090, both running synthetic tests as well as using it in my day-to-day PC for both work and gaming.

I used my updated testing suite, which uses industry standard benchmark tools like 3DMark, Geekbench, Pugetbench for Creators, and various built-in gaming benchmarks. I used the same testbench setup listed to the right for the purposes of testing this card, as well as all of the other cards I tested for comparison purposes.

I've tested and retested dozens of graphics cards for the 20+ graphics card reviews I've written for TechRadar over the last few years, and so I know the ins and outs of these PC components. That's why you can trust my review process to help you make the right buying decision for your next GPU, whether it's the RTX 5090 or any of the other graphics cards I review.

  • Originally reviewed January 2025
Intel Arc B570 review: great value, but overshadowed by the far superior Arc B580
5:00 pm | January 16, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , | Comments: Off

Intel Arc B570: Two-minute review

The Intel Arc B570 is the kind of graphics card I desperately want to love, but my tech-addled PC gaming heart belongs to another.

I'm not talking about the recently-announced Nvidia RTX 50 series GPUs (though we'll see about those in due time). No, I've fallen for the Intel Arc B580, easily one of the best graphics cards on the market thanks to its fantastic 1440p and 1080p gaming performance. And, unfortunately, its price is so good that it's hard to really recommend the Arc B570 in good conscience.

To be fair, the Intel Arc B570's $219 / £219 (around AU$350) MSRP arguably makes it the best cheap graphics card going right now simply by default. The next-cheapest current-gen GPUs (as of January 2025) from AMD (the Radeon RX 7600) and Nvidia (the GeForce RTX 4060) are roughly 20% to 25% more expensive, and it's still $30 / £30 (about AU$90) cheaper than the Arc B580.

But the problem is that despite some impressive specs for a card this cheap, and solid 1080p performance, for just a little bit more you can get a far more future-proofed GPU that will let you game without compromise at a higher 1440p resolution if you go for the Arc B580. Of course, that's assuming you can get that card at its normal retail price and not the jacked-up prices being charged online by profiteering retailers and third-party sellers.

An Intel Arc B570 seen from the back

(Image credit: Future / John Loeffler)

But looking at the Arc B570 strictly on its merits, ignoring any external factors that are subject to change, it's undeniable that the Arc B570 is one of the best 1080p graphics cards you can buy, especially considering its price.

At this price point, you really have to compare the Arc B570 against cards that are several years old, like the Nvidia GeForce GTX 1060, to put things in perspective. For example, the Nvidia GeForce RTX 3050 had a launch price $30 higher than the Arc B570, and even though I no longer have that card on hand for the head-to-head matchup I'd like, its performance really wasn't good enough to justify its price. Say what you will about the Arc B570, but in no universe can you say you're not getting your money's worth with this GPU.

The heartbreak, then, is that this card is simply overshadowed by its slightly more expensive sibling. If the Intel Arc B570 were priced at $199, it would be walking away with a definitive budget win. Hell, it still is, but with so little separating the B570 and the B580, pretty much every potential buyer is better off borrowing that extra bit of cash from a friend, sibling, parent, or even a stranger, and picking up the more powerful B580.

Intel Arc B570: Price & availability

An Intel Arc B570 on top of its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? Starting at $219 / £219 (around AU$350)
  • When can you get it? You can get it from January 16, 2025
  • Where is it available? You can get it in the US, UK, and Australia

The Intel Arc B570 goes on sale in the US, UK, and Australia on January 16, 2025, for $219 / £219 (around AU$350).

This puts it just $30 / £30 (about AU$90) below the Intel Arc B580, released in December 2024. That said, it is a good deal cheaper than the competing AMD Radeon RX 7600 and Nvidia RTX 4060, both of which run at least 20% more expensive for roughly the same performance.

I'll dig into the performance-per-dollar of this card in a bit, but I can tell you now that it's one of the best you'll find on a modern GPU. It still comes in a distant second to the Intel Arc B580, though, making it a hard card to recommend unless you're seriously strapped for cash or the B580 is being scalped at too high a price.

  • Value: 4.5 / 5

Intel Arc B570: Specs

The top trim of an Intel Arc B570

(Image credit: Future / John Loeffler)
  • 10GB VRAM is a nice-to-have feature
  • Decently-sized memory bus
  • Specs: 4 / 5

Intel Arc B570: Performance

An Intel Arc B570 running on an open test bench

(Image credit: Future / John Loeffler)
  • Great 1080p performance
  • Doable 1440p (within reason)
  • Arc B580 is way better for not a whole lot more money

Ultimately, what matters is performance, and the top-line numbers for the Intel Arc B570 are impressive for a card at its price point. But this is almost exclusively a 1080p graphics card unless you make a lot of compromises at 1440p that frankly aren't going to be worth it in the end.

In terms of creative workloads or AI, this isn't the card for you. I'd simply go for the RTX 4060 if you're really strapped for cash but need something more than a basic cheap gaming GPU.

It also has to be noted that its 1080p gaming performance isn't going to match its more expensive competition on a lot of games, so if you're looking for a graphics card that consistently gets you 60fps at 1080p on max settings without question, you might be better off with some of this card's more expensive competitors.

That said, on average across the several games in my testing suite, including titles like Cyberpunk 2077, F1 2024, Total War: Warhammer III, and others, this card did manage an average 1080p fps of 60, with an average minimum fps of 34.

Of course, it handled some games better than others, and some titles won't run at max settings at a playable frame rate (like Black Myth: Wukong), but over the course of all the games I played, it's more than passable for 1080p, with the occasional playable 1440p experience.

For its price, it's genuinely excellent, especially for getting you a card capable of ray-traced gameplay, but for just a little bit more, you can get a lot better with the B580.

  • Performance: 3.5 / 5

Should you buy the Intel Arc B570?

An Intel Arc B570 being held by a masculine hand

(Image credit: Future / John Loeffler)

Buy the Intel Arc B570 if...

You are on a very tight budget
There aren't a lot of current-gen GPUs available at this price point, and even then, this is the cheapest so far.

You only care about basic 1080p gaming
If you are only looking for a cheap 1080p GPU with some modern extras like ray tracing, this card could be a compelling value at MSRP.

Don't buy it if...

You want to game at 1440p
Despite its extra VRAM and decent memory bus, it just doesn't have the specs for consistent 1440p gaming without some serious compromises.

You have some wiggle room in your budget
If you are even slightly flexible in your budget, the Arc B580 is a much, much better option for not a whole lot more money.

Also consider

Intel Arc B580
OK, so I'm going to be honest, the only other card you should be considering is the Arc B580. If you have any room in your budget, get this card instead. It's so much better for just a little more of an investment.

Read the full Intel Arc B580 review

How I tested the Intel Arc B570

  • I spent about a week with the Intel Arc B570
  • I used it primarily as a gaming GPU with some light creative work
  • I ran the Arc B570 through my revamped testing suite

I tested the Intel Arc B570 using my newly revamped testing suite, including the latest 3DMark tests like Steel Nomad and Solar Bay, as well as the newest gaming benchmarks like Black Myth Wukong and F1 2024.

I used the Arc B570 as my primary GPU on my work PC, using it for basic productivity, creative, and moderate gaming in the office.

I've been testing GPUs for TechRadar for more than two years now, and have extensively benchmarked all of the latest GPUs several times over, so I am well aware of where this card's performance sits amongst its competition as well as how good of a value it is at its price point.

  • Originally reviewed January 2025
Lenovo ThinkPad P16v Gen 2 mobile workstation review
10:12 pm | January 15, 2025

Author: admin | Category: Computers Gadgets Pro | Tags: , , | Comments: Off

Lenovo's ThinkPad lineup has always been a significant grouping of offerings for business professionals. The Lenovo ThinkPad P16v Gen 2 is no different. It targets professionals who need workstation-grade performance on the go.

The ThinkPad P16 is one of the best Lenovo ThinkPad laptops around - ideal for heavy computational and graphical work. Compared to the P16, I view the P16v Gen 2 as a ThinkPad P16 lite. But that's not any official branding; it's just my viewpoint. It's a slightly less powerful P16, but still very much enterprise-focused and workstation-esque.

Lenovo ThinkPad P16v Gen 2: Price and Availability

The Lenovo ThinkPad P16v Gen 2 starts at $1,791.92 (pre-tax) and quickly scales up to well over $3,500 before any pre-installed software options if you want to max out the hardware offerings.

Custom builds are available on Lenovo's website, and pre-built models are available at retailers like Amazon.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: Unboxing and First Impressions

The ThinkPad P16v Gen 2 comes in typical Lenovo packaging with a beefy yellow-tipped Lenovo charger (though you can also charge via USB-C, albeit more slowly) and the other essential documentation. I was immediately reminded of the P16, though the P16v is a bit slimmer and lighter (4.89 lbs vs. 6.5 lbs).

Another thing I noticed right away was the port offering and placement. I'll discuss this more later, but right off the bat I was surprised to see a full Ethernet port and ports on the back; then again, though thin, this is a workstation. Lastly, I genuinely like the matte black finish on this laptop. Though I love a splash of color, I will always choose black for a work machine: it's clean, goes with everything, and looks professional.

Lenovo ThinkPad P16v Gen 2: Design and Build Quality

Specs

CPU: Intel Core Ultra 7 165H to Ultra 9 185H options
GPU: NVIDIA RTX 2000 Ada Gen or RTX 3000 Ada Gen
Display: 16" WUXGA (1920 x 1200) IPS, 100% sRGB, up to 16" WQUXGA (3840 x 2400) IPS, 100% DCI-P3, 60Hz
Storage: 2x 2TB M.2 SSDs
RAM: 8GB DDR5, upgradable to 96GB

Unsurprisingly, the Lenovo ThinkPad P16v Gen 2 is very similar to the ThinkPad P16 in design, much like the name suggests. The P16v Gen 2 is slimmer and more portable than a ThinkPad P16, yet it still feels as robust as any of the best mobile workstations I've tried, with actual portability in mind. Thanks to the real estate afforded by the 16-inch screen, Lenovo was able to add a full numpad to the right of the keyboard, and better yet, it's comfortable to type on.

The port offering on this computer is excellent for the modern employee needing workstation-grade power. There is an SD card reader, an optional smart card reader, a full-size HDMI port, a USB-A port, two Thunderbolt 4 ports, and a full RJ45 Ethernet port. What's fascinating and pretty brilliant is that one of the Thunderbolt ports and the Ethernet port are on the back of the ThinkPad P16v Gen 2. This makes it super easy to plug into a Thunderbolt docking station and/or Ethernet, since both cables route away from your desk or workspace when plugged into the back of the laptop.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: In use

I've had this laptop in my rotation for the last couple of weeks, and it has been a pretty good computer. It can easily handle my productivity suite of tasks, content creation and video editing, and photo editing. It can handle the 3D modeling software for my 3D printer and all of it at once. I really appreciate the ethernet port and Thunderbolt 4 port on the back, as I could have the not-so-flexible ethernet port run away from my computer when I needed to hardline into the internet at one of my job sites. Whenever I am at my desk, I can easily plug into the docking station I have set up running to my monitors and peripherals.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Another thing worth mentioning is the reliability and usability of ThinkPad keyboards. While I rarely reach for the famous TrackPoint embedded in the keyboard, it's handy when I do want it. On top of that, the typing experience is quite comfortable, even for all-day typing like mine.

Lenovo has also used the space granted by the 16-inch screen to fit in a numpad. Some laptops, even with 16-inch screens, just center a standard-size keyboard in the allotted space; Lenovo instead fits in a full number pad. For those who work with spreadsheets, phone numbers, or numbers in general, a dedicated numpad makes data entry exponentially faster, adding to the ThinkPad P16v Gen 2's allure for business professionals.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: Final verdict

The ThinkPad P16v Gen 2 delivers an exceptional balance of power, portability, and professional features. While it doesn’t quite match the raw performance of the P16, its lighter build and price point make it an excellent choice for professionals on the move who need a reliable machine.


For more workplace computing, we've tested the best business laptops.

Lenovo ThinkPad P1 Gen 7 mobile workstation review
10:41 am | December 25, 2024

Author: admin | Category: Computers Gadgets Pro | Tags: , , | Comments: Off

The Lenovo ThinkPad P1 Gen 7 is Lenovo's take on an all-around perfect portable workstation machine. The Gen 7, of course, replaces the Gen 6 and now boasts up to an Intel Core Ultra 9 185H and an NVIDIA RTX 4070. However, it can also be built with integrated graphics and an Intel Core Ultra 5 with a light 16GB of RAM.

Much like Dell's Precision line-up, the ThinkPad P series is designed for professionals needing a computer that can handle computationally demanding tasks like 3D rendering, video editing, coding, data analysis, and things of that nature. Like many of the best Lenovo ThinkPad laptops I've reviewed, while casual users can use it, this price point focuses on professional users who rely on their machines to be workhorses and get work done.

Lenovo ThinkPad P1 Gen 7: Price and Availability

The Lenovo ThinkPad P1 Gen 7 starts at the base level for under $2,000 with an Intel Core Ultra 5, 16GB of RAM, and integrated graphics. This can be upgraded to a machine that costs over $5,000 when equipped with an Intel Core Ultra 9, NVIDIA RTX 4070 graphics, 64GB of RAM, and a 4TB SSD. While this is not an entry-level computer, the customization options available for the processor, memory, storage, and graphics mean it can be kitted out to fit just about any professional need. That said, check out our Lenovo coupon codes to see if you can save on the ThinkPad P1 Gen 7.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Unboxing and First Impressions

Out of the box, it's clear this is not designed to be a super-lightweight, ultra-portable, thinnest-device-ever kind of machine. It's beefy, but not in a way that resembles the laptops of a decade ago. As we've seen from many of the best mobile workstations, it's sleek where it can be but houses a lot under the hood -- or keyboard. Depending on the GPU configuration, the P1 Gen 7 ships with a 135W or 170W charger, the appropriate manuals, and any accessories purchased from Lenovo. The minimalist matte-black design exudes sleek professionalism, though it is prone to smudges.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Design and Build Quality

Specs

CPU: Up to an Intel Core Ultra 9 185H
GPU: Up to an NVIDIA RTX 4070
Display: Up to 4K OLED
RAM: Up to 64GB LPDDR5X
Storage: Up to 8TB SSD with built-in RAID options

Overall, the laptop is 17mm thick and 4.3 lbs. That's not huge in the world of laptops, though it is larger than some of the machines I work with. The P1 Gen 7 is made of a combination of magnesium and aluminum and carries a MIL-STD-810H durability rating, so it can withstand daily wear and tear and the burdens of being an everyday workhorse.

Completing the all-too-famous ThinkPad design, the TrackPoint is prominently in the center of the keyboard, and the overall design language matches what is frequently found with ThinkPad.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: In use

I have used this computer extensively in my workflow for the past few months. Overall, it is an impressive machine: remarkably powerful, easily handling multitasking and demanding programs, with a sleek and attractive design. What more could you ask for in a computer? It even has a better port offering than the ever-popular Dell powerhouses and MacBooks. I have only heard the fans kick on during heavily intensive tasks or many heavy tasks stacked together; for my day-to-day professional work, they stay silent.

More features that make this computer great include the Wi-Fi 7 antennae, a great port offering, a solid trackpad, a comfortable keyboard, and decent battery life.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

I've enjoyed using this computer for everything in my day-to-day. The keyboard is comfortable enough for long email sessions or writing articles (like this one). The trackpad is responsive enough that I don't need to bring a mouse in my backpack when I'm away from my desk for the day. The ports are fantastic: I can leave my dongles at home since this laptop has everything I could need built in. It's also surprisingly portable; powerful and practical, yes, but easy to carry from studio to office to coffee shop. It's simple, it doesn't get in the way, and it's great for my professional workflow.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Final verdict

The Lenovo ThinkPad P1 Gen 7 is an impressive example of what mobile workstations can be. Though premium priced, its versatility, build quality, and performance justify its cost for professionals seeking the best tools to do their work reliably.


For more workplace hardware, we've reviewed the best business laptops.

Oppo Reno 12 FS review: a colorful budget contender
12:00 pm | December 21, 2024

Author: admin | Category: Computers Gadgets Oppo Phones Phones | Tags: , , | Comments: Off

Oppo Reno 12 FS 5G: Two-minute review

The Oppo Reno 12 FS 5G is a budget phone done the Oppo way – better-than-average specs, a great sense of style, and cost-cutting decisions that mostly land on the side of confusion rather than frustration. This is a phone that, despite its shortcomings, punches well above its weight, most obviously with its great 6.67-inch display. Though far from perfect, this is a capable budget handset that handles 90% of what we use our phones for every day without complaint.

At a fixed price point of £299, the biggest selling point the Reno 12 FS has is its excellent value for money. There are few other models on the market that offer this much utility for so little money, and a combination of 12GB of RAM and 512GB of storage is almost unheard of in this price bracket.

The Reno 12 FS continues to impress with its software experience – that is, once you get clear of the awful bloatware the phone ships with. ColorOS 14 is otherwise a swift and snappy experience, with terrific customization, though slowdown can hold the phone back at seemingly random times. It’s clear that the included MediaTek Dimensity 6300 chipset can’t quite keep up with modern demands.

As for cameras, the Reno 12 FS struggles to keep pace with close competitors like the Samsung Galaxy A35, or even the iPhone SE. The main 50MP camera can be coaxed into producing photos that are acceptable, so long as conditions are ideal, but don't bother with the 8MP ultrawide and 2MP macro cameras. It would have been wiser for Oppo to spend those resources on a better single-camera system.

Overall, the Reno 12 FS 5G will work for a specific type of user, and should appeal far more to media consumers than media producers. If you're looking for a device to keep you connected, browse the internet, and watch videos, this is a solid choice – but those looking for photography power or truly impressive performance should look elsewhere.

Oppo Reno 12 FS 5G review: Price and availability

  • Costs £299, available in one configuration
  • Sole model comes with 512GB of storage
  • Not available in the US or Australia

The Oppo Reno 12 FS 5G marked Oppo’s re-entry to the UK market, and now forms a core part of the brand’s steadily growing phone lineup. It comes in a single model, with 12GB of RAM and a huge 512GB of storage for £299. It’s not available in Australia, where it’s missing from an otherwise fairly robust range of phones, including the base-model Reno 12 for AU$799 (about £400). Oppo doesn't sell its phones in the US, though sister company OnePlus sells very similar models.

Half a terabyte of storage and as much RAM as a Galaxy S24 for under £300 is no small feat, and while the Reno 12 FS doesn’t exactly sport a flagship chipset, it generally packs enough power for day to day use and even some gaming. That’s pretty phenomenal value for money already. Those who want a capable all-rounder for light use and the occasional session of Call of Duty Mobile won't be disappointed.

At the time of writing, the Reno 12 FS 5G sits towards the lower end of the Oppo smartphone lineup – I mention this because the Oppo phones for sale in the UK have changed continually over the last few months as the company establishes its presence once more. It walks the line between budget and mid-range tiers and aims for the best of both – with a great display and fresh design, but a lacking camera system and cheap-feeling construction. It would have benefitted from a simpler, more focused allocation of resources.

Value score: 4 / 5

Oppo Reno 12 FS 5G review: Specs

Oppo Reno 12 FS 5G review: Design

The rear panel of the Oppo Reno 12 FS 5G, showcasing the breathing light feature

The Breathing Light on the Oppo Reno 12 FS (Image credit: Future)
  • Simple but solid silhouette
  • Breathing Light LED is a fun addition
  • Cheap materials that mark easily

For such a simple phone, I do quite like the design of the Oppo Reno 12 FS 5G. This is a wide, thin slab that fits a lot of screen into a relatively efficient form factor, and slips easily into bags and pockets thanks to its slim profile and rounded edges. I especially like the nearly flat camera housing, which is the subtlest I've seen on a smartphone this year.

The cameras therein may not be amazing (more on that later), but having a phone that almost lays flat on a table feels like some kind of nostalgia trip, especially compared to the awkwardly rocking iPhone, Samsung, and OnePlus flagships we’ve gotten used to. The ports and buttons are as basic as they come but the Reno 12 FS isn’t trying to be much more than usable, and at this price point that’s all I’d expect.

The circular camera module holds another secret, however. Around this housing lies a ring of LEDs, which Oppo calls the Breathing Light. This refers to the light’s ability to react to different sources of sound and information. It’ll flicker in time with music, for example, and fill up as the phone charges.

The Breathing Light is a surprisingly fun addition that adds a lot to what is otherwise essentially just a thin ingot. It’s not made of the most premium materials, with a rear panel that creates a weird amount of friction in the hand and plastic rails that pick up nicks and dents easily. The creatively titled Black Green color is the only option, and luckily exactly to my taste, but if you’re into other colors you’re out of luck.

The camera module follows the Xiaomi 14T school of thought by giving the flash its own lens-sized ring. I’m not opposed to the symmetry this provides, but it feels slightly like an effort to make the Reno 12 FS 5G seem more premium than it actually is. The same could be said for the phone’s curved bezels, which actually hold up a flat screen. It would be more reassuring to see a simpler design and more investment in performance: nobody is expecting a work of art at this price point anyhow.

Design score: 3.5 / 5

Oppo Reno 12 FS 5G review: Display

The Oppo Reno 12 FS 5G displaying its magazine unlock against a river

(Image credit: Future)
  • 1080 x 2400 resolution
  • 120Hz refresh rate
  • Peak brightness of 2100 nits

The Oppo Reno 12 FS 5G comes equipped with an excellent 6.67-inch FHD+ OLED display with a 120Hz refresh rate. It is easily the phone's biggest selling point after its bargain price. For the money, this is a beautiful panel that's ideal for games, watching videos, or simply scrolling through posts and articles.

With a maximum local brightness of 2100 nits, the Reno 12 FS gets plenty bright, and can just about hold its own in direct sunlight. In fact, I’d recommend using it at higher brightness levels most of the time, as colors can lose contrast and saturation towards the darker end of the slider. Colors are noticeably deeper here than on other displays, which will be a knock or a boost depending on taste.

The display feels responsive to use, which pays off during gaming sessions. Oppo is very good at shaving unnecessary milliseconds off everyday tasks, and this display works in tandem with the smoothness of ColorOS to provide a genuinely nice experience when the hardware can keep up. It even comes fitted with a screen protector! There are panels with richer colors and sharper images, but for £299 this is one of the best you'll get.

Display score: 4 / 5

Oppo Reno 12 FS 5G review: Software

The Oppo Reno 12 FS app drawer, river in the background

(Image credit: Future)
  • Android 14 with ColorOS 14
  • Absolutely full of bloatware
  • Otherwise solid with great customization options

The software experience on the Oppo Reno 12 FS 5G makes the most of the phone’s limited hardware. ColorOS is fast becoming my favorite implementation of Android thanks to its swift navigation, easy-to-use settings, and exceptional customization options. However, as with other Oppo phones, what could be an entirely slick experience is marred by an unfortunate amount of bloatware.

Though the Reno 12 FS isn’t exactly a fast phone, ColorOS is generally responsive and loaded with useful options. There is some unpredictable slowdown in the UI, though, which is either down to hardware limitations or poor optimization. The phone comes loaded with Google Gemini, but not Circle to Search, and the pre-installed Oppo apps are fine, though most users will defer to Google’s options instead.

On the topic of apps, the amount of bloatware here really is an issue. Switching on the phone for the first time almost felt like I’d picked up someone else’s handset by mistake, with the pages of the homescreen taken up by apps and games I'd never heard of. The most offensive of these are the ones that are blatant advertisements – this robs the setup experience of its sheen and the user of a sense of proper ownership.

Some of that ownership can be reclaimed with the stellar customization options on the Reno 12 FS. ColorOS has some of the best wallpapers and theme settings of any phone OS I've used, Android or no, and they really bring the Reno 12 FS to life. There are uniquely generated lock screens, wallpapers that react to your taps, and plenty of font options.

As a side note, the Oppo Reno 12 FS 5G is also the only phone I've ever used that has a 300% volume option. Pushing the volume past the normal maximum adds a menacing red “300%” to the top of the bar. The next time someone tells you “it goes up to 11”, you can tell them your phone goes up to 300.

Software score: 3 / 5

Oppo Reno 12 FS 5G review: Cameras

The Oppo Reno 12 FS camera module over a river background

(Image credit: Future)
  • 50MP wide camera
  • 8MP ultrawide camera
  • 2MP macro camera

The cameras on the Oppo Reno 12 FS 5G are, frankly, not great. Even holding the phone steady in brightly-lit conditions will produce images that range from just serviceable to unimpressive. It’s honestly disappointing that a 50MP main camera could produce pictures that are so lacking in detail – a reminder that resolution isn’t everything.

Using the camera app is no chore as it comes replete with plenty of options and modes, but the viewfinder consistently displays a grainy and unattractive image. The phone can produce decent final images if you give it a lot of light, but even these show a huge disparity from the preview, which leads me to believe there’s some very active post-processing going on. This theory is somewhat confirmed by the blurriness you’ll see in tree branches and grasses.

I don’t want to come down too hard on the Reno 12 FS, because it is firmly a budget phone, but some of the best cheap phones offer more in this department (the Samsung Galaxy A35 comes to mind). The secondary cameras on the Reno 12 FS – an 8MP ultra-wide and 2MP macro camera – are especially rough, to the point that I question why they were even included.

Still, for capturing home photos and videos, scanning documents, and the occasional holiday snap, the Oppo Reno 12 FS will manage. The selfie camera is also fine, but again doesn’t seem to live up to its 32MP resolution, and video recording at 1080p 60fps is serviceable. The Reno 12 FS doesn’t offer an offensively barebones experience, but those who care about photography should definitely look elsewhere.

Camera score: 2 / 5

Oppo Reno 12 FS 5G Camera Samples

Image 1 of 5

A pub across a green

(Image credit: Future)
Image 2 of 5

A basketball hoop

(Image credit: Future)
Image 3 of 5

A wide shot of a river running through a park

(Image credit: Future)
Image 4 of 5

A warm lightbulb hanging over a bouquet mounted on sheet music on an indoor wall

(Image credit: Future)
Image 5 of 5

A sign warning that the reader is about to enter a cricket ground

(Image credit: Future)

Oppo Reno 12 FS 5G review: Performance

The Oppo Reno 12 FS playing Crossy Road

(Image credit: Future)
  • MediaTek Dimensity 6300
  • GPU: Mali G57 MC2
  • 12GB of RAM

The Oppo Reno 12 FS is something of an oddball performance-wise. I've managed to get smooth 30fps gameplay from it when booting up Call of Duty Mobile, even in extended sessions, and yet it'll stutter randomly when swiping into the Discover tab or opening YouTube. It's bothersome, but the slowdown isn't prevalent enough to ruin an otherwise usable device. Calls are clear and messages are delivered without issue.

The Reno 12 FS runs on the MediaTek Dimensity 6300 chipset and comes equipped with a very healthy 12GB of RAM. That's a reasonable handful of silicon at this price, and I'm especially impressed by the large amount of memory on offer. It shows in the phone's surprising capacity for multitasking; I've yet to have an app crash on me.

Coming from a premium handset, the Reno 12 FS is noticeably slower to open apps, scroll through web pages, and complete searches. I can quite easily get the phone to stutter while switching between apps, too. From a more neutral perspective, the phone is powerful enough for 90% of what people use their phones for, and again I have to consider the price point. The Reno 12 FS finds a reasonable balance.

Performance score: 3 / 5

Oppo Reno 12 FS 5G review: Battery

The underside charging port of the Oppo Reno 12 FS 5G

(Image credit: Future)
  • Excellent battery life with monstrous standby times
  • Confidently an all-day phone
  • 5,000 mAh capacity

The Oppo Reno 12 FS 5G has truly excellent battery life, bolstered by absolutely ridiculous standby times. I tested the Reno 12 FS intermittently over the course of multiple weeks, and was frequently surprised to pick up a still-charged phone after a week or two away. In daily use, the efficient MediaTek chipset sips at the battery, never dropping by an alarming amount when browsing the web or social media. A 5,000 mAh cell powering a 1080p display is bound to last a while, but the Reno 12 FS still manages to impress.

In normal use, the Reno 12 FS offers a reassuring amount of battery, but when put to one side, it simply refuses to run out of power. I appreciate that standby times aren't always at the front of buyers' minds, but I'd be remiss not to mention them in this case – the phone seems to use between 5% and 10% of its charge per day in standby.
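Those standby figures square with my week-or-two observation. A quick sanity check, taking the 5-10% daily drain at face value (these are my rough estimates, not an official Oppo spec):

```python
# Assumed standby drain from my testing: 5-10% of a full charge per day
for drain_pct_per_day in (5, 10):
    days = 100 / drain_pct_per_day
    print(f"{drain_pct_per_day}% per day -> ~{days:.0f} days of standby")
```

At the slower end of that range, the phone could in theory idle for close to three weeks on a full charge.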

When it does eventually come time to top up, the Reno 12 FS isn't so impressive. The phone charges via USB-C and doesn't support wireless charging. An official charging speed isn't listed, but when plugged into my 80W SUPERVOOC brick the phone says it's charging at 45W – a solid power draw by any measure.

Battery score: 4 / 5

Should you buy the Oppo Reno 12 FS 5G?

Buy it if...

You're on a budget

At £299, there are few phones that offer this much for so little. Not every feature is as refined as the display or operating system, but this is still a very capable phone for the price.

You want something stylish

From the classy Black Green colorway to the fresh new Breathing Light, the Reno 12 FS 5G is pure Oppo style. The software customization is top notch, too.

You want a large display

The 6.67-inch display fitted to the Reno 12 FS 5G is a big and bold green flag, and obvious evidence of the phone's value for money.

Don't buy it if...

You need strong performance

The Reno 12 FS 5G can handle the basics, but is prone to stuttering. It doesn't ruin the experience, but I wouldn't pick it for critical tasks.

You're a shutterbug

Photographers should look elsewhere – our list of the best cheap phones has plenty of options with much better camera systems than the misguided triple-camera setup on the Reno 12 FS 5G.

Oppo Reno 12 FS 5G review: Also consider

Samsung Galaxy A35

The Samsung Galaxy A35 brings similar value for money to the Reno 12 FS 5G, but with a more sophisticated camera system and the power of Samsung's platform. If you prefer to stick with well known brands, then this is a suitable swap.

Read our Samsung Galaxy A35 review

iPhone SE

If you can stretch your budget, the iPhone SE will be more consistent and powerful than the Reno 12 FS at every turn. The camera performance is notably better than the Reno's, and you get the benefit of access to the Apple ecosystem.

Read our iPhone SE review

How I tested the Oppo Reno 12 FS 5G

I used the Oppo Reno 12 FS 5G intermittently over the course of several weeks. Over this time, I used the phone for everyday tasks, as well as more specific tests designed to push the handset’s performance. As mentioned, the phone only comes in one model, and as such my test unit came with 512GB of storage, 12GB of RAM, and the dashing Black Green colorway.

In terms of my everyday usage, I made phone calls, sent messages, and scrolled through articles on Chrome. I watched videos on YouTube and listened to music via Spotify (including testing the quirky 300% volume feature in person). I was able to get a sense of how the Reno 12 FS serves to keep users connected to others and the latest news.

I undertook more specific tests to determine the performance limits of the Reno 12 FS. These included extended play sessions on Call of Duty Mobile, a popular demanding mobile game, and stepping out in various weather conditions to test the phone’s camera system. I also observed battery levels throughout my usage.

After gathering this real-world experience, I applied my in-depth knowledge of smartphone specs and the wider phone market, as well as my journalistic training, to assess the value and performance of the handset, and help you decide whether the Reno 12 FS is for you.

Intel Arc B580 review: A spectacular success for Intel and a gateway to 1440p for gamers on a budget
5:00 pm | December 12, 2024

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Intel Arc B580: Two-minute review

When I reviewed the Arc A770 and A750, I said that these Alchemist GPUs were impressive first efforts for Intel's Arc range, but not yet at the level that they needed to be to compete with the likes of Nvidia and AMD in discrete graphics.

Well, with the release of the new Intel Arc B580, the first of its 2nd-gen 'Battlemage' GPUs, there's no doubt that Intel has produced one of the best graphics cards of this generation, and given gamers on a budget an absolute gift just in time for the holidays.

For starters, let's talk about the price of this GPU. At just $249.99 / £249.99 / AU$439, the Arc B580 undercuts both Nvidia's and AMD's budget offerings, the RTX 4060 and RX 7600, while offering substantially better performance, making its value proposition untouchable at this price range.

While I'll dig deeper into the performance in a bit, I'll cut to the chase and point out the simple fact that neither the RTX 4060 nor the RX 7600 can game at 1440p without severely compromising graphics quality. Not only can the B580 perform this feat, it does so brilliantly.

This comes down to some very straightforward spec choices that Intel made with its Battlemage debut that, especially in hindsight, make Nvidia and AMD's respective decisions even more baffling. First, with a VRAM pool of 12GB, the B580 can hold the larger texture files needed for 1440p gaming, whereas the RTX 4060 Ti cannot, due to its 8GB VRAM loadout.

Then there's the B580's wider 192-bit memory interface, compared to the RTX 4060 Ti's and RX 7600 XT's 128-bit. While this might seem like an obscure spec, it's the secret sauce for the B580. This beefier interface allows it to process those larger texture files much faster than its competitors, so this GPU can fully leverage its bigger VRAM pool in a way that Nvidia and AMD's competing cards simply can't, even with larger VRAM configurations.
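The effect of that wider bus is easy to see in theoretical bandwidth terms: bandwidth is simply bus width times the per-pin data rate. The sketch below uses the B580's 192-bit bus against the RTX 4060's 128-bit one; the 19Gbps and 17Gbps memory data rates are the publicly listed GDDR6 speeds for each card, but treat them as my assumptions rather than figures from this review:

```python
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical memory bandwidth in GB/s: bus width (bits) x per-pin data rate (Gb/s), divided by 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# Assumed per-pin data rates: 19 Gbps GDDR6 (Arc B580), 17 Gbps GDDR6 (RTX 4060)
b580 = bandwidth_gbps(192, 19)      # 456.0 GB/s
rtx4060 = bandwidth_gbps(128, 17)   # 272.0 GB/s
print(f"Arc B580: {b580} GB/s, RTX 4060: {rtx4060} GB/s")
```

On those numbers the B580 has roughly two-thirds more raw memory bandwidth to feed its larger VRAM pool, which is the mechanism behind its 1440p advantage.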

Boiling all this down, you end up with a budget-class GPU that can get you fast 1440p framerates the likes of which we haven't seen since the RTX 3060 Ti.

Even more impressive, in my mind, is that I did not encounter a single game where there was some kind of quirk or hiccup caused by the driver. With the Arc Alchemist cards last year, there were issues with some games not running well because of inadequate driver support, or a game's reliance on an older version of DirectX that the Alchemist GPUs weren't optimized for. I didn't encounter any of those problems this time around. The Intel graphics team's long, hard work on getting Arc's drivers up to par has definitely paid off.

If there's a criticism I can make of this graphics card, it's that its creative performance isn't as good as Nvidia's. But given the entire creative world's reliance on Nvidia's bespoke CUDA instruction set, neither Intel nor AMD were ever really going to be able to compete here.

Fortunately, the Intel Arc B580 is a graphics card for gaming, and for any gamer looking to play at 1440p resolution on the cheap, the B580 is really the only graphics card that can do it, making it the only GPU you should be considering at this price point.

Intel Arc B580: Price & availability

An Intel Arc B580 resting upright on its retail packaging

(Image credit: Future / John Loeffler)

The Intel Arc B580 is available in the US, UK, and Australia, and has been from December 13, 2024, starting at $249.99, £249.99, and AU$439 respectively. Third-party graphics card partners like Acer, ASRock, and others will have their own variants of the B580, and their prices may be higher, depending on the card.

The closest competitors for the Arc B580 in terms of price are the Nvidia RTX 4060 and AMD RX 7600, both of which have a $20-$50 higher MSRP. And while Nvidia and AMD are preparing to roll out their next-gen graphics cards starting next month, it will still be a few months after the initial flagship launches before either company's budget offerings are announced. So, the B580 is the only current-gen GPU available for under $250 / £250 / AU$450 at the moment, and will likely remain so for many months to come.

  • Value: 5/5

Intel Arc B580: Specifications

The video output ports on the Intel Arc B580

(Image credit: Future / John Loeffler)

Intel Arc B580: Architecture & features

A masculine hand holding up the Intel Arc B580

(Image credit: Future / John Loeffler)

The Intel Arc B580 is the first discrete GPU from Intel based on its new Xe2 graphics architecture, codenamed Battlemage, and there are a lot of low-level changes over the previous-gen Intel Arc Alchemist. Many of these are small tweaks to the architectural design, such as the move from SIMD32 to SIMD16 instructions, but when taken together, all of these small changes add up to a major overhaul of the GPU.

That, in addition to using TSMC's 5nm process, means that even though the GPU itself has become physically smaller in just about every measure, it's much more powerful.

The B580 has a roughly 17% reduction in compute units from the Arc A580 and about 10% fewer transistors, but Intel says that its various architectural changes produce about 70% better performance per compute unit (or Xe core, as Intel calls it). I haven't tested or reviewed the Intel Arc A580, so I can't say for certain if that claim holds up, but there has definitely been a major performance gain gen-on-gen based on my experience with the higher-end Arc Alchemist cards. We also can't ignore the substantially faster boost clock of 2,850MHz, up from 1,700MHz for the A580.
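Taking Intel's round numbers at face value, the back-of-envelope math checks out: about 17% fewer Xe cores each doing about 70% more work still nets a substantial overall gain. The 0.83 and 1.70 factors below are just restatements of Intel's claims, not measured results:

```python
cores_factor = 1 - 0.17      # ~17% fewer Xe cores than the A580
per_core_factor = 1 + 0.70   # Intel's claimed ~70% better performance per Xe core

net_uplift = cores_factor * per_core_factor
print(f"Implied gen-on-gen uplift: {net_uplift:.2f}x")  # ~1.41x
```

In other words, even before accounting for the much higher boost clock, Intel's own figures imply a card roughly 40% faster than the A580.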

Outside of the GPU architecture, there is also a smaller memory bus, with the A580's 256-bit interface dropping down to 192-bit for the B580, but the B580 features a 50% increase in its video memory pool, as well as a faster memory clock.

  • Specs & features: 4.5 / 5

Intel Arc B580: Design

The brand marking on the Intel Arc B580

(Image credit: Future / John Loeffler)

The Intel Arc B580 Limited Edition reference card is what you'd call the 'base' version of this GPU, but don't call it basic.

Despite its all-black-with-white-accent-lettering appearance, this is a good-looking graphics card, much like the initial Arc Alchemist GPUs before it, thanks to its matte, textured black shroud, dual-fan cooling, and rather understated aesthetic.

In a PC component world full of ostentatious, overly aggressive and flashy designs, there is something almost respectable about a graphics card in 2024 that presents itself without gimmicks, almost daring you to underestimate its capabilities due to its lack of RGB.

That said, there is one noticeable difference with this graphics card's design: the open 'window' over the internal heatsink to help with airflow and cooling. Unfortunately, the HWiNFO64 utility I use to measure temperature and power draw for the GPUs I review couldn't read the Arc B580 during testing, so I can't tell you how much of a difference this window makes compared to something like the Intel Arc A750—but it certainly won't hurt its thermals.

Beyond that, the card also sports a single 8-pin power connector, in keeping with its 190W TBP, so you can pretty much guarantee that if you already have a discrete GPU in your system, you'll have the available power cables from your PSU required to use this GPU.

It's also not a very large graphics card, though at about 10.7 inches / 272mm it is longer than some RTX 4060 and RX 7600 GPUs; third-party variants might be more compact. In any case, it's a dual-slot card, so it'll fit in place as an upgrade for just about any graphics card you have in your PC currently.

  • Design: 4.5 / 5

Intel Arc B580: Performance

An Intel Arc B580 running on a test bench

(Image credit: Future / John Loeffler)

OK, so now we come to why I am absolutely in love with this graphics card: performance.

Unfortunately, I don't have an Intel Arc A580 card on hand to compare this GPU to, so I can't directly measure how the B580 stacks up to its predecessor. But I can compare the B580 to its current competition, as well as the Intel Arc A750, which prior to this release was selling at, or somewhat below, the price of this graphics card, and has comparable specs.

In terms of pure synthetic performance, the Arc B580 comes in second to the Nvidia RTX 4060 Ti, performing about 10% slower overall. That said, there were some tests, like 3DMark Fire Strike Ultra, Wild Life Extreme (and Wild Life Extreme Unlimited), and Time Spy Extreme where the extra VRAM allowed the Arc B580 to pull ahead of the much more expensive Nvidia RTX 4060 Ti. The Arc B580 did manage to outperform the RTX 4060 by about 12%, however.

Creative workloads aren't the Arc B580's strongest area, with Nvidia's RTX 4060 and RTX 4060 Ti performing substantially better. This might change once the PugetBench for Creators Photoshop benchmark gets updated, however, as it crashed during every single test I ran, regardless of which graphics card I was using.

Notably, the Intel Arc B580 transcoded 4K video to 1080p at a faster rate using Intel's H.264 encoder in HandBrake 1.6.1 than all of the other cards tested using Nvidia's or AMD's H.264 options, so this is something for game streamers to consider if they're looking for a card to process their video on the fly.

But what really matters with this GPU is gaming, and if you compare this graphics card's 1080p performance to the competition, you'll have to go with the Nvidia RTX 4060 Ti, which costs 60% more, in order to beat it, and even then it's not a crushing defeat for Intel. While I found the Arc B580 is about 17% slower than the RTX 4060 Ti on average at 1080p (with no ray tracing or upscaling), it still hits 82 FPS on average overall and actually has a slightly higher minimum/1% FPS performance of just under 60 FPS.

The AMD RX 7600 XT, Intel Arc A750, and Nvidia RTX 4060 don't even come close to reaching these kinds of numbers, with the Arc B580 scoring a roughly 30% faster average 1080p FPS and holding an incredible 52% advantage in minimum/1% FPS over the Nvidia RTX 4060, which comes in a very distant third place among the five GPUs being tested. All in all, it's an impressive performance from the Intel Battlemage graphics card.

Also worth noting is that the Intel Arc B580's ray-tracing performance is noticeably better than AMD's, and not that far behind Nvidia's, though its upscaling performance lags a bit behind AMD and Nvidia at 1080p.

Even more impressive, though, is this card's 1440p performance.

Typically, if you're going to buy any 1440p GPU, never mind the best 1440p graphics card, you should expect to pay at least $400-$500 (about £320-£400 / AU$600-AU$750). And to really qualify as a 1440p GPU, a card needs to hit an average of 60 FPS overall, with an average FPS floor of about 40 FPS. Anything less than that, and you're going to have an uneven experience game-to-game.
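That rule of thumb (roughly 60 FPS average with a 40 FPS 1% low floor) is easy to encode. The function below is just my restatement of the criteria above, with made-up example figures, not an official benchmark tool:

```python
def qualifies_for_1440p(avg_fps: float, low_1pct_fps: float) -> bool:
    """Rule of thumb: ~60 FPS average overall, with a ~40 FPS minimum/1% low floor."""
    return avg_fps >= 60 and low_1pct_fps >= 40

# Hypothetical benchmark results, for illustration only
print(qualifies_for_1440p(avg_fps=72, low_1pct_fps=48))  # True: comfortable 1440p card
print(qualifies_for_1440p(avg_fps=58, low_1pct_fps=44))  # False: average falls short
```

Both conditions have to hold: a high average with a collapsing 1% low still means visible stutter game-to-game.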

In this regard, the only two graphics cards I tested that qualify are the Nvidia RTX 4060 Ti and the Intel Arc B580, and they are very close to each other in terms of 1440p performance. (I can give an honorable mention to the Nvidia RTX 4060, which almost got there, but not quite).

While Nvidia has certain built-in advantages owing to its status as the premiere GPU brand (so pretty much any game is optimized for Nvidia hardware by default), at 1440p it only barely ekes out a win over the Intel Arc B580. And that's ultimately down to its stronger native ray-tracing performance—a scenario which pretty much no one opts for. If you're going to use ray tracing, you're going to use upscaling, and in that situation, the RTX 4060 Ti and Arc B580 are effectively tied at 1440p.

And this 1440p performance in particular is why I'm so enthusiastic about this graphics card. While this is the performance section of the review, I can't help but talk about the value that this card represents for gamers—especially the growing number of 1440p-aspiring gamers out there.

Prior to the Intel Arc B580, gaming at 1440p—which is the PC gaming sweet spot; believe me, I've extensively tested nearly every GPU of the past four years at 1440p—was something reserved for the petit bourgeois of PC gamers. These are the folks not rich enough to really go in for the best 4K graphics cards, but they've got enough money to buy a 1440p monitor and a graphics card powerful enough to drive it.

This used to mean something approaching a grand just for these two items alone, locking a lot of gamers into incremental 1080p advances for two successive generations. No more.

Now, with an entry-level 1440p monitor coming in under $300 / £300 / AU$450, it's entirely possible to upgrade your rig for 1440p gaming for about $500 / £500 / AU$750 with this specific graphics card (and only this graphics card), which is absolutely doable for a hell of a lot of gamers out there who are still languishing at 1080p.

Ultimately, this, more than anything, raises the Intel Arc B580 into S-Tier for me, even though Nvidia's $399.99 RTX 4060 Ti GPU gets slightly better performance. The Nvidia RTX 4060 Ti just doesn't offer this kind of value for the vast majority of gamers out there, and even with its improved performance since its launch, the 4060 Ti is still very hard to recommend.

The Nvidia RTX 4060, meanwhile, can't keep up with the B580 despite being 20% more expensive. And with the AMD RX 7600 XT, with its $329.99 MSRP (about £250 / AU$480 RRP), falling noticeably behind the B580, the cheaper RX 7600 (which I haven't had a chance to retest yet, and which still carries a slightly higher MSRP than the B580) doesn't stand a chance.
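The price gaps are easy to quantify against the B580's $249.99 MSRP. The list prices below come from this review where given; the RTX 4060's $299.99 figure is the widely published launch MSRP, included here as an assumption:

```python
B580_MSRP = 249.99

def premium_vs_b580(msrp: float) -> float:
    """Percentage price premium of a rival card over the Arc B580's MSRP."""
    return (msrp / B580_MSRP - 1) * 100

# Assumed launch MSRPs for the competing cards
for name, msrp in [("RTX 4060 Ti", 399.99), ("RX 7600 XT", 329.99), ("RTX 4060", 299.99)]:
    print(f"{name}: +{premium_vs_b580(msrp):.0f}% over the B580")
```

Seen this way, even the cheapest of the three rivals carries a 20% premium for weaker gaming performance.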

And, it has to be emphasized, I experienced none of the driver issues with the Intel Arc B580 that I did when I originally reviewed the Intel Arc A750 and Arc A770. Every game I tested ran perfectly well, even if something like Black Myth Wukong ran much better on the two Nvidia cards than it did on Intel's GPUs. Tweak some settings and you'll be good to go.

This was something that just wasn't the case with the previous-gen Arc graphics cards at launch, and it truly held Intel back at the time. In one of my Intel Arc Alchemist reviews, I compared that generation of graphics cards to fantastic journeyman efforts that were good, but maybe not ready to be put out on the show floor. No more. Intel has absolutely graduated to full GPU maker status, and has done so with a card more affordable than the cheapest graphics cards its competition has to offer.

Simply put, for a lot of cash-strapped gamers out there, the Intel Arc B580's performance at this price is nothing short of a miracle, and it makes me question how Intel of all companies was able to pull this off while AMD and Nvidia have not.

Even if you don't buy an Intel Arc B580, give Intel its due for introducing this kind of competition into the graphics card market. If Intel can keep this up for the B570, and hopefully the B770 and B750, then Nvidia and AMD will have no choice but to rein in their price inflation with the next-gen cards they plan to offer next year, making it a win-win for every gamer looking to upgrade.

  • Performance: 4.5 / 5

Intel Arc B580: Should you buy it?

A masculine hand holding an Intel Arc B580

(Image credit: Future / John Loeffler)

Buy the Intel Arc B580 if...

You want an extremely affordable 1440p graphics card
A 1440p graphics card can be quite expensive, but the Intel Arc B580 is incredibly affordable.

You're looking for great gaming performance
The Intel Arc B580 delivers incredible framerates for the price.

Don't buy it if...

You're looking for a budget creative GPU
While the B580 isn't terrible, if you're looking for a GPU for creative work, there are better cards out there.

You want a cheap GPU for AI workloads
The Intel Arc B580 might have dedicated AI hardware, but it still lags behind Nvidia by a good amount.

Also consider

Nvidia GeForce RTX 4060
The Nvidia RTX 4060 is a better option for a lot of creative tasks on a budget, though its gaming performance isn't as strong despite the higher price.

Read the full Nvidia GeForce RTX 4060 review

Nvidia GeForce RTX 4060 Ti
If you want a strong 1080p and 1440p gaming GPU, but also need some muscle for creative or machine learning/AI workloads, this card is what you'll want, so long as you're willing to pay the extra premium in the price.

Read the full Nvidia GeForce RTX 4060 Ti review

How I tested the Intel Arc B580

The backplate of the Intel Arc B580

(Image credit: Future / John Loeffler)
  • I tested the Intel Arc B580 for about three weeks
  • I used my updated suite of graphics card benchmark tests
  • I used the Arc B580 as my primary work GPU for creative workloads like Adobe Photoshop, as well as some in-depth game testing
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

Over the course of about three weeks, I used the Intel Arc B580 as my primary workstation GPU when I wasn't actively benchmarking it.

This included using the graphics card for various creative workloads like Adobe Photoshop and light video encoding work.

I also used the B580 for some in-depth game testing, including titles like Black Myth Wukong, Satisfactory, and other recently released games.

I've been doing graphics card reviews for TechRadar for more than two years now, and I did extensive GPU testing prior to that on a personal basis as a lifelong PC gamer. In addition, my computer science coursework for my Master's degree relied heavily on GPUs for machine learning and other computational workloads, and as a result, I know my way around every aspect of a GPU. As such, you can rest assured that my testing process is both thorough and sound.

  • Originally reviewed December 2024
Dell Precision 5690 mobile workstation review
10:33 pm | December 11, 2024

Author: admin | Category: Computers Gadgets Pro | Tags: , , | Comments: Off

The Precision 5690 is considered a flagship in Dell's mobile workstation line-up, and for good reason. Like all the best mobile workstations we've reviewed, this heavy-duty laptop caters to professionals who need exceptional portable computing power, even at a cost. The Precision 5690 boasts the latest Intel Meteor Lake processors, AI integration, NVIDIA's Ada-generation GPU, and fantastic build quality, making it a highly versatile tool for even the most demanding tasks. Granted, it takes some design inspiration from the ever-popular Dell XPS range, and some from its competitors at Apple, but the Precision 5690 stands tall as a serious contender for heavy productivity and workloads.

Dell Precision 5690: Price and Availability

The base model costs around $2,300 and includes an Intel Core Ultra 5, integrated graphics, and an FHD+ non-touch display. Our test model is spec'd out with an Intel Core Ultra 9 185H 2.5GHz processor with 64GB of RAM, Windows 11 Pro, an NVIDIA RTX 5000 Ada Generation GPU, 4TB of internal storage, and a 4K OLED display. This test model comes in around $6,500.

Dell Precision 5690

(Image credit: Collin Probst // Future)

Dell Precision 5690: Unboxing and First Impressions

Though this computer is quite robust in power, the packaging for the Dell Precision 5690 is humble. Its minimalistic approach reflects Dell's commitment to sustainability with recycled materials. Within the product box are the laptop, a 165W USB-C charger, a USB-C to USB-A dongle, and some protective papers ensuring the computer makes it to you without a scratch. From the moment I took the protective sleeve off, I loved the anodized aluminum chassis. It looks premium and feels sturdy enough for daily use without getting beaten up. It's right up there with the best business laptops I've tried.

Dell Precision 5690

(Image credit: Collin Probst // Future)

Dell Precision 5690: Design and Build Quality

Specs

CPU: Intel Core Ultra 9 185H, 16 cores.
GPU: NVIDIA RTX 5000 Ada with 16GB GDDR6.
Display: 16-inch UHD+ OLED, 3840 x 2400, 400 nits, 100% DCI-P3.
RAM: Up to 64GB LPDDR5x.
Storage: Configurable up to 8TB SSD.
Connectivity: Intel Wi-Fi 7, Bluetooth 5.4, two Thunderbolt 4 ports, HDMI 2.1, SD card reader.

The Dell Precision 5690 showcases a sleek yet professional design with a Titan Gray anodized aluminum chassis. It's hefty, yet at only 4.5 lbs it's pretty portable for a workstation. Just as importantly, the build is sturdy, with minimal flex overall. This is not a flimsy device, which is excellent for those who use their laptops daily; you can work without worrying about it snapping or breaking the first time it's used in the real world.

Coming from a guy with a personal MacBook Pro, I am particular about my touchpads; I don't put up with anything less than excellent. And the Precision 5690's touchpad is fantastic. It has a tremendously wide shape and fantastic haptic feedback and feel in general. Further, the keyboard is another thing that needs to be excellent for a laptop to be genuinely great, since part of the reason people choose a laptop over a desktop is to use it on the go. If the keyboard is excellent and built-in, that's one less thing that needs to be carried alongside the laptop, which would otherwise bulk up a portable setup. Much like the touchpad, the keyboard is also great on this computer. Granted, it's no Keychron custom mechanical keyboard, but it's a fantastic keyboard with backlit keys. If it had a numpad, it might've been perfect.

Dell Precision 5690

(Image credit: Collin Probst // Future)

Dell Precision 5690: In use

I'll say it one more time. I am used to working on a MacBook Pro day in and day out, so I frequently compare laptops I test to that as a litmus test. The Dell Precision 5690 has made me confident that I could run with just the Precision 5690 and fly through every single thing in my workflow with the greatest of ease, with only having to re-learn keyboard shortcuts. Jokes aside, this machine is incredible. Yes, there are even more powerful computers. Yes, there are more portable laptops, too. Many laptops are "more" of one thing or another, but the Dell Precision 5690 has been one of those unique computers that fits the sweet spot of being a device that does everything well.

It performs very well in every category it's intended to. I've taken this as my only laptop for several days now, and I've had zero hangups or hiccups getting work done across all the various types of tasks I do. I've handled admin work, emails, spreadsheets, web browsing, and project management, along with photo editing and some video editing. I've run programs, tried my hand at some coding, gamed, run remote desktops, and run benchmarks, which tell me this could be a serious contender even for engineering work and massive code bases, delivering exceptional performance across CAD and CGI work.

And through all that, this laptop is still portable enough that I don't mind throwing it in a backpack and carrying it to wherever work takes me that day. Even the battery is excellent for this kind of machine. I'll keep a power bank and charger in my bag because that's the kind of person I am, but I don't feel like I need to top off every couple of hours; I can buckle down for a long working session without the anxiety of hunting for a charger.

Dell Precision 5690

(Image credit: Collin Probst // Future)

The 4K OLED display is a joy to use. Some professional laptops opt for a 1080p display, which is fine, but after coming from my iPhone 16 Pro's beautiful screen I find myself wishing for something better. That isn't an issue here: the Precision 5690's screen makes watching content feel right and adds a little extra to editing photos and videos.

Dell Precision 5690

(Image credit: Collin Probst // Future)

Dell Precision 5690: Final verdict

The Dell Precision 5690 is a workstation marvel, combining portability with uncompromising performance. While its price and limited port options may deter some, its sheer power, stunning display, and premium build make it a no-brainer for professionals needing the best. Whether you’re an architect, data scientist, creative professional, or someone who needs reliable power in their primary machine, the Precision 5690 delivers impressive results that justify its premium price tag.


For performance-driven desktops, we reviewed the best workstations.

Oppo Find X8 Pro review: don’t call it an iPhone
5:37 pm | November 22, 2024

Author: admin | Category: Computers Gadgets Oppo Phones Phones | Tags: , | Comments: Off

Oppo Find X8 Pro: Two-minute review

The Oppo Find X8 Pro is built on truly excellent hardware. It sports a sleek premium design, a luxurious 6.78-inch display, and the best mobile camera system I’ve ever used. Its snappy performance and innovative UI animations also make it one of the smoothest-feeling phones on the market, and this combination of great hardware and slick software is reflected in the Find X8 Pro's high (but arguably competitive) retail price.

However, the Find X8 Pro has clearly taken one or two (or ten) design cues from the iPhone 16 Pro, and at several points during this review, I found myself asking how much originality counts for. In many ways, the Find X8 Pro blazes past its inspiration, with smoother software, more powerful cameras, and – to my eye – a more interesting design. But Oppo can only take so much credit for a phone so substantially built on another phone maker’s ideas.

Philosophizing aside, the Oppo Find X8 Pro is full to the brim with impressive tech. Its display is sharp, colorful, and immersive, and at 6.78 inches is about as large as I’d want a phone screen to get. The back of the phone is where the real magic happens, though – the quad-camera system on the Oppo Find X8 Pro is truly class-leading, with four 50MP snappers at various levels of optical magnification.

Internally, the phone is just as solid, with a MediaTek Dimensity 9400 chipset and 16GB of RAM. The Find X8 Pro handled everything I threw at it with aplomb. I felt like I was gliding through the ColorOS 15 Android wrapper in day-to-day tasks, and no game or app seemed to vex the system at all. This software experience is unfortunately marred by a large amount of bloatware.

Overall, whether the Find X8 Pro is for you comes down to how much you care about originality. People who want an iPhone will always get an iPhone, and because of that, I'm drawn to the idea that Oppo isn't so much chasing Apple customers as it is interpreting Apple features, which might even be a boost for those who prefer Android to iOS. However you feel about that debate, though, this is a great Android phone loaded with top-flight features; with a specs sheet like this, perhaps an identity crisis is forgivable.


Oppo Find X8 Pro review: Price and availability

  • Costs £1,049, available in one configuration
  • Not available in the US

The Oppo Find X8 Pro costs £1,049 in the UK. It comes in two colors – Pearl White or Space Black – and ships with a non-configurable 512GB of storage and 16GB of RAM. As with all Oppo phones, it's very unlikely that the Find X8 Pro will launch in the US, though the upcoming OnePlus 13 could offer similar (if not identical) specs.

At this price, the Oppo Find X8 Pro is directly challenging premium flagships like the iPhone 16 Pro and Samsung Galaxy S24 Plus, both of which start at £999. Matching these established brands on price is a bold move from Oppo – Chinese manufacturers have traditionally sought to undercut Western competitors on price to compensate for weaker reputation. The Find X8 Pro is full of premium hardware, however, so the value is definitely there.

Oppo Find X8 Pro review: Specs

Oppo Find X8 Pro review: Design

The Oppo Find X8 Pro side-on against some bushes

(Image credit: Future / Jamie Richards)
  • Comes in two colors – Pearl White and Space Black
  • Rounded frame with new Quick Button – a shutter button for the camera app
  • Rounded quad-camera housing

The Oppo Find X8 Pro is a strikingly beautiful device. The unit I tested came in Pearl White, which casts a unique pearlescent pattern on each individual handset (there's also a muted Space Black option). It’s subtle in all but the most direct light, which for me strikes the perfect balance between understated and fascinating. Both color options are rated IP68 and IP69 for water resistance, protecting against both immersion and jets.

The Find X8 Pro is otherwise simple-looking, but keeps things feeling premium with well-chosen materials and attention to detail. The phone is weighty, at 215 grams, but doesn’t feel overly heavy. The camera housing on the Pearl White model is made of polished metal, rather than the glass found on premium OnePlus models, and I have to say, I’m a fan. It gives an industrial contrast to the artsy rear cover and lends everything on the rear panel a pleasant muted sheen.

The front panel hosts a 6.78-inch screen, curved slightly on each edge. The selfie camera is a reasonably inconspicuous punch-hole design that serves as the midpoint of the software-only Dynamic Cloud – which is, as it sounds, very similar in form and function to Apple’s Dynamic Island.

Ergonomically, the Find X8 Pro strikes a nice balance between the ultra-thin curved phones of five or so years ago and the blocky flagships of today. It feels great to hold, but is a little slippery. The phone also seems plenty durable, with weighty buttons and aluminum rails, and comes with a screen protector pre-installed.

On the topic of buttons, the new Quick Button can be found on the lower right-hand side of the frame. The Quick Button is a camera button in all but name, and currently only supports functions and shortcuts directly related to the camera. It’s a nice addition to have and sits flatter than the iPhone’s Camera Control, feeling overall less obtrusive as a result.

Design score: 4 / 5

Oppo Find X8 Pro review: Display

The Oppo Find X8 Pro against some buildings, with the display on and lockscreen visible

(Image credit: Future / Jamie Richards)
  • 1264 x 2780 resolution (19.8:9 aspect ratio)
  • 120Hz refresh rate
  • Ludicrous peak brightness of 4500 nits

The display on the Oppo Find X8 Pro is a sharp 1264 x 2780 panel with a 120Hz refresh rate that works in tandem with Oppo’s new animation technology to offer a truly fluid experience. At 6.78 inches, this is as large as I’d want a phone screen to be, and this size lends itself to dual senses of openness and immersion.

The display on the Find X8 Pro isn’t the highest resolution on the market, but it’s certainly enough to make images and video look razor-sharp. There’s plenty of color, and though I’ve definitely seen panels with richer contrast, the Find X8 is well beyond serviceable. The large size and overall sharpness of this panel lends itself well to all types of games, from the landscape shoot-em-up Call of Duty Mobile to charming vertical RPGs like Mousebusters.

The Find X8 Pro’s screen can reach a respectable 800 nits of brightness in typical use, with an absolute maximum of 4500 nits. That is ludicrously bright and far past the realm of actual usefulness. I found the phone to be reasonably bright in normal use, though colors can appear slightly blown out at the higher end of the brightness slider. I never found myself struggling to read the display outside, though the auto-brightness can sometimes make the screen a little too dim indoors.

Display score: 4 / 5

Oppo Find X8 Pro review: Software

The Oppo Find X8 home screen

(Image credit: Future / Jamie Richards)
  • Android 15 with ColorOS 15
  • Unacceptable amount of bloatware
  • Google Gemini-powered AI

ColorOS 15 is one of the smoothest experiences I’ve had with a smartphone operating system, neck-and-neck with OxygenOS 15 – which adds up, considering they’re basically the same thing. AI is provided courtesy of Google Gemini, with support for Circle to Search, writing tools, document summarization, voice memo transcription, and photo editing tools.

Oppo has imbued ColorOS with some of the highest quality animations I've ever seen on a mobile OS. This translates into exceptionally smooth navigation, and in combination with Oppo’s other fantastic UI animations, depth of field effects, and other visual tricks, gives the operating system a playful sense of elasticity and responsiveness I’ve seen nowhere else in the smartphone market, bar maybe the iPhone.

That leads us to an unavoidable fact about ColorOS 15 – the liberal inspiration it's taken from iOS. Everything from the default wallpapers to the way the date and time sit on the lock screen to the layout of the settings app feels like an echo of the iPhone. The Dynamic Cloud, while useful, is barely distinct from the iPhone's Dynamic Island, and the Quick Settings tab is almost a one-for-one recreation of the iOS 18 control center. Oppo is clearly well-versed in making fantastic software that runs like it's being chased, but it’d be nice to see more of the company’s own personality come through.

Another unfortunate mark on an otherwise exceptionally fast software experience is the absolutely unacceptable amount of bloatware the phone ships with; a ridiculous inclusion on a device of this price that regrettably tarnished my first impressions of the phone. I also couldn't get Google Wallet to enable contactless payments – unrelated, but important.

Software score: 3 / 5

Oppo Find X8 Pro review: Cameras

The Oppo Find X8 Pro's camera housing

(Image credit: Future / Jamie Richards)
  • 50MP wide camera
  • 50MP ultra-wide camera
  • 50MP telephoto with 3x optical zoom
  • 50MP telephoto with 6x optical zoom

The camera system on the Oppo Find X8 Pro is absolutely superb. This is a robust, flexible, and staggeringly powerful camera setup that excels in most situations, particularly with its optical zoom and night photography. While there are a wealth of modes, features, and shooting options built into the Find X8’s camera app, the phone is truly brilliant at offering a fast and reliable point-and-shoot experience – I never had to consciously consider choosing night mode, or portrait mode, as the default photo tab worked so well. The new Quick Button – a shutter button in all but name – elevates this phone to something closer to a traditional digital camera, and the hardware is certainly there.

Each of the four cameras affixed to the Oppo Find X8 Pro has a 50MP sensor, ensuring consistent quality across its wide optical zoom range. You get an ultra-wide camera, main wide camera, 3x telephoto, and 6x telephoto. All of these cameras feel like powerful tools rather than tacked-on gimmicks, and despite my noted disdain for ultra-wide snappers I must say that this is the best one I’ve come across. Zooming in to the telephoto cameras feels like a natural extension of the main camera, and some excellent software trickery means the transition between lenses when zooming in and out is rarely noticeable.

The Find X8 Pro's optical zoom range of 6x is close to class-leading at this point, now that Samsung no longer fits its phones with 10x lenses. The Find X8 also offers a ludicrous digital zoom range of 120x, which is impressive up to about 40x and then serviceable up to 60x. Past that point, you’re relying on post-processing or an optional AI Telephoto Zoom mode to fill in the gaps and sharpen the blurry original image. The AI zoom isn't great at details, but can guess the outline of shapes and text with decent accuracy.

The camera system’s post-processing is very active overall – some people will prefer a less edited look, but I think it adds a nice amount of color depth, contrast, and sharpness, which directly opposes the brightened style favored by the iPhone and Galaxy flagships. As for video, the phone shoots in 4K at 60fps, with the ability to shoot slow-motion at up to 480fps in 720p.

And, of course, there’s a new way to control the camera system on the Find X8 Pro. The Quick Button appears in the same position and does some of the same things as the iPhone’s Camera Control – it’s seemingly a haptic-sensitive button that supports swiping touch gestures. The Quick Button only does a few things, though – a double press opens the camera app, wherein a single press takes a photo, a long press either takes a burst of photos or a video, and swiping back and forth in landscape mode zooms in and out.

Sure, this isn’t as deep a feature set as Apple’s version, but I still found the Quick Button to be massively effective in reducing the time from thought to photo. The only complaints I have are that the scroll-to-zoom can be a little ‘sticky’ sometimes or occasionally just not work, and that there’s no half-press-to-focus function (Oppo missed an open goal with that one).

Camera score: 4.5 / 5

Oppo Find X8 Pro Camera Samples

Image 1 of 5

The Eiffel Tower, at night, illuminated

(Image credit: Future / Jamie Richards)
Image 2 of 5

Epping Forest

(Image credit: Future / Jamie Richards)
Image 3 of 5

Offices along the Seine

(Image credit: Future / Jamie Richards)
Image 4 of 5

A Parisian street

(Image credit: Future / Jamie Richards)
Image 5 of 5

The Moon

(Image credit: Future / Jamie Richards)

Oppo Find X8 Pro review: Performance

The Oppo Find X8 Pro running Call of Duty Mobile

(Image credit: Future / Jamie Richards)
  • MediaTek Dimensity 9400
  • GPU: Immortalis G-925
  • 16GB of RAM

Day-to-day, the Find X8 Pro performs admirably, powered by the MediaTek Dimensity 9400 chipset. I encountered no slowdown at all in general usage, and found I could swiftly switch between apps and games with no fuss from the hardware.

The phone also performs well across its array of AI tools, with reasonably fast load times and no real lag or slowdown. The Quick Button lives up to its name in accessing the camera app, which opens near-enough instantly from anywhere in the OS.

The Find X8 Pro comes equipped with 16GB of RAM, a generous allotment that means the phone has plenty of headroom for multitasking and AI. The phone's combination of strong internal specs and a large display also makes it a capable gaming machine, and I had no issues booting up games like Atom RPG or Call of Duty Mobile for sessions on the go, with little noticeable warming.

To put it simply, the Oppo Find X8 Pro just feels efficient. I didn’t notice anything putting more strain on the battery, and the phone seems happy to sustain a variety of concurrent processes. The phone excels in shaving milliseconds off of the hundred-a-day tasks: switching apps, opening files, installing software, and so on. This all adds up and makes using the Oppo Find X8 a fluid and satisfying experience.

Oppo Find X8 Pro review: Battery

The USB-C port of the Oppo Find X8 Pro

(Image credit: Future / Jamie Richards)
  • 5,910mAh battery
  • 80W wired charging
  • 50W wireless charging

The Oppo Find X8 Pro sports an all-day battery life, with power to spare. The 5,910mAh silicon-carbon battery gives the Find X8 Pro exceptional longevity. It handles busy days of mixed use with no issue, and doesn’t seem to drain too drastically during gaming sessions or when playing back longer videos.

The real magic comes when it’s time to plug in the Find X8 Pro to recharge – the phone not only comes with a charger, which is itself a major win in today’s market, but with an 80W charger using Oppo’s own SuperVOOC technology. What that means in practice is blisteringly fast charging speeds and more flexibility for battery top-ups. I almost never saw the Oppo Find X8 Pro run out of battery, as even a cursory 5-minute charge could net me an extra 10% or so of battery life. The phone also supports 50W wireless charging, and reverse wireless charging.

When I tested the charging speed of the Oppo Find X8 Pro, I found that the phone reached 50% charge in about 20 minutes and 100% in around 45 minutes. I started the test from a low charge rather than a completely flat battery, as despite my best efforts I couldn't get the phone to run out entirely in a reasonable amount of time.

Standby times are also exceptional, and the phone will do everything in its power to keep you from running flat, with warnings at 20%, 10%, 5%, and 2% before launching into Super Power Saving mode at 1%, which limits your usage to just six apps.

Battery score: 5 / 5

Should you buy the Oppo Find X8 Pro?

Buy it if...

You want the best cameras

The Oppo Find X8 Pro has a simply fantastic camera system that rivals any of our present choices for the best camera phones. The new Quick Button adds even more control.

You want a truly premium design

The Find X8 Pro makes some bold choices with its design, but ultimately feels as luxurious as it does aesthetically fresh. It hits a home run with its ergonomics and is clearly built to last.

You want a beautiful display

The Find X8 Pro comes equipped with a beautiful and immersive 6.78-inch display that rarely looks anything less than great. It's large enough to be a serious contender for watching TV shows and movies on, too.

You want impressive battery life

The Oppo Find X8 Pro lasts a full day of mixed use with energy to spare, with a huge 5,910mAh cell that simply refuses to run all the way down. Charging is absolutely rapid, too.

Don't buy it if...

You're on a budget

The Find X8 Pro offers a lot of high-quality hardware, but you'll certainly pay for it. Chinese phone makers can no longer be relied on to undercut Western brands at the top end of their lineups, and Oppo has proved no different.

You value originality

The Oppo Find X8 Pro does some things better than the iPhone 16 Pro, but it's fairly obvious that the phone was designed with some serious Apple inspiration. If you're someone who likes to reward originality, you might want to look elsewhere.

Oppo Find X8 Pro review: Also consider

iPhone 16 Pro Max

The real thing, as it were. Those who want an iPhone probably won't be swayed by the Find X8 Pro, but nevertheless it's worth considering paying a little extra to scratch the Apple itch if it's one you find yourself stuck with.

Read our iPhone 16 Pro Max review

Samsung Galaxy S24 Plus

The Samsung Galaxy S24 Plus takes the premium design, exceptional cameras, and powerful AI tools of the base-model S24 and puts them into a larger frame, with a bigger display and even better battery life. If you want a large Android phone from a more recognizable brand, this is one to consider.

Read our Samsung Galaxy S24 Plus review

iPhone 16

If you're more intrigued by the Find X8 Pro's Quick Button than anything else, it could be worth taking a look at the iPhone 16. Sure, it's got a humbler specs sheet than Oppo's new flagship, but the Camera Control is far more powerful than the Find X8 Pro's shutter button. It helps that it's a fair bit cheaper, too.

Read our iPhone 16 review

How I tested the Oppo Find X8 Pro

My testing of the Oppo Find X8 Pro included several specifically chosen test scenarios as well as more general day-to-day usage over the course of about one week. The model tested came in the Pearl White color option and came with 512GB of storage.

I used the Oppo Find X8 Pro as my everyday smartphone for about a week to test it, using it to chat with friends and family, scroll through websites and social media, watch videos, listen to music, and play games. I went out to test all four of the phone’s cameras in a variety of conditions. I then considered the performance and value proposition of the Find X8 Pro using my knowledge of the smartphone market and journalistic training.

For more on our smartphone test process, be sure to check out our guide to how we test phones for review.

Next Page »