The AMD RX 9070 XT delivers exactly what the market needs with stunning performance at an unbeatable price
5:00 pm | March 5, 2025

Author: admin | Category: Computers, Computing, Computing Components, Gadgets | Comments: Off

AMD Radeon RX 9070 XT: Two-minute review

AMD had one job to do with the launch of its RDNA 4 graphics cards, spearheaded by the AMD Radeon RX 9070 XT, and that was to not get run over by Blackwell too badly this generation.

With the RX 9070 XT, not only did AMD manage to hold its own against the GeForce RTX monolith, it perfectly positions Team Red to take advantage of the growing discontent among gamers upset over Nvidia's latest GPUs with one of the best graphics cards I've ever tested.

The RX 9070 XT is without question the most powerful consumer graphics card AMD's put out, beating the AMD Radeon RX 7900 XTX overall and coming within inches of the Nvidia GeForce RTX 4080 in 4K and 1440p gaming performance.

It does so with an MSRP of just $599 (about £510 / AU$870), which is substantially lower than either of those cards' MSRPs, much less their asking prices online right now. This matters because AMD traditionally hasn't faced the kind of scalping and price inflation that Nvidia's GPUs experience (it does happen, obviously, but not nearly to the same extent as with Nvidia's RTX cards).

That means, ultimately, that gamers who look at the GPU market and find empty shelves, extremely distorted prices, and uninspiring performance for the price they're being asked to pay have an alternative that will likely stay within reach, even if price inflation keeps it above AMD's MSRP.

The RX 9070 XT's performance comes at a bit of a cost, though: I saw a maximum power draw of 309W during my testing. At this tier of performance, that's actually not bad at all.

This card also isn't great when it comes to non-raster creative performance and AI compute, but no one is looking to buy this card for its creative or AI chops; Nvidia already has those categories on lock. No, this is a card for gamers, and for that, you just won't find a better one at this price. Even if the price does get hit with inflation, it'll still likely be far lower than what you'd have to pay for an RX 7900 XTX or RTX 4080 (assuming you can find them at this point), making the AMD Radeon RX 9070 XT a gaming GPU that everyone can appreciate and maybe even buy.

AMD Radeon RX 9070 XT: Price & availability

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $599 (about £510 / AU$870)
  • When can you get it? The RX 9070 XT goes on sale March 6, 2025
  • Where is it available? The RX 9070 XT will be available in the US, UK, and Australia at launch

The AMD Radeon RX 9070 XT is available as of March 6, 2025, starting at $599 (about £510 / AU$870) for reference-spec third-party cards from manufacturers like Asus, Sapphire, Gigabyte, and others, with OC versions and those with added accoutrements like fancy cooling and RGB lighting likely selling for higher than MSRP.

At this price, the RX 9070 XT comes in about $150 cheaper than the RTX 5070 Ti, and about $50 more expensive than the RTX 5070 and the AMD Radeon RX 9070, which also launches alongside the RX 9070 XT. This price also puts the RX 9070 XT on par with the MSRP of the RTX 4070 Super, though this card is getting harder to find nowadays.

While I'll dig into performance in a bit, given the MSRP (and the reasonable hope that this card will be findable at MSRP in some capacity), the RX 9070 XT's value proposition is second only to the RTX 5070 Ti's, if you're going by MSRP. Since price inflation on the RTX 5070 Ti will persist for some time at least, in many cases you'll likely find the RX 9070 XT offers better performance per dollar than any enthusiast card on the market right now.

  • Value: 5 / 5

AMD Radeon RX 9070 XT: Specs

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • PCIe 5.0, but still just GDDR6
  • Hefty power draw

The AMD Radeon RX 9070 XT is the first RDNA 4 card to hit the market, so it's worth digging into its architecture for a bit.

The new architecture is built on TSMC's N4P node, the same as Nvidia Blackwell, and in a move away from AMD's MCM push with the last generation, the RDNA 4 GPU is a monolithic die.

As there's no direct predecessor for this card (or for the RX 9070, for that matter), there's no true apples-to-apples comparison to draw, but if it had a last-gen equivalent, the RX 9070 XT would sit roughly between the RX 7800 XT and the RX 7900 GRE.

The Navi 48 GPU in the RX 9070 XT sports 64 compute units, along with 64 ray accelerators, 128 AI accelerators, and 64MB of L3 cache. Its cores have a base clock of 1,600MHz but can boost as high as 2,970MHz, just shy of the 3GHz mark.

It uses the same GDDR6 memory as last-gen AMD cards, on a 256-bit bus with 644.6GB/s of memory bandwidth, which is definitely helpful in pushing out 4K frames quickly.
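
For reference, a quoted bandwidth figure like this follows from the standard formula for graphics memory: bus width (in bytes) multiplied by the effective data rate. A minimal sketch, assuming an effective rate of roughly 20.1Gbps (inferred from the quoted figure, not an official AMD spec):

```python
# Standard peak-bandwidth formula for GDDR memory (a sketch, not AMD's spec sheet):
# bandwidth (GB/s) = (bus width in bytes) x (effective data rate in Gbps)
def memory_bandwidth_gbs(bus_width_bits: int, effective_rate_gbps: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * effective_rate_gbps

# 256-bit bus; ~20.1Gbps effective rate is an assumption inferred from the article.
bw = memory_bandwidth_gbs(256, 20.1)
print(f"{bw:.1f} GB/s")  # 643.2 GB/s, in line with the quoted 644.6GB/s
```

The small gap between 643.2 and 644.6 suggests the actual effective rate is a touch above 20.1Gbps.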

The TGP of the RX 9070 XT is 304W, which is a good bit higher than the RX 7900 GRE, though for that extra power, you do get a commensurate bump up in performance.

  • Specs: 4 / 5

AMD Radeon RX 9070 XT: Design

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • No AMD reference card
  • High TGP means bigger coolers and more cables

There's no AMD reference card for the Radeon RX 9070 XT, but the unit I got to test was the Sapphire Pulse Radeon RX 9070 XT, which I imagine is pretty indicative of what we can expect from the designs of the various third-party cards.

The 304W TGP all but ensures that any version of this card you find will pair a triple-fan cooler with a pretty hefty heatsink, so it's not going to be a great option for small form factor cases.

Likewise, that TGP puts it just over the line where it needs a third 8-pin PCIe power connector, something you may or may not have available in your rig, so keep that in mind. Even if you do have three spare power connectors, cable management will almost certainly be a hassle.

After that, it's really just about aesthetics; the RX 9070 XT (so far) doesn't have anything like the dual pass-through cooling solution of the RTX 5090 and RTX 5080, so your choice comes down to personal taste.

As for the card I reviewed, the Sapphire Pulse shroud and cooling setup on the RX 9070 XT was pretty plain, as far as desktop GPUs go, but if you're looking for a non-flashy look for your PC, it's a great-looking card.

  • Design: 4 / 5

AMD Radeon RX 9070 XT: Performance

An AMD Radeon RX 9070 XT in a test bench

(Image credit: Future / John Loeffler)
  • Near-RTX 4080 levels of gaming performance, even with ray tracing
  • Non-raster creative and AI performance lags behind Nvidia, as expected
  • Likely the best value you're going to find anywhere near this price point
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

Simply put, the AMD Radeon RX 9070 XT is the gaming graphics card we've been clamoring for this entire generation. While it shows some strong performance in synthetics and raster-heavy creative tasks, gaming is where this card really shines, coming within 7% of the RTX 4080 overall and within 4% of its gaming performance specifically. For a card launching at half the RTX 4080's launch price, this is a fantastic showing.

The RX 9070 XT is squaring up against the RTX 5070 Ti, however, and here the RTX 5070 Ti does pull ahead, but by a smaller margin than I expected going in.

On the synthetics side, the RX 9070 XT excels at rasterization workloads like 3DMark Steel Nomad, while the RTX 5070 Ti wins out in ray-traced workloads like 3DMark Speed Way, as expected, but AMD's 3rd generation ray accelerators have definitely come a long way in catching up with Nvidia's more sophisticated hardware.

Also, as expected, when it comes to creative workloads, the RX 9070 XT performs very well in raster-based tasks like photo editing, and worse at 3D modeling in Blender, which is heavily reliant on Nvidia's CUDA instruction set, giving Nvidia an all but permanent advantage there.

In video editing, the RX 9070 XT likewise lags behind, though it's still close enough to Nvidia's RTX 5070 Ti that video editors won't notice much difference, even if the difference is there on paper.

Gaming performance is what we're on about though, and here the sub-$600 GPU holds its own against heavy hitters like the RTX 4080, RTX 5070 Ti, and Radeon RX 7900 XTX.

In 1440p gaming, the RX 9070 XT is about 8.4% faster than the RTX 4070 Ti and RX 7900 XTX, just under 4% slower than the RTX 4080, and about 7% slower than the RTX 5070 Ti.

This strong performance carries over into 4K gaming as well, thanks to the RX 9070 XT's 16GB VRAM. Here, it's about 15.5% faster than the RTX 4070 Ti and about 2.5% faster than the RX 7900 XTX. Against the RTX 4080, the RX 9070 XT is just 3.5% slower, while it comes within 8% of the RTX 5070 Ti's 4K gaming performance.

When all is said and done, the RX 9070 XT doesn't quite overpower one of the best Nvidia graphics cards of the last generation (and definitely doesn't topple the RTX 5070 Ti), but given its performance class, its power draw, its heat output (which wasn't nearly as bad as the power draw might indicate), and most of all its price, the RX 9070 XT is easily the best value of any graphics card playing at 4K.

And given Nvidia's position with gamers right now, AMD has a real chance to win over some converts with this graphics card, and anyone looking for an outstanding 4K GPU absolutely needs to consider it before making their next upgrade.

  • Performance: 5 / 5

Should you buy the AMD Radeon RX 9070 XT?

Buy the AMD Radeon RX 9070 XT if...

You want the best value proposition for a high-end graphics card
The performance of the RX 9070 XT punches way above its price point.

You don't want to pay inflated prices for an Nvidia GPU
Price inflation is wreaking havoc on the GPU market right now, but this card might fare better than Nvidia's RTX offerings.

Don't buy it if...

You're on a tight budget
If you don't have a lot of money to spend, this card is likely more than you need.

You need strong creative or AI performance
While AMD is getting better at creative and AI workloads, it still lags far behind Nvidia's competing offerings.

How I tested the AMD Radeon RX 9070 XT

  • I spent about a week with the AMD Radeon RX 9070 XT
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week with the AMD Radeon RX 9070 XT, benchmarking it, using it, and digging into the card's hardware to come to my assessment.

I used industry-standard benchmark tools like 3DMark, Cyberpunk 2077, and PugetBench for Creators to get results comparable with other competing graphics cards, all of which have been tested using the same test bench setup listed above.

I've reviewed more than 30 graphics cards in the last three years, and so I've got the experience and insight to help you find the best graphics card for your needs and budget.

  • Originally reviewed March 2025
Huawei Mate 70 Pro gets a Premium Edition with CPU downgrade
1:01 pm | February 28, 2025

Author: admin | Category: Mobile phones news | Comments: Off

Huawei introduced the Mate 70 Pro back in November with a Kirin 9020 chipset, and today the company introduced a somewhat confusing alternative version. Dubbed the Premium Edition, it will go on sale on March 5. The name suggests an improved version, but in reality it has an underclocked CPU and a lower price tag on the company's Vmall store. The listing did not actually mention the chip downgrade – we caught that in a separate Geekbench listing. The Huawei Mate 70 Pro Premium Edition carries the model number PLR-AL50 and scored 1,450 in single-core and 3,793 in multi-core. These are...

I tested the iPhone 16e for a week and found it’s a good phone that stretches the definition of ‘budget’
5:00 am | February 27, 2025

Author: admin | Category: Computers, Gadgets, iPhone, Phones | Comments: Off

Apple iPhone 16e: Two-Minute Review

The iPhone 16e is a good phone. It has a pleasing design, and it feels like a true member of the iPhone 16 family. It is not a great phone, though – how could it be with a retro notch in the Super Retina XDR display and just a single 48MP camera?

There are 'budget' phones that cost far less and which have larger screens and multiple rear cameras. They're not iOS handsets, and that counts for something – any new iPhone joins an expansive and well-designed ecosystem offering connective tissue between excellent Apple services and other Apple hardware. I mostly live in that world now, and I appreciate how well my iPhone 16 Pro Max works with, for instance, my Mac, and how all my cloud-connected services know it's me on the line.

It's been a while since I've had such conflicting feelings about an iPhone. I appreciate that Apple thought it was time to move away from the iPhone SE design language, one that owed most of its look and feel to 2017's iPhone 8. I'm sure Apple couldn't wait to do away with the Lightning port and the Home button with Touch ID (which lives on in Macs and some iPads). But instead of giving us something fresh, Apple took a bit of this and a bit of that to cobble together the iPhone 16e.

The display is almost the best Apple has to offer if you can ignore the notch, aren't bothered by larger bezels, and don't miss the Dynamic Island too much. The main 48MP Fusion camera is very good and shoots high-quality stills and videos, but don't be fooled by the claims of 2x zoom, which is actually a 12MP crop on the middle of the 48MP sensor. I worry that people paying $599 / £599 / AU$999 for this phone will be a little frustrated that they're not at least getting a dedicated ultra-wide camera at that price.

Conversely, there is one bit of this iPhone 16e that's not only new but is, for the moment, unique among iPhone 16 devices: the C1 chip. I don't know why Apple's cheapest iPhone got this brand-new bit of Apple silicon, but it does a good job of delivering 5G and even satellite connectivity. Plus, it starts moving Apple out from under the yoke of Qualcomm, Apple's cellular modem chip frenemy. That relationship has been fraught for years, and I wonder if Apple had originally hoped to put the C1 in all iPhone 16 models but the development schedule slipped.

The iPhone 16e (center) with the iPhone 16 (right) and iPhone SE 3 (left). (Image credit: Future / Lance Ulanoff)

In any case, while it's hard to measure the connectivity benefits (it's another good 5G modem), Apple says this is the most efficient cellular modem it's ever put in an iPhone (that seems like a swipe at Qualcomm), and helps to deliver stellar battery life: a claimed 26 hours of video streaming. Battery life in real-world use will, naturally, be a different story.

On balance, I like this phone's performance (courtesy of the A18 chip and 8GB of RAM), its looks, and how it feels in the hand (a matte glass back and Ceramic Shield front), and I think iOS 18 with Apple Intelligence is well-thought-out and increasingly intelligent (though Siri remains a bit of a disappointment); but if you're shopping for a sub-$600 phone, there may be other even better choices from the likes of Google (Pixel 8a), OnePlus (OnePlus 13R) and the anticipated Samsung Galaxy S25 FE. You just have to be willing to leave the Apple bubble.

Apple iPhone 16e: Price and availability

Apple unveiled the iPhone 16e on February 19, 2025. It joins the iPhone 16 lineup, and starts at $599 / £599 / AU$999 with 128GB of storage, making it the most affordable smartphone of the bunch. It's available in black or white.

While some might consider the iPhone 16e the successor to the iPhone SE 3, it has little in common with that device – in particular, the SE 3 was a $429 phone. At $599, Apple might be stretching the definition of budget, but it is $200 cheaper than the base iPhone 16. The phone's price compares somewhat less favorably outside the iOS sphere: the OnePlus 13R, for instance, is a 6.7-inch handset with three cameras, and the Google Pixel 8a matches the iPhone 16e's 6.1-inch screen size (though at a lower resolution) while also including two rear cameras.

You won't find more affordable new phones in the iOS space. The iPhone 15 has both a main and an ultra-wide camera plus the Dynamic Island, but it costs $699 / £699 / AU$1,249. A refurbished iPhone 14 costs $529, but neither it nor the iPhone 15 supports Apple Intelligence.

  • Value score: 4/5

Apple iPhone 16e: Specs

Apple iPhone 16e: Design

  • No trace of the iPhone SE design remains
  • Hybrid iPhone 14/15 design
  • Sharper edges than the current iPhone 16 design
(Image credit: Future / Lance Ulanoff)

There's no question that the iPhone 16e is a part of the iPhone 16 family. At a glance, especially when the screen is off, it's almost a dead ringer for the base model; the aerospace aluminum frame is only slightly smaller.

Upon closer examination, those similarities recede, and I can see the myriad differences that make this a true hybrid design. This is now the only iPhone with a single camera, which almost looks a little lonely on the matte glass back. The edges of the metal band that wraps around the body are noticeably sharper than those of any other iPhone 16, but the phone still feels good in the hand.

The button configuration is essentially what you'd find on an iPhone 15. There's the power / sleep / Siri button on the right, and on the left are the two volume buttons and the Action button. Unlike the rest of the iPhone 16 lineup the 16e doesn't get the Camera Control, but at least the Action button is configurable, so you can set it to activate the camera or toggle the Flashlight, Silent Mode, Voice Memo, and more. I set mine to launch Visual Intelligence, an Apple Intelligence feature: you press and hold the Action button once to open it, and press again to grab a photo, and then you can select on-screen if you want ChatGPT or Google Search to handle the query. Apple Intelligence can also analyze the image directly and identify the subject.

The phone is IP68-rated to handle water and dust, including a dunk in six meters of water for 30 minutes. The screen has a Ceramic Shield coating to better protect it from drops, though I'm not sure it does much to prevent scratches.

I put a case on the phone, never dropped it, and handled it gingerly, and yet within a day I noticed a long scratch on the screen, although I have no recollection of brushing the display against anything. I had a similar situation with the Samsung Galaxy S25 Ultra; I await the phone that can handle life in my pocket (empty other than the phone) without sustaining a scratch.

Overall, if you like the looks of the iPhone 16 lineup (or even the iPhone 14 and 15 lineups) the iPhone 16e will not disappoint.

  • Design score: 4 / 5

Apple iPhone 16e: Display

  • Almost Apple's best smartphone display
  • The notch is back
  • The bezels are a little bigger

(Image credit: Future / Lance Ulanoff)

If you're coming from the iPhone SE to the iPhone 16e, you're in for quite a shock. This 6.1-inch Super Retina XDR OLED screen is nothing like the 4.7-inch LCD display on that now-retired design.

The iPhone 16e features a lovely edge-to-edge design – with slightly larger bezels than you'll find on other iPhone 16 phones – that leaves no room for the dearly departed Touch ID Home button. Instead, this phone adopts Face ID biometric security, which is, as far as I'm concerned, probably the best smartphone face recognition in the business. Face ID lives in the TrueDepth camera system notch, which also accommodates, among other things, the 12MP front-facing camera, microphone, and proximity sensor.

While I never had a big problem with the notch, I can't say I'm thrilled to see it return here. The rest of the iPhone 16 lineup features the versatile Dynamic Island, which I think most would agree is preferable to this cutout.

The iPhone 16e (left) next to the iPhone SE 3 (middle), and the iPhone 16. (Image credit: Future / Lance Ulanoff)

The iPhone 16e shares the iPhone 16's 460ppi pixel density, but it does lose a few pixels (2532 x 1170 versus 2556 x 1179 for the iPhone 16). It still supports True Tone, wide color (P3), and a 2,000,000:1 contrast ratio. The only area where it loses a bit of oomph is brightness: peak brightness for HDR content is 1,200 nits, and for all other content it's 800 nits, while the iPhone 16's peak outdoor brightness is 2,000 nits. As with other non-Pro models, the refresh rate on the iPhone 16e sits at a fixed 60Hz.
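
As a sanity check, pixel density follows directly from resolution and diagonal size: ppi is the diagonal pixel count divided by the diagonal in inches. A quick sketch, assuming the panel's exact diagonal is about 6.06 inches (the marketed 6.1 inches is rounded; the 6.06 figure here is an assumption for illustration):

```python
import math

def pixels_per_inch(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density: diagonal resolution in pixels divided by diagonal in inches."""
    diagonal_px = math.hypot(width_px, height_px)  # sqrt(w^2 + h^2)
    return diagonal_px / diagonal_inches

# iPhone 16e panel: 2532 x 1170; the ~6.06in exact diagonal is an assumed value.
ppi = pixels_per_inch(2532, 1170, 6.06)
print(round(ppi))  # 460, matching the quoted figure
```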

Even so, I had no trouble viewing the iPhone 16e screen in a wide variety of lighting situations, and any shortcomings are only evident in the brightest, direct sunlight.

In day-to-day use, everything from photos and video to AAA games, apps, and websites looks great on this display. Colors are bright and punchy, and the blacks are inky. I'm not distracted by the notch on games, where it can cut a bit into the gameplay view, and most video streaming defaults to a letterbox format that steers clear of it, with black bars on the left and right sides of the screen.

  • Display score: 4 / 5

Apple iPhone 16e: Software and Apple Intelligence

  • iOS 18 is a rich and well-thought-out platform
  • Apple Intelligence has some impressive features, but we await the Siri of our dreams
  • Mail and photo redesigns leave something to be desired

iOS 18 is now smarter, more proactive, and more customizable than ever before. I can transform every app icon from 'Light' to 'Tinted' (monochromatic), fill my home screen with widgets, and expand them until they almost fill the screen. This customizability carries through to the Control Center, which is now a multi-page affair that I can leave alone, or completely reorganize so the tools I care about are available with a quick swipe down from the upper-right corner.

(Image credit: Future)

Apple Intelligence, which Apple unveiled last June, is growing in prominence and utility. It lives across apps like Messages and Mail in the form of Writing Tools, which is a bit buried, so I often forget it exists. It's in notification summaries, which can be useful for at-a-glance triage but are sometimes a bit confusing, and in image-generation tools like Image Playground and Genmoji.

It's also in Visual Intelligence, which, as I have it set up, gives me one-button access to ChatGPT and Google Search.

Apple Intelligence Clean Up does an excellent job of removing those big lights. (Image credit: Future / Lance Ulanoff)

See? (Image credit: Future / Lance Ulanoff)

I think I prefer the more utilitarian features of Apple Intelligence like Clean Up. It lets you quickly remove people and objects from photos as if they were never there in the first place.

I'm also a fan of Audio Mix, which is not a part of Apple Intelligence, but uses machine learning to clean up the messiest audio to make it usable in social media, podcasts, or just for sharing with friends.

iOS 18 also features updated Photos and Mail apps with Apple Intelligence. I've struggled a bit with how Photos reorganized my images, and I've had similar issues with how Mail is now reorganizing my emails. I hope Apple takes another run at these apps in iOS 19.

Siri is smarter and more aware of iPhone features than before. It can handle my vocal missteps and still knows what I want, but it remains mostly unaware of my on-device information, and feels far less conversational and powerful than chatbots like Google Gemini and ChatGPT.

  • Software score: 4.5 / 5

Apple iPhone 16e: Camera

  • 48MP Fusion is a good camera
  • The front-facing camera shines as well
  • A single rear camera at this price is disappointing

(Image credit: Future / Lance Ulanoff)

With a more powerful CPU, a bigger screen, and the new C1 chip, I can almost understand why Apple set the iPhone 16e price as high as it did. Almost… until I consider the single, rear 48MP Fusion camera. Most smartphones in this price range feature at least two lenses, and usually the second one is an ultra-wide – without that lens you miss out on not only dramatic ultra-wide shots but also macro photography capabilities. Had Apple priced this phone at $499, I might understand.

(Image credit: Future / Lance Ulanoff)

Still, I like this camera. It defaults to shooting at 24MP, binned down from the 48MP available on the sensor (two sensor pixels are combined for each image pixel, packing in more light information). There's a 2x zoom option, which is useful, but it only shoots at 12MP because it uses just the central 12 megapixels of the full 48MP frame. These images are still good, just not the same resolution as the default, or as what you could get shooting at full resolution.
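
That 2x 'zoom' is effectively a center crop of the full-resolution frame rather than optical magnification. A minimal sketch of the idea (the 8064 x 6048 sensor dimensions are an assumption for illustration, not a confirmed Apple spec):

```python
import numpy as np

def center_crop_2x(frame: np.ndarray) -> np.ndarray:
    """Simulate a 2x 'sensor crop zoom': keep the central half of each axis,
    which leaves a quarter of the pixels (e.g. 48MP -> 12MP)."""
    h, w = frame.shape[:2]
    return frame[h // 4 : h - h // 4, w // 4 : w - w // 4]

# Illustrative full-res frame; 8064 x 6048 (~48.8MP) is assumed, not official.
full = np.zeros((6048, 8064, 3), dtype=np.uint8)
crop = center_crop_2x(full)
print(crop.shape[:2], crop.shape[0] * crop.shape[1] / 1e6)  # (3024, 4032) ~12.2MP
```

Halving both axes quarters the pixel count, which is why a 2x crop from a 48MP sensor lands at roughly 12MP.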

Overall, the camera shoots lovely photos with exquisite detail and the kind of color fidelity I appreciate (in people and skies especially) in a wide variety of scenarios. I captured excellent still lifes, portraits, and night-mode shots. I was also impressed with the front camera, which is especially good for portrait-mode selfies. Much of this image quality is thanks to the work Apple has done on its Photonic Engine. Apple's computational image pipeline pulls out extraordinary detail and nuance in most photographic situations, even if it is for just these two cameras.

iPhone 16e camera samples

Rear camera samples at 1x and 2x, including portrait mode and night mode shots. (Image credit: Future / Lance Ulanoff)
  • Camera score: 4 / 5

Apple iPhone 16e: Performance

  • The A18 is an excellent and powerful CPU
  • It's ready for Apple Intelligence
  • C1, Apple's first cellular modem, is effective for 5G and satellite connectivity

If you're wondering why the successor to the iPhone SE is not a $429 smartphone, you might look at the processing combo of the powerful A18 and the new C1.

The A18 is the same chip you'll find in the iPhone 16, except with one fewer GPU core. I promise you'll never notice the difference.

Performance scores are excellent, and in line with the numbers we got for other A18 chips (and slightly lower than what you get from the A18 Pro in the iPhone 16 Pro and 16 Pro Max).

The A18 has more than enough power not just for day-to-day tasks like email and web browsing, but for 4K video editing (which I did in CapCut) and AAA gaming (game mode turns on automatically to divert more resources toward gaming). I played Asphalt 9 United, Resident Evil 4, and Call of Duty Mobile, and made things easier for myself by connecting my Xbox controller. My only criticism would be that a 6.1-inch screen is a little tight for these games. The audio from the stereo speakers, by the way, is excellent – I get an impressive spatial audio experience with Resident Evil 4.

(Image credit: Future / Lance Ulanoff)

There's also the new C1 chip, which is notable because it's Apple's first custom cellular modem. Previously, Apple relied on, among other partners, Qualcomm for this silicon. I didn't notice any difference in connectivity with the new chip, which is a good thing – and I was impressed that I could text via satellite.

(Image credit: Future)

I didn't think I'd get to test this feature, but AT&T connectivity is so bad in my New York neighborhood that the SOS icon appeared at the top of my iPhone 16e screen, and next to it I noticed the satellite icon. I opened Messages, and the phone asked if I wanted to use the satellite texting feature. I held the phone near my screen door to get a clear view of the sky, and followed the on-screen guide that told me which way to point the phone. I got a 'Connected' notification, and then sent a few SMS texts over satellite. It's a nifty feature, and it was a nice little test of the C1's capabilities.

  • Performance score: 5 / 5

Apple iPhone 16e: Battery

  • Long lasting
  • Wireless charging
  • No MagSafe

(Image credit: Future / Lance Ulanoff)

It's clear that Apple has prioritized battery life on the iPhone 16e over some other features. That would likely explain, for instance, why we have wireless charging but not MagSafe support – adding that magnetic ring might have eaten into battery space. The C1 chip is apparently smaller than the modem chip in other iPhone 16 models, and even the decision to include one camera instead of two probably helped make room for what is a larger battery than even the one in the iPhone 16.

Apple rates the iPhone 16e for 26 hours of video-rundown battery life – that's about four hours more than the iPhone 16. In my real-world testing the battery life has been very good, but varied use can run the battery down in far fewer than 26 hours.

On one day when I did everything from email and web browsing to social media consumption and then a lot of gaming, battery life was about 12 hours – gaming in particular really chewed through the battery and made the phone pretty warm.

My own video rundown test (I played through episodes of Better Call Saul on Netflix) returned about 24 hours of battery life.

I used a 65W USB-C charger to charge the phone to 57% in 30 minutes, with a full charge taking about one hour and 50 minutes. I also tried a 20W charger, which charged the phone to 50% in 30 minutes.

  • Battery score: 5 / 5

Should you buy the Apple iPhone 16e?

iPhone 16e score card

Buy it if...

You want an affordable, smaller iPhone

This is now your only brand-new 'budget' iPhone choice.

You want sub-$600 access to Apple Intelligence

Apple squeezed an A18 chip inside this affordable iPhone to give you access to Apple's own brand of AI.

Don’t buy it if...

You're a photographer

A single, albeit excellent, rear lens won't be enough for people who like to shoot wide-angle and macros.

You never liked the notch

Apple bringing back a none-too-loved display feature doesn't make a lot of sense. If you want the Dynamic Island at a more affordable price than the iPhone 16, take a look at the iPhone 15.

You want a real zoom lens

The 2x zoom on the iPhone 16e is not a true optical zoom; instead, it's a crop of the main sensor. If a big optical zoom is your thing, look elsewhere.

Apple iPhone 16e: Also consider

iPhone 15

For $100 more you get two cameras and the Dynamic Island.

Read TechRadar's iPhone 15 review.

Google Pixel 8a

As soon as you step outside the Apple ecosystem you'll find more affordable phones with more features. The Pixel 8a is not as powerful as the iPhone 16e, but it has a nice build, two cameras, excellent Google services integration, and affordable access to Gemini AI features.

Read TechRadar's Google Pixel 8a review.

Apple iPhone 16e: How I tested

I've reviewed countless smartphones ranging from the most affordable models to flagships and foldables. I put every phone through as many rigorous tests and everyday tasks as possible.

I had the iPhone 16e for just under a week, and after receiving it I immediately started taking photos, running benchmarks, and using it as an everyday device for photos, videos, email, social media, messaging, streaming video, and gaming.

Correction 2-27-2025: A previous version of this review listed Audio Mix as part of Apple Intelligence.

First reviewed February 26, 2025

Nvidia GeForce RTX 5090: the supercar of graphics cards
5:00 pm | January 23, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Nvidia GeForce RTX 5090: Two-minute review

The Nvidia GeForce RTX 5090 is a difficult GPU to approach as a professional reviewer because it is the rare consumer product that is so powerful, and so good at what it does, you have to really examine if it is actually a useful product for people to buy.

Right out the gate, let me just lay it out for you: depending on the workload, this GPU can get you up to 50% better performance versus the GeForce RTX 4090, and that's not even factoring in multi-frame generation when it comes to gaming, though on average the performance is still a respectable improvement of roughly 21% overall.

Simply put, whatever it is you're looking to use it for, whether gaming, creative work, or AI research and development, this is the best graphics card for the job if all you care about is pure performance.

Things get a bit more complicated if you want to bring energy efficiency into the equation. But if we're being honest, if you're considering buying the Nvidia RTX 5090, you don't care about energy efficiency. This simply isn't that kind of card, and so as much as I want to make energy efficiency an issue in this review, I really can't. It's not intended to be efficient, and those who want this card do not care about how much energy this thing is pulling down—in fact, for many, the enormous TDP on this card is part of its appeal.

Likewise, I can't really argue too much with the card's price, which comes in at $1,999 / £1,939 / AU$4,039 for the Founders Edition, and which will likely be much higher for AIB partner cards (and that's before the inevitable scalping begins). I could rage, rage against the inflation of the price of premium GPUs all I want, but honestly, Nvidia wouldn't charge this much for this card if there wasn't a line out the door and around the block full of enthusiasts who are more than willing to pay that kind of money for this thing on day one.

Do they get their money's worth? For the most part, yes, especially if they're not a gamer but a creative professional or AI researcher. If you're in the latter camp, you're going to be very excited about this card.

If you're a gamer, you'll still get impressive gen-on-gen performance improvements over the celebrated RTX 4090, and the Nvidia RTX 5090 is really the first consumer graphics card I've tested that can get you consistent, high-framerate 8K gameplay even before factoring in Multi-Frame Generation. That marks the RTX 5090 as something of an inflection point of things to come, much like the Nvidia RTX 2080 did back in 2018 with its first-of-its-kind hardware ray tracing.

Is it worth it though?

That, ultimately, is up to the enthusiast buyer who is looking to invest in this card. At this point, you probably already know whether or not you want it, and many will likely be reading this review to validate those decisions that have already been made.

In that, rest easy. Even without the bells and whistles of DLSS 4, this card is a hearty upgrade to the RTX 4090, and considering that the actual price of the RTX 4090 has hovered around $2,000 for the better part of two years despite its $1,599 MSRP, if the RTX 5090 sticks close to its launch price, it's well worth the investment. If it gets scalped to hell and sells for much more above that, you'll need to consider your purchase much more carefully to make sure you're getting the most for your money. Make sure to check out our where to buy an RTX 5090 guide to help you find stock when it goes on sale.

Nvidia GeForce RTX 5090: Price & availability

  • How much is it? MSRP is $1,999 / £1,939 / AU$4,039
  • When can you get it? The RTX 5090 goes on sale January 30, 2025
  • Where is it available? The RTX 5090 will be available in the US, UK, and Australia at launch
Where to buy the RTX 5090

Looking to pick up the RTX 5090? Check out our Where to buy RTX 5090 live blog for updates to find stock in the US and UK

The Nvidia GeForce RTX 5090 goes on sale on January 30, 2025, starting at $1,999 / £1,939 / AU$4,039 for the Nvidia Founders Edition and select AIB partner cards. Overclocked (OC) and other similarly tweaked cards and designs will obviously run higher.

It's worth noting that the RTX 5090 is 25% more expensive than the $1,599 launch price of the RTX 4090, but in reality, we can expect the RTX 5090 to sell for much higher than its MSRP in the months ahead, so we're really looking at an asking price closer to the $2,499.99 MSRP of the Turing-era Nvidia Titan RTX (if you're lucky).

Of course, if you're in the market for the Nvidia RTX 5090, you're probably not squabbling too much about the price of the card. You're already expecting to pay the premium, especially the first adopter premium, that comes with this release.

That said, this is still a ridiculously expensive graphics card for anyone other than an AI startup with VC backing, so it's worth asking yourself before you confirm that purchase if this card is truly the right card for your system and setup.

  • Value: 3 / 5

Nvidia GeForce RTX 5090: Specs & features

The Nvidia GeForce RTX 5090's power connection port

(Image credit: Future / John Loeffler)
  • First GPU with GDDR7 VRAM and PCIe 5.0
  • Slightly slower clocks
  • Obscene 575W TDP

There are a lot of new architectural changes in the Nvidia RTX 50 series GPUs that are worth diving into, especially the move to a transformer AI model for its upscaling, but let's start with the new specs for the RTX 5090.

First and foremost, the flagship Blackwell GPU is the first consumer graphics card to feature next-gen GDDR7 video memory, and it is substantially faster than GDDR6 and GDDR6X (a roughly 33% increase in Gbps over the RTX 4090). Add in the much wider 512-bit memory interface and you have a total memory bandwidth of 1,790GB/s.

This, even more than the increased VRAM pool (32GB vs the RTX 4090's 24GB), makes this GPU the first really capable 8K graphics card on the market. 8K textures have an enormous footprint in memory, so moving them through the rendering pipeline to generate playable framerates isn't really possible with anything less than what this card offers.
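That total bandwidth figure follows directly from the bus width and per-pin data rate. As a quick sanity check (the 28Gbps GDDR7 and 21Gbps GDDR6X per-pin rates used here are assumed figures for illustration, not specs quoted in this review):

```python
# Peak memory bandwidth (GB/s) = bus width in bytes * per-pin data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, rate_gbps: float) -> float:
    return bus_width_bits / 8 * rate_gbps

rtx_5090 = bandwidth_gb_s(512, 28)  # 512-bit bus at 28 Gbps -> 1792 GB/s (~1,790GB/s quoted)
rtx_4090 = bandwidth_gb_s(384, 21)  # 384-bit bus at 21 Gbps -> 1008 GB/s
```

The roughly 33% per-pin speed increase compounds with the wider 512-bit bus, which is how the total bandwidth nearly doubles generation-on-generation.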

Yes, you can, maybe, get playable 8K gaming with some RTX 40 or AMD Radeon RX 7000 series cards if you use aggressive upscaling, but you won't really be getting 8K visuals that'll be worth the effort. In reality, the RTX 5090 is what you want if you want to play 8K, but good luck finding an 8K monitor at this point. Those are still years away from really going mainstream (though there are a growing number of 8K TVs).

If you're settling in at 4K though, you're in for a treat, since all that bandwidth means faster 4K texture processing, so you can get very fast native 4K gaming with this card without having to fall back on upscaling tech to get you to 60fps or higher.

The GeForce RTX logo on the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

The clock speeds on the RTX 5090 are slightly slower, which is understandable given the other major top-line specs: its gargantuan TDP of 575W and its PCIe 5.0 x16 interface. That thermal challenge, according to Nvidia, required major reengineering of the PCB inside the card, which I'll get to in a bit.

The PCIe 5.0 x16 interface, meanwhile, is the first of its kind in a consumer GPU, though you can expect AMD and Intel to quickly follow suit. This matters because a number of newer motherboards have PCIe 5.0 lanes ready to go, but most people have been using those for PCIe 5.0 M.2 SSDs.

If your motherboard has 20 PCIe 5.0 lanes, the RTX 5090 will take up 16 of those, leaving just four for your SSD. If you have one PCIe 5.0 x4 SSD, you should be fine, but I've seen motherboard configurations that have two or three PCIe 5.0 x4 m.2 slots, so if you've got one of those and you've loaded them up with PCIe 5.0 SSDs, you're likely to see those SSDs drop down to the slower PCIe 4.0 speeds. I don't think it'll be that big of a deal, but it's worth considering if you've invested a lot into your SSD storage.

As for the other specs, they're more or less similar to what you'd find in the RTX 4090, just more of it. The new Blackwell GB202 GPU in the RTX 5090 is built on a TSMC 4nm process, compared to the RTX 4090's TSMC 5nm AD102 GPU. The SM design is the same, so 128 CUDA cores, one ray tracing core, and four tensor cores per SM. At 170 SMs, you've got 21,760 CUDA cores, 170 RT cores, and 680 Tensor cores for the RTX 5090, compared to the RTX 4090's 128 SMs (so 16,384 CUDA, 128 RT, and 512 Tensor cores).
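Since the per-SM layout is unchanged, the totals above are simple multiples of the SM count; a minimal sketch:

```python
# Per-SM layout, per the review: 128 CUDA cores, 1 RT core, 4 Tensor cores
def core_counts(sm_count: int) -> dict:
    return {
        "cuda": sm_count * 128,
        "rt": sm_count * 1,
        "tensor": sm_count * 4,
    }

rtx_5090 = core_counts(170)  # {'cuda': 21760, 'rt': 170, 'tensor': 680}
rtx_4090 = core_counts(128)  # {'cuda': 16384, 'rt': 128, 'tensor': 512}
```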

  • Specs & features: 4.5 / 5

Nvidia GeForce RTX 5090: Design

The Nvidia GeForce RTX 5090 sitting on its packaging

(Image credit: Future / John Loeffler)
  • Slim, dual-slot form factor
  • Better cooling

There's a significant change to the design of this generation of Nvidia Founders Edition RTX flagship cards, and it's worth dwelling on.

Holding the RTX 5090 Founders Edition in your hand, you'll immediately notice two things: first, you can comfortably hold it in one hand thanks to it being a dual-slot card rather than a triple-slot, and second, it's significantly lighter than the RTX 4090.

A big part of this is how Nvidia designed the PCB inside the card. Traditionally, graphics cards have been built with a single PCB that extends from the inner edge of the PC case, down through the PCIe slot, and far enough back to accommodate all of the modules needed for the card. On top of this PCB, you'll have a heatsink with piping from the GPU die itself through a couple of dozen aluminum fins to dissipate heat, with some kind of fan or blower system to push or pull cooler air through the heated fins to carry away the heat from the GPU.

The problem with this setup is that if you have a monolithic PCB, you can only really extend the heatsinks and fans off of the PCB to help cool it since a fan blowing air directly into a plastic wall doesn't do much to help move hot air out of the graphics card.

A split view of the Nvidia GeForce RTX 5090's dual fan passthrough design

(Image credit: Future / John Loeffler)

Nvidia has a genuinely novel innovation on this account, and that's ditching the monolithic PCB that's been a mainstay of graphics cards for 30 years. Instead, the RTX 5090 (and presumably subsequent RTX 50-series GPUs to come), splits the PCB into three parts: the video output interface at the 'front' of the card facing out from the case, the PCIe interface segment of the card, and the main body of the PCB that houses the GPU itself as well as the VRAM modules and other necessary electronics.

This segmented design allows a gap in the front of the card below the fan, so rather than a fan blowing air into an obstruction, it can fully pass over the fins of the GPU's heatsink, substantially improving the thermals.

As a result, Nvidia was able to shrink the width of the card considerably, moving from 2.4 inches to 1.9 inches, a roughly 20% reduction on paper. In the hand, it feels substantially smaller than its predecessor, and it's definitely a card that won't completely overwhelm your PC case the way the RTX 4090 does.

The 4 8-pin to 16-pin 12VHPWR adapter included with the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

That said, the obscene power consumption required by this card means that the 8-pin adapter included in the RTX 5090 package is a comical 4-to-1 dongle that pretty much no PSU in anyone's PC case can really accommodate.

Most modular PSUs give you three PCIe 8-pin power connectors at most, so let's be honest about this setup: you're going to need a new ATX 3.0 PSU of at least 1000W to run this card (its officially recommended PSU is 950W, but just round up, you're going to need it), so make sure you factor that into your budget if you pick this card up.

Otherwise, the look and feel of the card isn't that different from previous generations, except that the front plate where the RTX 5090 branding would have gone is now missing, replaced by a finned shroud that lets air pass through. The RTX 5090 stamp is instead printed on the center panel, similar to how it was done on the Nvidia GeForce RTX 3070 Founders Edition.

As a final touch, the white back-lit GeForce RTX logo and the X strips on the front of the card add a nice RGB-lite accent when powered that doesn't look too gaudy, though dedicated RGB fans might find it rather plain.

  • Design: 4.5 / 5

Nvidia GeForce RTX 5090: Performance

An Nvidia GeForce RTX 5090 slotted into a test bench

(Image credit: Future)
  • Most powerful GPU on the consumer market
  • Substantially faster than RTX 4090
  • Playable 8K gaming
A note on my data

The charts shown below are the most recent test data I have for the cards tested for this review and may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

So how does the Nvidia GeForce RTX 5090 stack up against its predecessor, as well as the best 4K graphics cards on the market more broadly?

Very damn well, it turns out, managing to improve performance over the RTX 4090 in some workloads by 50% or more, while leaving everything else pretty much in the dust.

Though when looked at from 30,000 feet, the overall performance gains are respectable gen-on-gen but aren't the kind of earth-shattering gains the RTX 4090 made over the Nvidia GeForce RTX 3090.

Starting with synthetic workloads, the RTX 5090 scores anywhere from 48.6% faster to about 6.7% slower than the RTX 4090 in various 3DMark tests, depending on the workload. The only poor performance for the RTX 5090 was in 3DMark Night Raid, a test where both cards so completely overwhelm the test that the difference here could be down to CPU bottlenecking or other issues that aren't easily identifiable. On every other 3DMark test, though, the RTX 5090 scores 5.6% better or higher, more often than not by 20-35%. In the most recently released test, Steel Nomad, the RTX 5090 is nearly 50% faster than the RTX 4090.

On the compute side of things, the RTX 5090 is up to 34.3% faster in Geekbench 6's OpenCL compute test and 53.9% faster in Vulkan, making it an absolute monster for AI researchers to leverage.

On the creative side, the RTX 5090 is substantially faster in 3D rendering, scoring between 35% and 49.3% faster in my Blender Benchmark 4.30 tests. There's very little difference between the two cards when it comes to video editing though, as they essentially tie in PugetBench for Creators' Adobe Premiere test and in Handbrake 1.7 4K to 1080p encoding.

The latter two results might be down to CPU bottlenecking, as even the RTX 4090 pushes right up against the performance ceiling set by the CPU in a lot of cases.

When it comes to gaming, the RTX 5090 is substantially faster than the RTX 4090, especially at 4K. In non-upscaled 1440p gaming, you're looking at a roughly 18% better average frame rate and a 22.6% better minimum/1% framerate for the RTX 5090. With DLSS 3 upscaling (but no frame generation), you're looking at 23.3% better average and 23% better minimum/1% framerates overall with the RTX 5090 vs the RTX 4090.

With ray tracing turned on and no upscaling, you're getting 26.3% better average framerates and about 23% better minimum/1% framerates, and with upscaling set to balanced (again, no frame generation), you're looking at about 14% better average fps and about 13% better minimum/1% fps for the RTX 5090 against the RTX 4090.

At 4K, however, the faster memory and wider memory bus really make a difference. With upscaling and ray tracing both turned off, you're getting upwards of 200 fps at 4K for the RTX 5090 on average, compared to the RTX 4090's 154 average fps, a nearly 30% increase. The average minimum/1% fps for the RTX 5090 is about 28% faster than the RTX 4090, as well. With DLSS 3 set to balanced, you're looking at a roughly 22% better average framerate overall compared to the RTX 4090, with an 18% better minimum/1% framerate on average as well.
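For reference, the uplift percentages in this section are simple relative improvements; for example, the RTX 5090's 200fps 4K average against the RTX 4090's 154fps:

```python
def pct_uplift(new_fps: float, old_fps: float) -> float:
    """Percentage improvement of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

print(round(pct_uplift(200, 154), 1))  # 29.9 -> the 'nearly 30%' increase cited above
```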

With ray tracing and no upscaling, the difference is even more pronounced with the RTX 5090 getting just over 34% faster average framerates compared to the RTX 4090 (with a more modest 7% faster average minimum/1% fps). Turn on balanced DLSS 3 with full ray tracing and you're looking at about 22% faster average fps overall for the RTX 5090, but an incredible 66.2% jump in average minimum/1% fps compared to the RTX 4090 at 4K.

Again, none of this even factors in single frame generation, which can already substantially increase framerates in some games (though with the introduction of some input latency). Once Multi-Frame Generation rolls out at launch, you can expect to see these framerates for the RTX 5090 run substantially higher. Pair that with Nvidia Reflex 2 to help mitigate the input latency issues frame generation can introduce, and the playable performance of the RTX 5090 will only get better with time, and it's starting from a substantial lead right out of the gate.

In the end, the overall baseline performance of the RTX 5090 comes in about 21% better than the RTX 4090, which is what you're really looking for when it comes to a gen-on-gen improvement.

That said, you have to ask whether the performance improvement you do get is worth the enormous increase in power consumption. That 575W TDP isn't a joke. I maxed out at 556W of power at 100% utilization, and I hit 100% fairly often in my testing and while gaming.

The dual flow-through fan design also does a great job of cooling the GPU, but at the expense of turning the card into a space heater. That 575W of heat needs to go somewhere, and that somewhere is inside your PC case. Make sure you have adequate airflow to vent all that hot air, otherwise everything in your case is going to slowly cook.

As far as performance-per-price, this card does slightly better than the RTX 4090 on value for the money, but that's never been a buying factor for this kind of card anyway. You want this card for its performance, plain and simple, and in that regard, it's the best there is.

  • Performance: 5 / 5

Should you buy the Nvidia GeForce RTX 5090?

A masculine hand holding an RTX 5090

(Image credit: Future)

Buy the Nvidia GeForce RTX 5090 if...

You want the best performance possible
From gaming to 3D modeling to AI compute, the RTX 5090 serves up best-in-class performance.

You want to game at 8K
Of all the graphics cards I've tested, the RTX 5090 is so far the only GPU that can realistically game at 8K without compromising on graphics settings.

You really want to flex
This card comes with a lot of bragging rights if you're into the PC gaming scene.

Don't buy it if...

You care about efficiency
At 575W, this card might as well come with a smokestack and a warning from your utility provider about the additional cost of running it.

You're in any way budget-conscious
This card starts off more expensive than most gaming PCs and will only become more so once scalpers get their hands on them. And that's not even factoring in AIB partner cards with extra features that add to the cost.

You have a small form-factor PC
There's been some talk about the new Nvidia GPUs being SFF-friendly, but even though this card is thinner than the RTX 4090, it's just as long, so it'll be hard to fit it into a lot of smaller cases.

Also consider

Nvidia GeForce RTX 4090
I mean, honestly, this is the only other card you can compare the RTX 5090 to in terms of performance, so if you're looking for an alternative to the RTX 5090, the RTX 4090 is pretty much it.

Read the full Nvidia GeForce RTX 4090 review

How I tested the Nvidia GeForce RTX 5090

  • I spent about a week and a half with the RTX 5090
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week and a half testing the Nvidia GeForce RTX 5090, both running synthetic tests as well as using it in my day-to-day PC for both work and gaming.

I used my updated testing suite, which uses industry standard benchmark tools like 3DMark, Geekbench, Pugetbench for Creators, and various built-in gaming benchmarks. I used the same testbench setup listed to the right for the purposes of testing this card, as well as all of the other cards I tested for comparison purposes.

I've tested and retested dozens of graphics cards for the 20+ graphics card reviews I've written for TechRadar over the last few years, and so I know the ins and outs of these PC components. That's why you can trust my review process to help you make the right buying decision for your next GPU, whether it's the RTX 5090 or any of the other graphics cards I review.

  • Originally reviewed January 2025
There’s a Snapdragon 8 Elite version with just seven CPU cores now
6:03 pm | January 17, 2025

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

The Snapdragon 8 Elite chipset went official in October, and as you may already know it has an octa-core CPU. Right? Well, yes, but also no. The Snapdragon 8 Elite we've known so far definitely does, but it turns out there's another version. This just got listed by Qualcomm on its website (see the Source linked below). And it has just seven CPU cores. Two Prime cores clocked at up to 4.32 GHz, and five Performance cores clocked at up to 3.53 GHz. Basically, one Performance core is missing compared to the regular Snapdragon 8 Elite. This hepta-core CPU is packed inside the Snapdragon...

Lenovo ThinkPad P16v Gen 2 mobile workstation review
10:12 pm | January 15, 2025

Author: admin | Category: Computers Gadgets Pro | Tags: , , | Comments: Off

Lenovo's ThinkPad lineup has always been a significant grouping of offerings for business professionals. The Lenovo ThinkPad P16v Gen 2 is no different. It targets professionals who need workstation-grade performance on the go.

The ThinkPad P16 is one of the best Lenovo ThinkPad laptops around, ideal for heavy computational and graphical work. Compared to the P16, I view the P16v Gen 2 as a ThinkPad P16 'lite'; that's not official branding, just my viewpoint. It's a slightly less powerful P16, but still very much enterprise-focused and workstation-esque.

Lenovo ThinkPad P16v Gen 2: Price and Availability

The Lenovo ThinkPad P16v Gen 2 starts at $1,791.92 (pre-tax) and quickly scales up to well over $3,500 before any pre-installed software options if you want to max out the hardware offerings.

These and custom builds are available on Lenovo's website, and pre-built models are available in places like Amazon or other computer retailers.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: Unboxing and First Impressions

The ThinkPad P16v Gen 2 arrives in standard Lenovo packaging with a beefy yellow-tipped Lenovo charger (though you can also charge via USB-C, albeit more slowly) and the essential documentation. I was immediately reminded of the P16, though the P16v is a bit slimmer and lighter (4.89 lbs vs. 6.5 lbs).

Another thing I noticed right away was the port selection and placement. I'll discuss this more later, but right off the bat I was surprised to see a full ethernet port, with several ports on the back; then again, though thin, this is a workstation. Lastly, I genuinely like the matte black finish on this laptop. Though I enjoy the occasional splash of color, I will always choose black: it's clean, goes with everything, and looks professional.

Lenovo ThinkPad P16v Gen 2: Design and Build Quality

Specs

CPU: Intel Core Ultra 7 165H to Ultra 9 185H options
GPU: NVIDIA RTX 2000 Ada Gen or RTX 3000 Ada Gen
Display: 16" WUXGA (1920 x 1200), IPS, 100% sRGB to 16" WQUXGA (3840 x 2400), IPS, 100% DCI-P3, 60Hz
Storage: 2x 2TB SSD M.2 drives
RAM: 8GB DDR5, upgradable to 96GB

Unsurprisingly, the Lenovo ThinkPad P16v Gen 2 is very similar to the ThinkPad P16 in design, much like the name suggests. The P16v Gen 2 is slimmer and more portable than a ThinkPad P16, yet it still feels robust, like any of the best mobile workstations I've tried, with actual portability in mind. Thanks to the real estate afforded by the 16-inch screen, Lenovo was able to add a full numpad to the right of the keyboard, and better yet, it's comfortable to type on.

The port offering on this computer is excellent for the modern employee needing workstation-grade power. There is an SD card reader, an optional smart card reader, a full-size HDMI port, a USB-A port, two Thunderbolt 4 ports, and a full RJ45 ethernet port. What's fascinating and pretty brilliant is that one of the Thunderbolt ports and the ethernet port are on the back of the ThinkPad P16v Gen 2. This makes it super easy to plug into a Thunderbolt docking station and/or ethernet, with both cables naturally routed away from your desk or workspace out the back of your laptop.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: In use

I've had this laptop in my rotation for the last couple of weeks, and it has been a pretty good computer. It can easily handle my productivity suite of tasks, content creation, video editing, and photo editing. It can handle the 3D modeling software for my 3D printer, and all of it at once. I really appreciate the ethernet and Thunderbolt 4 ports on the back, as I could route the not-so-flexible ethernet cable away from my computer when I needed to hardwire into the network at one of my job sites. Whenever I'm at my desk, I can easily plug into the docking station running to my monitors and peripherals.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Another thing worth mentioning is the reliability and usability of the ThinkPad keyboards. While I never want to use the famous TouchPoint embedded within the keyboard, it's handy when I think about using it. On top of that, the typing experience is quite comfortable, even for all-day typing, as I do.

Lenovo has also used the space granted by the 16-inch screen to fit in a full numpad; some laptops, even with 16-inch screens, simply center a standard-size keyboard in the allotted space. For those who work with spreadsheets, phone numbers, or numbers in general, a dedicated numpad makes data entry exponentially faster, adding to the ThinkPad P16v Gen 2's allure for the business professional.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: Final verdict

The ThinkPad P16v Gen 2 delivers an exceptional balance of power, portability, and professional features. While it doesn’t quite match the raw performance of the P16, its lighter build and price point make it an excellent choice for professionals on the move who need a reliable machine.


For more workplace computing, we've tested the best business laptops.

Lenovo ThinkPad P1 Gen 7 mobile workstation review
10:41 am | December 25, 2024

Author: admin | Category: Computers Gadgets Pro | Tags: , , | Comments: Off

The Lenovo ThinkPad P1 Gen 7 is Lenovo's take on an all-around perfect portable workstation machine. The Gen 7, of course, replaces the Gen 6 and now boasts up to an Intel Core Ultra 9 185H and an NVIDIA RTX 4070. However, it can also be built with integrated graphics and an Intel Core Ultra 5 with a light 16GB of RAM.

Much like Dell's Precision line-up, the ThinkPad P series is designed for professionals who need a computer that can handle computationally demanding tasks like 3D rendering, video editing, coding, and data analysis. Like many of the best Lenovo ThinkPad laptops I've reviewed, casual users can certainly use it, but this price point is aimed at professionals who rely on their machines to be workhorses and get work done.

Lenovo ThinkPad P1 Gen 7: Price and Availability

The Lenovo ThinkPad P1 Gen 7 starts at the base level for under $2,000 with an Intel Core Ultra 5, 16GB of RAM, and integrated graphics. It can be upgraded to a machine that costs over $5,000 when equipped with an Intel Core Ultra 9, NVIDIA RTX 4070 graphics, 64GB of RAM, and a 4TB SSD. This is not an entry-level computer, but thanks to the customization options available for processor, memory, storage, and graphics, it can be kitted out to fit just about any professional need. That said, check out our Lenovo coupon codes to see if you can save on the ThinkPad P1 Gen 7.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Unboxing and First Impressions

Out of the box, it's clear this is not designed to be a super-lightweight-ultra-portable-thinnest-device-ever kind of machine. It's beefy, but not in a way that resembles the laptops of a decade ago. As we've seen from many of the best mobile workstations, it's sleek where it can be but houses a lot under the hood (or keyboard, in this case). Depending on the GPU configuration, the P1 Gen 7 ships with a 135W or 170W charger, the appropriate manuals, and any accessories purchased from Lenovo. The minimalist matte-black design exudes sleek professionalism, though it is prone to smudges.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Design and Build Quality

Specs

CPU: Up to an Intel Core Ultra 9 185H
GPU: Up to an NVIDIA RTX 4070
Display: Up to 4K OLED
RAM: Up to 64GB LPDDR5X
Storage: Up to 8TB SSD with built-in RAID options

Overall, the laptop is 17mm thick and 4.3lb. That's not huge in the world of laptops, though it is larger than some of the machines I work with. The P1 Gen 7 is made of a combination of magnesium and aluminum and carries a MIL-STD 810H durability rating, so it can withstand daily wear and tear and the burdens of being an everyday workhorse.

Completing the famous ThinkPad design, the TrackPoint sits prominently in the center of the keyboard, and the overall design language matches what you'd expect from a ThinkPad.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: In use

I have used this computer extensively in my workflow for the past few months, and overall it is an impressive machine. It is remarkably powerful, easily handles multitasking and demanding programs, and has a sleek, attractive design. What more could you ask for in a computer? It even offers a better port selection than the ever-popular Dell powerhouses or MacBooks. I have only heard the fans kick on during heavily intensive tasks, or when many heavy tasks were stacked together; in my day-to-day professional work, they stay silent.

Other features that make this computer great include the Wi-Fi 7 antennae, a solid trackpad, a comfortable keyboard, and decent battery life.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

I've enjoyed using this computer for everything in my day-to-day. The keyboard is comfortable enough for long email sessions or writing articles (like this one). The trackpad is responsive enough that I don't need to bring a mouse when I'm away from my desk for the day. The ports are fantastic: I can leave my dongles at home since this laptop has everything I could need built in. It's also surprisingly portable. Yes, it's powerful and practical, but it's easy to carry between my studio, office, coffee shops, and home. It's simple, it doesn't get in the way, and it's great for my professional workflow.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Final verdict

The Lenovo ThinkPad P1 Gen 7 is an impressive example of what mobile workstations can be. Though premium priced, its versatility, build quality, and performance justify its cost for professionals seeking the best tools to do their work reliably.


For more workplace hardware, we've reviewed the best business laptops.

MediaTek Dimensity 8400 now official with all big core CPU, a first for its segment
3:28 pm | December 23, 2024

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

MediaTek's Dimensity 9300, launched last year, was the first Android SoC to go with an all-big-core configuration for its CPU, something this year's Dimensity 9400 also adopted, but those are both top-of-the-line chipsets. Today marks the introduction of the first upper-midrange SoC with an all-big-core CPU. Meet the MediaTek Dimensity 8400, which is sure to pose a big threat, performance-wise, to the Qualcomm Snapdragon 7+ Gen 3, which has been the uncontested king of this category for quite a while now. The Dimensity 8400 comes with eight Cortex-A725 CPU cores, clocked at up to...

Chuwi LarkBox S mini PC review
5:02 pm | December 14, 2024

Author: admin | Category: Computers Computing Gadgets | Tags: , , , | Comments: Off

Chuwi LarkBox S: 30-second review

Picking up the LarkBox S, you can instantly tell that it’s a little different from your standard entry-level mini PC. Firstly, the build quality of this small machine is extremely high, with a mix of plastic and metal used for the outer casing, which instantly makes it feel like a far more expensive machine than it actually is.

Looking over the casing, everything is nicely finished, with four rubber feet on the base, a good selection of ports on the front and back, and plenty of ventilation for the i3 processor. As a mini PC, its looks and specifications mark it out as a perfect option for most offices or day-to-day home use. Even on a generous day, though, this won't be the best mini PC for heavier workloads: its basic entry-level Intel i3 processor has enough power to handle all the Microsoft Office apps as well as a little light multimedia production, as long as you don’t push it too hard.

In use, the LarkBox S proves decent enough at running office applications. Excel loads quickly, and handling large but not overly complex documents is not an issue. Likewise, browsing in Edge with a bit of streaming shows how well it handles 4K video playback. Connectivity through Wi-Fi and Bluetooth is also fast enough to keep up with demands.

Surprisingly for this level of machine, even loading up DaVinci Resolve and working with some standard H.265 footage from a Sony Alpha 7 Mark IV was manageable for basic edits. However, output render times were significant for a 10–15 minute edit, and as effects or more complex edits are applied, the machine begins to struggle.

Returning to the design, there are a few interesting highlights, such as the discreet graphic design on the top case and the LED light array that surrounds the front ports. These features are more common with gaming mini PCs than office PCs but are a nice touch and add a sense of fun.

Overall, as an office machine or one to use for day-to-day admin tasks at home, the LarkBox S is an extremely good option. The build quality is well above that of most entry-level PCs, and the choice of the internal hardware is pretty decent for non-intensive applications.

Chuwi LarkBox S: Price and availability

  • How much does it cost? From £250 / $250
  • When is it out? Available now
  • Where can you get it? Directly from Chuwi.com or Amazon.com

While the LarkBox S is directly aimed at the entry-level market, it isn’t the cheapest option available. This is reflected in the quality of the build as well as the higher-end components used. It retails for around $250 / £250 and is available directly through the CHUWI website or major online retailers such as Amazon.

  • Value: 4 / 5

Chuwi LarkBox S

(Image credit: Alastair Jennings)

Chuwi LarkBox S: Specs

  • CPU: Intel Core i3-1220P (10 Cores, 12 Threads, 12 MB cache, up to 4.4 GHz)
  • Graphics: Intel UHD Graphics
  • RAM: 16GB DDR4 3200MHz (Dual-channel SO-DIMM Slots, Expandable up to 64GB)
  • Storage: 512GB PCIe 3.0 SSD (1× M.2 2280 PCIe 3.0 SSD Slot, Expandable up to 1TB)
  • Rear Ports: 2× USB 3.2 Gen 1 Type-A Ports, 2× USB 2.0 Type-A Ports, 1× HDMI 2.0 Port, 1× HDMI 1.4 Port, 1× 1000Mbps LAN Jack, 1× DC-In Jack
  • Front Ports: 1× Full-featured Type-C Port, 1× USB 3.2 Gen 1 Type-C Port, 1× 3.5mm Audio Jack
  • Connectivity: Wi-Fi 5, Bluetooth 5.1
  • Audio: 3.5mm Audio Jack
  • Camera: Not specified
  • Size: 118 × 118 × 41.3 mm
  • OS installed: Windows 11 Home
  • Accessories: 1× LarkBox S Mini PC, 1× VESA Mount, 6× Screws, 1× Power Adapter, 1× User Manual, 1× Warranty Card, 1× Inspection Report

Chuwi LarkBox S: Design

The LarkBox S is an entry-level mini PC, and its design style is both minimalistic and compact. Compared with other mini PCs, it is just a touch smaller at 118 × 118 × 41.3mm and weighs only 478 g. While most entry-level mini PCs are quite lightweight and plasticky, there’s absolutely nothing plastic-feeling about the LarkBox S.

The outer casing is made of a robust mixture of plastic and metal, giving it the durability to withstand the occasional knock during transport.

The satin effect finish is another really nice touch, reinforcing a slightly premium feel for what is essentially a relatively inexpensive machine. Similarly, the quality of the inlay around the ports, both front and back, shows that the machining and moulding are about as good as it gets for mini PCs.

Chuwi LarkBox S

(Image credit: Alastair Jennings)

While the design is stylish and understated, it’s interesting to note the inclusion of subtle inlay graphics on the top of the casing. These add a touch of design flair, visible only when the light catches them. Additionally, there’s an LED array on the front that changes colour as it operates. This is reminiscent of many gaming PCs but is more of a fun, decorative touch on what is otherwise an office-focused machine, which is a slightly odd choice.

The general layout of the machine is well thought-out. On the front, there's a 3.5 mm audio jack, two USB Type-C ports, two USB Type-A ports, and the power button. Both sides feature plenty of venting, as does the back, which houses the AC input, two HDMI ports, a LAN port, and two additional USB Type-A ports.

One of the standout features of this mini PC is the ability to upgrade both the RAM and SSD. For RAM, it uses DDR4 dual-channel SO-DIMM slots, supporting up to two 32 GB sticks, and it comes with 16 GB as standard. Storage-wise, as an entry-level machine, the drive is relatively modest at 512 GB. This is a PCIe 3.0 SSD, and if 512 GB feels too small, it can be replaced with an M.2 2280 PCIe 3.0 SSD of up to 1 TB, which is still not huge.

  • Design: 4.5 / 5

Chuwi LarkBox S

(Image credit: Alastair Jennings)

Chuwi LarkBox S: Features

Starting with the size, this small machine measures 118 x 118 x 41.3 mm, making it one of the smaller mini PCs out there, although it’s relatively heavy at 478 g, which just reinforces the fact that it is made of high-quality materials.

When it comes to the internal hardware, there's an Intel Core i3-1220P with 10 cores, 12 threads, 12 MB of cache, and up to 4.4 GHz. This is supported by standard Intel UHD graphics and 16 GB of DDR4 3200 MHz RAM. The motherboard is dual-channel, so if you want to upgrade that RAM, then you can install two 32 GB sticks as mentioned before, taking you up to 64 GB, which will be useful if you are thinking about doing any multimedia editing. When it comes to storage, this is limited to a 512 GB PCIe 3.0 SSD, and again there's only a single slot for this on the motherboard. You can install an M.2 2280 PCIe 3.0 SSD, and that's expandable up to 1TB.

As an entry-level machine, it comes with Windows 11 Home installed, which will give you all of the normal functions and features. It also supports Wi-Fi 5 and Bluetooth 5.1, so it is not the latest technology, but it still offers good, solid performance. There is also an Ethernet option if you are using a wired network that supports up to 1000 Mbps.

When it comes to ports, you have one full-feature Type-C port, one USB 3.2 Gen 1 Type-C port, two USB 3.2 Gen 1 Type-A ports, two USB 2.0 Type-A ports, one HDMI 2.0 port, one HDMI 1.4 port, one 1000 Mbps LAN port, one 3.5 mm audio jack, and the DC-in.

Even as an entry-level machine, you can connect up to three displays through the two HDMI ports alongside one of the USB Type-C ports (the fully featured one). From the USB Type-C, you can run one 4K monitor at up to 144 Hz; through the HDMI 2.0 port, one 4K monitor at up to 60 Hz; and through the HDMI 1.4 port, one 4K monitor at up to 30 Hz.

As one final point on the feature set, as is now standard with most mini PCs, the computer also comes with VESA support, so if you do want to mount it on a wall or behind your monitor, then that is perfectly possible.

Chuwi LarkBox S

(Image credit: Alastair Jennings)
  • Features: 4 / 5

Chuwi LarkBox S: Performance

The design of the LarkBox S makes it extremely quick and easy to get started, with the ports on the back being easily accessible for plugging in the HDMI and the two USB Type-As to connect monitor, keyboard, and mouse. Pressing the power button boots up into Windows 11 Home, and you can run through the usual setup process, which takes about 5 to 10 minutes.

While the processor and GPU are relatively low-powered, they are more than sufficient to handle Windows 11 Home, providing a smooth experience from the outset, even when connected to a 4K monitor. Once Windows has finished the set-up, you can proceed to install the applications you need. For us, this included benchmarking software, a few games, Microsoft Office, and multimedia tools such as DaVinci Resolve for video editing and Adobe Photoshop for photography.

Starting with general admin use of the machine, it quickly becomes apparent that it has been finely tuned for day-to-day office use. The LarkBox S handles Microsoft Office and Microsoft Edge (or other browsers) well enough. One minor issue we did have was the LED light array at the front, which, while aesthetically pleasing, can become a little distracting over time. However, it is possible to turn it off via the firmware settings, though this does require a bit of technical know-how.

Pushing the machine to a higher level of demand, we loaded up DaVinci Resolve to edit some 4K video shot on the Sony Alpha 7 Mark IV in the H.265 file format. Surprisingly, the machine handled this with relative ease for a simple 10-minute video edit. The project included multiple tracks with little grading and no effects applied. However, as soon as text or effects were added to the footage, the machine began to struggle. That said, you could still manage a 10 to 15-minute edit without too much trouble. The main point where the machine struggled was during export, as it took a significant amount of time to render the video into a file ready for upload. Still, for small video projects, this machine should suffice.

Switching to Adobe Lightroom and loading a few images, the machine performed well and was more than capable of handling basic edits to enhance your imagery for print or online use. Moving on to Photoshop with high-resolution files from the Sony A7 IV, the performance remained impressive for basic edits. The only noticeable slowdown occurred when using the brush tool for dodging and burning highlights and shadows. As layers accumulated, the processor and GPU began to struggle, revealing the machine's limitations.

The final test was to assess gaming performance. For this, we selected Tekken 8 and Hogwarts Legacy. It quickly became apparent when loading Tekken 8 that the machine was going to struggle graphically. While this program is often manageable on entry-level machines, the LarkBox S couldn't quite handle the demands. Once in the game (which took some time), even with settings reduced to a minimum and resolution down to Full HD, gameplay was possible but far from a good experience. With Hogwarts Legacy, the game was simply beyond the machine's capabilities and could not run effectively. However, less graphically and processor-intensive games, such as Portal 2 or the legacy Tomb Raider series, ran smoothly and without issue.

  • Performance: 4 / 5

Chuwi LarkBox S

(Image credit: Alastair Jennings)

Chuwi LarkBox S: Final verdict

Chuwi LarkBox S

(Image credit: Alastair Jennings)

Taking a look at what’s on offer here with the LarkBox S, you essentially have an entry-level machine with a premium build quality and very stylish, if discreet, looks. It also has an edge of flair about it with the LED lighting array at the front, which could easily lead you to mistake it for a gaming mini PC. In reality, this is an office machine ideally suited to everyday work with Word and Excel, as well as browsing the internet and a little bit of light multimedia editing.

While the processing and graphical power of the machine isn’t huge, as a day-to-day machine built to a high standard, it should last you a good number of years. Alongside the quality build and relatively decent feature set for the price, there’s also the ability to upgrade the internal SSD and RAM. While the storage upgrade is limited to a maximum of 1 TB, the fact that you can take the RAM to 64 GB gives you a little more flexibility for multimedia editing and handling larger Excel documents.

Overall, the LarkBox S is a great option for any business looking for a compact mini PC for office administration work. Its compact size, durability, and design mean it will fit nicely into any environment. It’s a shame that it comes with Windows Home rather than Windows Pro, but you can always upgrade if needed. As a solid, well-rounded machine with a quality build and finish, the LarkBox S is a worthwhile option that justifies the extra cost over some cheaper alternatives.

Should I buy a Chuwi LarkBox S?

Buy it if...

You want a high-quality build

If you’re looking for a machine that will withstand more than the occasional knock or can be used for van life or in a workshop, the high-quality casing and solid build should meet your needs.

You need plenty of connection options

Across the front and back, there are plenty of connection options, enabling you to connect up to three monitors as well as external hard drives and other accessories neatly and easily.

Don't buy it if...

You need large internal storage

The internal hardware used means that it’s limited to just one terabyte of internal storage with the upgrade, which isn’t a great deal. If you’re looking for a machine for video editing or multimedia, you might want something with more internal storage potential.

You want to play games

Even if you’re a casual gamer and want the option to play some of the latest games, even at reduced resolution, this machine and its integrated GPU will struggle.


For productivity desktops, we reviewed the best business computers.

Intel Arc B580 review: A spectacular success for Intel and a gateway to 1440p for gamers on a budget
5:00 pm | December 12, 2024

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Intel Arc B580: Two-minute review

When I reviewed the Arc A770 and A750, I said that these Alchemist GPUs were impressive first efforts for Intel's Arc range, but not yet at the level that they needed to be to compete with the likes of Nvidia and AMD in discrete graphics.

Well, with the release of the new Intel Arc B580 (2nd-gen Battlemage), there's no doubt that Intel has produced one of the best graphics cards of this generation, and given gamers on a budget an absolute gift just in time for the holidays.

For starters, let's talk about the price of this GPU. At just $249.99 / £249.99 / AU$439, the Arc B580 undercuts both Nvidia's and AMD's budget offerings, the RTX 4060 and RX 7600, while offering substantially better performance, making its value proposition untouchable at this price range.

While I'll dig deeper into the performance in a bit, I'll cut to the chase and point out the simple fact that neither the RTX 4060 nor the RX 7600 can game at 1440p without severely compromising graphics quality. Not only can the B580 perform this feat, it does so brilliantly.

This comes down to some very straightforward spec choices that Intel made with its Battlemage debut that, especially in hindsight, make Nvidia and AMD's respective decisions even more baffling. First, with a VRAM pool of 12GB, the B580 can hold the larger texture files needed for 1440p gaming, whereas the RTX 4060 Ti cannot, due to its 8GB VRAM loadout.

Then there's the B580's wider 192-bit memory interface, compared to the RTX 4060 Ti's and RX 7600 XT's 128-bit. While this might seem like an obscure spec, it's the secret sauce for the B580. This beefier interface allows it to process those larger texture files much faster than its competitors, so this GPU can fully leverage its bigger VRAM pool in a way that Nvidia and AMD's competing cards simply can't, even with larger VRAM configurations.
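The effect of that wider bus is easy to sketch with back-of-the-envelope arithmetic: peak memory bandwidth is simply the bus width multiplied by the per-pin data rate. Here's a minimal illustration in Python, assuming an 18 Gbps GDDR6 data rate for both cards purely to isolate the effect of bus width (the review doesn't quote memory clocks, so that figure is an assumption):

```python
# Theoretical peak memory bandwidth:
# (bus width in bits / 8 bits per byte) * effective data rate per pin (Gbps)
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 192-bit interface (B580-style) vs 128-bit (RTX 4060-class), both at an
# assumed 18 Gbps, so only the bus width differs between the two cases.
wide = peak_bandwidth_gbps(192, 18)    # 432.0 GB/s
narrow = peak_bandwidth_gbps(128, 18)  # 288.0 GB/s
print(f"192-bit: {wide:.0f} GB/s, 128-bit: {narrow:.0f} GB/s "
      f"({wide / narrow - 1:.0%} more)")
```

Holding the data rate fixed, the 192-bit interface alone yields 50% more theoretical bandwidth, which is what lets the B580 keep its larger VRAM pool fed.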

Boiling all this down, you end up with a budget-class GPU that can get you fast 1440p framerates the likes of which we haven't seen since the RTX 3060 Ti.

Even more impressive, in my mind, is that I did not encounter a single game where there was some kind of quirk or hiccup caused by the driver. With the Arc Alchemist cards last year, there were issues with some games not running well because of inadequate driver support, or a game's reliance on an older version of DirectX that the Alchemist GPUs weren't optimized for. I didn't encounter any of those problems this time around. The Intel graphics team's long, hard work on getting Arc's drivers up to par has definitely paid off.

If there's a criticism I can make of this graphics card, it's that its creative performance isn't as good as Nvidia's. But given the entire creative world's reliance on Nvidia's bespoke CUDA instruction set, neither Intel nor AMD was ever really going to be able to compete here.

Fortunately, the Intel Arc B580 is a graphics card for gaming, and for any gamer looking to play at 1440p resolution on the cheap, the B580 is really the only graphics card that can do it, making it the only GPU you should be considering at this price point.

Intel Arc B580: Price & availability

An Intel Arc B580 resting upright on its retail packaging

(Image credit: Future / John Loeffler)

The Intel Arc B580 is available in the US, UK, and Australia, and has been from December 13, 2024, starting at $249.99, £249.99, and AU$439 respectively. Third-party graphics card partners like Acer, ASRock, and others will have their own variants of the B580, and their prices may be higher, depending on the card.

The closest competition for the Arc B580 in terms of price are the Nvidia RTX 4060 and AMD RX 7600, both of which have a $20-$50 higher MSRP. And while Nvidia and AMD are preparing to roll out their next-gen graphics cards starting next month, it will still be a few months after the initial flagship launches before either company's budget offerings are announced. So, the B580 is the only current-gen GPU available for under $250 / £250 / AU$450 at the moment, and will likely remain so for many months to come.

  • Value: 5/5

Intel Arc B580: Specifications

The video output ports on the Intel Arc B580

(Image credit: Future / John Loeffler)

Intel Arc B580: Architecture & features

A masculine hand holding up the Intel Arc B580

(Image credit: Future / John Loeffler)

The Intel Arc B580 is the first discrete GPU from Intel based on its new Xe2 graphics architecture, codenamed Battlemage, and there are a lot of low-level changes over the previous-gen Intel Arc Alchemist. Many of these are small tweaks to the architectural design, such as the move from SIMD32 to SIMD16 instructions, but when taken together, all of these small changes add up to a major overhaul of the GPU.

That, in addition to using TSMC's 5nm process, means that even though the GPU itself has become physically smaller in just about every measure, it's much more powerful.

The B580 has a roughly 17% reduction in compute units from the Arc A580 and about 10% fewer transistors, but Intel says that its various architectural changes produce about 70% better performance per compute unit (or Xe core, as Intel calls it). I haven't tested or reviewed the Intel Arc A580, so I can't say for certain if that claim holds up, but there has definitely been a major performance gain gen-on-gen based on my experience with the higher-end Arc Alchemist cards. We also can't ignore the substantially faster boost clock of 2,850MHz, up from 1,700MHz for the A580.
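As a sanity check on those figures, the two claims quoted above can simply be multiplied together: roughly 17% fewer Xe cores, each claimed to do about 70% more work, implies a net gen-on-gen uplift of around 40%. A quick sketch:

```python
# Back-of-the-envelope check of Intel's gen-on-gen claim as quoted above:
# ~17% fewer Xe cores, but ~70% more performance per core.
core_factor = 1 - 0.17       # relative core count, B580 vs A580
per_core_factor = 1 + 0.70   # claimed per-core performance gain
overall = core_factor * per_core_factor
print(f"Implied overall uplift: ~{overall - 1:.0%}")  # ~41%
```

That implied ~40% uplift doesn't account for the much higher boost clock, so the real-world gain could be larger still.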

Outside of the GPU architecture, there is also a smaller memory bus, with the A580's 256-bit interface dropping down to 192-bit for the B580, but the B580 features a 50% increase in its video memory pool, as well as a faster memory clock.

  • Specs & features: 4.5 / 5

Intel Arc B580: Design

The brand marking on the Intel Arc B580

(Image credit: Future / John Loeffler)

The Intel Arc B580 Limited Edition reference card is what you'd call the 'base' version of this GPU, but don't call it basic.

Despite its all-black-with-white-accent-lettering appearance, this is a good-looking graphics card, much like the initial Arc Alchemist GPUs before it, thanks to its matte, textured black shroud, dual-fan cooling, and rather understated aesthetic.

In a PC component world full of ostentatious, overly aggressive and flashy designs, there is something almost respectable about a graphics card in 2024 that presents itself without gimmicks, almost daring you to underestimate its capabilities due to its lack of RGB.

That said, there is one noticeable difference with this graphics card's design: the open 'window' over the internal heatsink to help with airflow and cooling. Unfortunately, the HWInfo64 utility I use to measure temperature and power draw for the GPUs I review couldn't read the Arc B580 during testing, so I can't tell you how much of a difference this window makes compared to something like the Intel Arc A750—but it certainly won't hurt its thermals.

Beyond that, the card also sports a single 8-pin power connector, in keeping with its 190W TBP, so you can pretty much guarantee that if you already have a discrete GPU in your system, you'll have the available power cables from your PSU required to use this GPU.

It's also not a very large graphics card, though at about 10.7 inches / 272mm it is larger than some RTX 4060 and RX 7600 GPUs; third-party variants might be more compact. In any case, it's a dual-slot card, so it'll fit in place as an upgrade for just about any graphics card currently in your PC.

  • Design: 4.5 / 5

Intel Arc B580: Performance

An Intel Arc B580 running on a test bench

(Image credit: Future / John Loeffler)

OK, so now we come to why I am absolutely in love with this graphics card: performance.

Unfortunately, I don't have an Intel Arc A580 card on hand to compare this GPU to, so I can't directly measure how the B580 stacks up to its predecessor. But I can compare the B580 to its current competition, as well as the Intel Arc A750, which prior to this release was selling at, or somewhat below, the price of this graphics card, and has comparable specs.

In terms of pure synthetic performance, the Arc B580 comes in second to the Nvidia RTX 4060 Ti, performing about 10% slower overall. That said, there were some tests, like 3DMark Fire Strike Ultra, Wild Life Extreme (and Wild Life Extreme Unlimited), and Time Spy Extreme where the extra VRAM allowed the Arc B580 to pull ahead of the much more expensive Nvidia RTX 4060 Ti. The Arc B580 did manage to outperform the RTX 4060 by about 12%, however.

Creative workloads aren't the Arc B580's strongest area, with Nvidia's RTX 4060 and RTX 4060 Ti performing substantially better. This might change once the PugetBench for Creators Photoshop benchmark gets updated, however, as it crashed during every single test I ran, regardless of which graphics card I was using.

Notably, the Intel Arc B580 encoded 4K video down to 1080p at a faster rate using Intel's H.264 codec in Handbrake 1.6.1 than any of the other cards did using Nvidia's or AMD's H.264 options, so this is something for game streamers to consider if they're looking for a card to process their video on the fly.

But what really matters with this GPU is gaming, and if you compare this graphics card's 1080p performance to the competition, you'll have to go with the roughly 60% more expensive Nvidia RTX 4060 Ti to beat it, and even then it's not a crushing defeat for Intel. While I found the Arc B580 is about 17% slower than the RTX 4060 Ti on average at 1080p (with no ray tracing or upscaling), it still hits 82 FPS on average overall and actually has a slightly higher minimum/1% FPS of just under 60 FPS.

The AMD RX 7600 XT, Intel Arc A750, and Nvidia RTX 4060 don't even come close to reaching these kinds of numbers, with the Arc B580 scoring a roughly 30% faster average 1080p FPS and an incredible 52% faster minimum/1% FPS advantage over the Nvidia RTX 4060, which comes in a very distant third place among the five GPUs being tested. All in all, it's an impressive performance from the Intel Battlemage graphics card.

Also worth noting is that the Intel Arc B580's ray-tracing performance is noticeably better than AMD's, and not that far behind Nvidia's, though its upscaling performance lags a bit behind AMD and Nvidia at 1080p.

Even more impressive, though, is this card's 1440p performance.

Typically, if you're going to buy any 1440p GPU, not even the best 1440p graphics card, you should expect to pay at least $400-$500 (about £320-£400 / AU$600-AU$750). And to really qualify as a 1440p GPU, you need to hit an average of 60 FPS overall, with an average FPS floor of about 40 FPS. Anything less than that, and you're going to have an uneven experience game-to-game.
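That working definition of a '1440p GPU' can be written down as a trivial predicate; the FPS figures in the example below are hypothetical, purely to show the cutoff in action:

```python
# The review's rule of thumb for a "1440p GPU": roughly 60 FPS average
# overall, with a minimum/1% FPS floor of about 40 FPS.
def qualifies_for_1440p(avg_fps: float, floor_fps: float) -> bool:
    """True if a card clears both the average and the 1%-low bar."""
    return avg_fps >= 60 and floor_fps >= 40

# Hypothetical numbers, just to illustrate the threshold:
print(qualifies_for_1440p(72.0, 48.0))  # True: clears both bars
print(qualifies_for_1440p(55.0, 42.0))  # False: average too low
```

A card that misses either bar will deliver the uneven game-to-game experience described above, which is why both conditions matter.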

In this regard, the only two graphics cards I tested that qualify are the Nvidia RTX 4060 Ti and the Intel Arc B580, and they are very close to each other in terms of 1440p performance. (I can give an honorable mention to the Nvidia RTX 4060, which almost got there, but not quite).

While Nvidia has certain built-in advantages owing to its status as the premier GPU brand (pretty much any game is optimized for Nvidia hardware by default), at 1440p it only barely ekes out a win over the Intel Arc B580. And that's ultimately down to its stronger native ray-tracing performance, a scenario which pretty much no one opts for. If you're going to use ray tracing, you're going to use upscaling, and in that situation, the RTX 4060 Ti and Arc B580 are effectively tied at 1440p.

And this 1440p performance in particular is why I'm so enthusiastic about this graphics card. While this is the performance section of the review, I can't help but talk about the value that this card represents for gamers—especially the growing number of 1440p-aspiring gamers out there.

Prior to the Intel Arc B580, gaming at 1440p—which is the PC gaming sweet spot; believe me, I've extensively tested nearly every GPU of the past four years at 1440p—was something reserved for the petit bourgeois of PC gamers. These are the folks not rich enough to really go in for the best 4K graphics cards, but they've got enough money to buy a 1440p monitor and a graphics card powerful enough to drive it.

This used to mean something approaching a grand just for these two items alone, locking a lot of gamers into incremental 1080p advances for two successive generations. No more.

Now, with an entry-level 1440p monitor coming in under $300 / £300 / AU$450, it's entirely possible to upgrade your rig for 1440p gaming for about $500 / £500 / AU$750 with this specific graphics card (and only this graphics card), which is absolutely doable for a hell of a lot of gamers out there who are still languishing at 1080p.

Ultimately, this, more than anything, raises the Intel Arc B580 into S-Tier for me, even though Nvidia's $399.99 RTX 4060 Ti GPU gets slightly better performance. The Nvidia RTX 4060 Ti just doesn't offer this kind of value for the vast majority of gamers out there, and even with its improved performance since its launch, the 4060 Ti is still very hard to recommend.

The Nvidia RTX 4060, meanwhile, can't keep up with the B580 despite being 20% more expensive. And with the AMD RX 7600 XT, with its $329.99 MSRP (about £250 / AU$480 RRP), falling noticeably behind the B580, the RX 7600 (which I haven't had a chance to retest yet, and which also carries a slightly higher MSRP than the B580) doesn't stand a chance.

And, it has to be emphasized, I experienced none of the driver issues with the Intel Arc B580 that I did when I originally reviewed the Intel Arc A750 and Arc A770. Every game I tested ran perfectly well, even if something like Black Myth: Wukong ran much better on the two Nvidia cards than it did on Intel's GPUs. Tweak some settings and you'll be good to go.

This was something that just wasn't the case with the previous-gen Arc graphics cards at launch, and it truly held Intel back at the time. In one of my Intel Arc Alchemist reviews, I compared that generation of graphics cards to fantastic journeyman efforts that were good, but maybe not ready to be put out on the show floor. No more. Intel has absolutely graduated to full GPU maker status, and has done so with a card more affordable than the cheapest graphics cards its competition has to offer.

Simply put, for a lot of cash-strapped gamers out there, the Intel Arc B580's performance at this price is nothing short of a miracle, and it makes me question how Intel of all companies was able to pull this off while AMD and Nvidia have not.

Even if you don't buy an Intel Arc B580, give Intel its due for introducing this kind of competition into the graphics card market. If Intel can keep this up for the B570, and hopefully the B770 and B750, then Nvidia and AMD will have no choice but to rein in their price inflation with the next-gen cards they plan to offer next year, making it a win-win for every gamer looking to upgrade.

  • Performance: 4.5 / 5

Intel Arc B580: Should you buy it?

A masculine hand holding an Intel Arc B580

(Image credit: Future / John Loeffler)

Buy the Intel Arc B580 if...

You want an extremely affordable 1440p graphics card
A 1440p graphics card can be quite expensive, but the Intel Arc B580 is incredibly affordable.

You're looking for great gaming performance
The Intel Arc B580 delivers incredible framerates for the price.

Don't buy it if...

You're looking for a budget creative GPU
While the B580 isn't terrible, if you're looking for a GPU for creative work, there are better cards out there.

You want a cheap GPU for AI workloads
The Intel Arc B580 might have dedicated AI hardware, but it still lags behind Nvidia by a good amount.

Also consider

Nvidia GeForce RTX 4060
The Nvidia RTX 4060 is a better option for a lot of creative tasks on a budget, though its gaming performance isn't as strong despite the higher price.

Read the full Nvidia GeForce RTX 4060 review

Nvidia GeForce RTX 4060 Ti
If you want a strong 1080p and 1440p gaming GPU, but also need some muscle for creative or machine learning/AI workloads, this card is what you'll want, so long as you're willing to pay the extra premium in the price.

Read the full Nvidia GeForce RTX 4060 Ti review

How I tested the Intel Arc B580

The backplate of the Intel Arc B580

(Image credit: Future / John Loeffler)
  • I tested the Intel Arc B580 for about three weeks
  • I used my updated suite of graphics card benchmark tests
  • I used the Arc B580 as my primary work GPU for creative workloads like Adobe Photoshop, as well as some in-depth game testing
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

Over the course of about three weeks, I used the Intel Arc B580 as my primary workstation GPU when I wasn't actively benchmarking it.

This included using the graphics card for various creative workloads like Adobe Photoshop and light video encoding work.

I also used the B580 for some in-depth game testing, including titles like Black Myth: Wukong, Satisfactory, and other recently released games.

I've been doing graphics card reviews for TechRadar for more than two years now, and before that I did extensive GPU testing on a personal basis as a lifelong PC gamer. In addition, my computer science coursework for my Master's degree relied heavily on GPUs for machine learning and other computational workloads, so I know my way around every aspect of a GPU. As such, you can rest assured that my testing process is both thorough and sound.

  • Originally reviewed December 2024