I’ve been reviewing Dell laptops for years, and I’m still shocked at how much I like the Dell 14 Plus
10:00 pm | May 17, 2025

Dell 14 Plus: One-minute review

The Dell 14 Plus is one of the first rebranded Dell laptops to go on sale this year, and despite some growing pains it’s an auspicious start thanks to its solid performance, great portability and style, and an accessible price point.

The new 14 Plus is available now in the US, UK, and Australia, starting at $799.99 / £999 / AU$1,298, and features both Intel Lunar Lake and AMD Ryzen AI 300 processors. This makes it one of the best laptops for budget-conscious Windows users on the market right now without making too many compromises on style, portability, or features.

What you won’t get with the 14 Plus, however, is a professional workstation or one of the best gaming laptops, as the integrated graphics and processor options don’t have the kind of horsepower to churn through complex workloads like video editing or intense gaming at high settings.

But if you're in the market for a new laptop for general productivity or school work, everyday computing tasks, video calls, or video streaming, and you want good responsiveness, battery life, and portability, the Dell 14 Plus delivers pretty much exactly what you need to get the job done – and it even looks pretty good while doing it.

It’s not perfect (I’ll get to its faults soon enough), but for the price and the performance on offer the 14 Plus is easily one of the best Windows laptops going, and should be at the top of the list for students, remote workers, and just about anyone else who needs a solid notebook PC without breaking the bank.

Dell 14 Plus: Price & availability

A Dell 14 Plus on a desk with its lid facing outward

(Image credit: Future / John Loeffler)
  • How much does it cost? Starts at $799.99 / £999 / AU$1,298
  • When is it available? It's available now
  • Where can you get it? You can buy it in the US, UK, and Australia through Dell’s website and other retailers.

The Dell 14 Plus is available now in the US, UK, and Australia, starting at $799.99 / £999 / AU$1,298 for its base configuration, the specs of which vary depending on your region, and maxes out at $1,479.99 / £1,299 / AU$1,498.20.

Compared to something like the Acer Swift 14 AI or the Apple MacBook Air 13 (M4), the Dell 14 Plus almost always comes in cheaper when similarly specced, and in a couple of instances you get better specs with the 14 Plus for a lower price than competing devices like the Asus Zenbook A14, making it an attractive option for value shoppers who don’t want to sacrifice too much in the way of performance.

  • Value: 5 / 5

Dell 14 Plus: Specs

The internal components of the Dell 14 Plus

(Image credit: Future / John Loeffler)
  • Configurations vary considerably between the US, UK, and Australia
  • Options for both Intel Core Ultra 200V and AMD Ryzen AI 300 processors
  • No discrete graphics options

The starting configurations for the Dell 14 Plus vary slightly depending on your region, with the US and Australia sharing the same specs – an AMD Ryzen AI 5 340 CPU with Radeon 840M graphics, 16GB LPDDR5X memory, and a 14-inch FHD+, 300-nit, non-touch display – while the starting setup in the UK uses an Intel Core Ultra 7 256V with second-gen Intel Arc graphics, 16GB of slightly faster LPDDR5X-8533 memory, and a 14-inch 2.5K (2560 x 1600), 300-nit, non-touch IPS display. All starting configs come with 512GB of PCIe NVMe SSD storage.

Dell 14 Plus Base Specs

Region: US | UK | Australia
Price: $799.99 at Dell.com | £999 at Dell.com | AU$1,298 at Dell.com
CPU: AMD Ryzen AI 5 340 | Intel Core Ultra 7 256V | AMD Ryzen AI 5 340
GPU: AMD Radeon 840M Graphics | Intel Arc Xe2 (140V) | AMD Radeon 840M Graphics
Memory: 16GB LPDDR5X-7500 | 16GB LPDDR5X-8533 | 16GB LPDDR5X-7500
Storage: 512GB SSD | 512GB SSD | 512GB SSD
Screen: 14-inch 16:10 FHD+ (1200p), 300-nit, non-touch IPS | 14-inch 16:10 2.5K (1600p), 300-nit, non-touch IPS | 14-inch 16:10 FHD+ (1200p), 300-nit, non-touch IPS
Ports (US, Australia): 2 x USB-C 3.2 Gen 2 w/ DP and Power Delivery, 1 x USB 3.2 Gen 1 Type-A, 1 x HDMI 1.4, 1 x combo jack
Ports (UK): 1 x USB 3.2 Gen 1, 1 x USB 3.2 Gen 2 Type-C w/ DP 1.4 and Power Delivery, 1 x Thunderbolt 4 w/ DP 2.1 and Power Delivery, 1 x HDMI 2.1, 1 x combo jack
Battery: 64 WHr | 64 WHr | 64 WHr
Wireless: WiFi 7, BT 5.4 | WiFi 7, BT 5.4 | WiFi 7, BT 5.4
Camera: 1080p @ 30fps | 1080p @ 30fps | 1080p @ 30fps
Weight: 3.35 lbs (1.52kg) | 3.42 lbs (1.55kg) | 3.35 lbs (1.52kg)
Dimensions: 12.36 x 8.9 x 0.67 in (314 x 226.15 x 16.95mm) in all regions

The max spec for the Dell 14 Plus in the US and UK is identical: an Intel Core Ultra 9 288V with Intel Arc graphics, 32GB LPDDR5X memory, 1TB PCIe NVMe SSD, and a 14-inch 2.5K (2560 x 1600p) display with 90Hz refresh and 300 nits max brightness. In Australia, the max spec comes with an AMD Ryzen AI 7 350 processor with Radeon 840M graphics, 16GB LPDDR5X RAM, 1TB NVMe SSD storage, and a 14-inch FHD+ (1920 x 1200p) display with a max brightness of 300 nits.

Dell 14 Plus Top Specs

Region: US | UK | Australia
Price: $1,479.99 at Dell.com | £1,299 at Dell.com | AU$1,498.20 at Dell.com
CPU: Intel Core Ultra 9 288V | Intel Core Ultra 9 288V | AMD Ryzen AI 7 350
GPU: Intel Arc Xe2 (140V) Graphics | Intel Arc Xe2 (140V) Graphics | AMD Radeon 840M Graphics
Memory: 32GB LPDDR5X-8533 | 32GB LPDDR5X-8533 | 16GB LPDDR5X-7500
Storage: 1TB NVMe SSD | 1TB NVMe SSD | 1TB NVMe SSD
Screen: 14-inch 16:10 2.5K (1600p), 300-nit, non-touch IPS | 14-inch 16:10 2.5K (1600p), 300-nit, non-touch IPS | 14-inch 16:10 FHD+ (1200p), 300-nit, non-touch IPS
Ports (US, UK): 1 x USB 3.2 Gen 1, 1 x USB 3.2 Gen 2 Type-C w/ DP 1.4 and Power Delivery, 1 x Thunderbolt 4 w/ DP 2.1 and Power Delivery, 1 x HDMI 2.1, 1 x combo jack
Ports (Australia): 2 x USB-C 3.2 Gen 2 w/ DP and Power Delivery, 1 x USB 3.2 Gen 1 Type-A, 1 x HDMI 1.4, 1 x combo jack
Battery: 64 WHr | 64 WHr | 64 WHr
Wireless: WiFi 7, BT 5.4 | WiFi 7, BT 5.4 | WiFi 7, BT 5.4
Camera: 1080p @ 30fps | 1080p @ 30fps | 1080p @ 30fps
Weight: 3.42 lbs (1.55kg) | 3.42 lbs (1.55kg) | 3.35 lbs (1.52kg)
Dimensions: 12.36 x 8.9 x 0.67 in (314 x 226.15 x 16.95mm) in all regions

The configuration I tested for this review is only available in the US, but the UK has a very similar spec, just with a 512GB SSD rather than the 1TB in my review unit, while Australia doesn't yet have Intel-based configurations for the 14 Plus at all.

Dell 14 Plus Review Unit Specs

Price: $1,179.99 / £999 (about AU$1,830, but Intel configurations aren't yet available in Australia)
CPU: Intel Core Ultra 7 256V
GPU: Intel Arc Xe2 (140V) Graphics
Memory: 16GB LPDDR5X-8533
Storage: 1TB NVMe SSD (512GB NVMe SSD in the UK)
Screen: 14-inch 16:10 2.5K (1600p), 300-nit, non-touch IPS
Ports: 1 x USB 3.2 Gen 1, 1 x USB 3.2 Gen 2 Type-C w/ DP 1.4 and Power Delivery, 1 x Thunderbolt 4 w/ DP 2.1 and Power Delivery, 1 x HDMI 2.1, 1 x combo jack
Battery: 64 WHr
Wireless: WiFi 7, BT 5.4
Camera: 1080p @ 30fps
Weight: 3.42 lbs (1.55kg)
Dimensions: 12.36 x 8.9 x 0.67 in (314 x 226.15 x 16.95mm)

Generally, there aren’t a whole lot of configuration options available for the Dell 14 Plus right now, but the specs you do get – even with the base configurations – are all solid enough for general computing and productivity work, and some models can even manage some modest PC gaming and creative work.

  • Specs: 4 / 5

Dell 14 Plus: Design

The top lid of the Dell 14 Plus

(Image credit: Future / John Loeffler)
  • Thin and light form factor
  • Trackpad can be tricky at times
  • Display isn’t stellar, especially in daylight

The Dell 14 Plus takes a number of design influences from earlier Inspiron laptops and merges them somewhat with the former Dell XPS laptop series, and the end result is a fairly attractive ultrabook for the price.

A Dell 14 Plus on a desk with its lid facing outward

(Image credit: Future / John Loeffler)

It doesn’t have the same kind of premium materials that more expensive laptops use, but the 14 Plus doesn’t necessarily feel like a cheap laptop either. Where its design does disappoint me, though, is its keyboard, trackpad, and display.

The keyboard on a Dell 14 Plus

(Image credit: Future / John Loeffler)

The keys on the keyboard aren’t bad, but they’re not really great either, and they can sometimes feel stiffer than they should. The problem is compounded by a trackpad that isn’t the smoothest; I’ve found my fingers catching at times from even the light friction of swiping across its surface.

A Dell 14 Plus open to the TechRadar homepage

(Image credit: Future / John Loeffler)

The more ‘premium’ 2.5K display on my review unit works fine in an office environment or on the couch at home, but its 300-nit peak brightness makes it hard to use outside. If you like to work at an outdoor cafe or sit in the grass of a college quad, the display is going to be difficult to see clearly in daylight.

You get a decent selection of ports for a laptop this thin and a physical privacy shutter for the webcam, which I love to see. The webcam is 1080p @ 30 fps, which is good enough for most needs, as you can see from my selfie taken with the webcam.

The underside of the Dell 14 Plus

(Image credit: Future / John Loeffler)

The down-firing speakers aren’t very good, especially if the laptop is sitting on fabric like bedding. They work, though, and conference calls and general audio are fine in most cases. For music and movies, however, I recommend using headphones or one of the best Bluetooth speakers.

  • Design: 3.5 / 5

Dell 14 Plus: Performance

The Intel Core Ultra 7 sticker on a laptop

(Image credit: Future / John Loeffler)
  • Very good productivity and general computing performance
  • Hardware isn’t suited for intensive workloads like heavy gaming or video editing
  • Lags well behind similarly specced MacBook Air models

The Dell 14 Plus is targeted toward office workers, students, and others who need a responsive everyday device for web browsing, video streaming, and the like. In that regard, this laptop does exactly what it should, and does it well.

That’s not to say it's the best, though, as you can see when comparing its benchmark results against something like the MacBook Air 13 (M4), which comfortably outperforms the 14 Plus at pretty much every task.

But the 14 Plus consistently comes in second or third place against several other competing laptops on the market, including the Acer Swift 14 AI, Asus Zenbook A14, and the Microsoft Surface Laptop 7, while also coming in at a lower price point, making it my top pick for the best student laptop of 2025 so far.

Overall, only the Apple MacBook Air 13-inch with M4 offers better value for your money than the 14 Plus, which is something I really wasn’t expecting when I started working with the 14 Plus earlier this month, but it's a very welcome surprise.

  • Performance: 4 / 5

Dell 14 Plus: Battery Life

The battery life indicator on the Windows corner panel

(Image credit: Future / John Loeffler)
  • How long does it last on a single charge? 13 hours, 24 minutes
  • How long does it take to charge to 100%? 2 hours, 30 minutes with the included 65W adapter (1 hour, 4 minutes to charge to 50%)

If there’s one area in my testing where the Dell 14 Plus came in dead last, it’s battery life, but it’s not as bad as it might sound. While some laptops like the Zenbook A14 can run for just over 18 hours in our Web Surfing Battery Test, the Dell 14 Plus’s nearly 13 and a half hours isn’t terrible, especially given that we were praising laptops like the Inspiron 14 2-in-1 from 2022 for lasting longer than eight hours on a single charge.

As for charging time, the 64WHr battery takes a little over an hour to get from fully depleted to 50% using the included 65W USB-C power adapter, but with its Thunderbolt 4 or USB4 ports capable of higher power delivery, a higher-wattage adapter will speed things up.
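
To put those charging numbers in perspective, here's a quick back-of-the-envelope calculation of the average charging power implied by the battery capacity and the charge times above (a rough sketch in Python; real charging tapers off near full, so these are averages, not instantaneous rates):

```python
# Implied average charging power for the Dell 14 Plus, derived from the
# 64WHr battery capacity and the charge times reported in this review.
BATTERY_WHR = 64.0

measured = {
    "0-50%":  (0.5 * BATTERY_WHR, 1 + 4 / 60),   # 32 WHr in 1 hour, 4 minutes
    "0-100%": (1.0 * BATTERY_WHR, 2 + 30 / 60),  # 64 WHr in 2 hours, 30 minutes
}

for label, (energy_whr, hours) in measured.items():
    avg_watts = energy_whr / hours
    print(f"{label}: {energy_whr:.0f} WHr in {hours:.2f} h -> ~{avg_watts:.0f} W average into the battery")

# Prints roughly 30 W for the first half and 26 W for the full charge -- well
# under the 65W adapter's rating, which is why a higher-wattage USB-C PD
# adapter can shorten at least the early, non-tapered part of the charge.
```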

  • Battery Life: 4 / 5

Should you buy the Dell 14 Plus?

A Dell 14 Plus on a desk

(Image credit: Future / John Loeffler)
Dell 14 Plus Scorecard

Value: 5 / 5 – The Dell 14 Plus offers possibly the best value of any Windows laptop at this price.
Specs: 4 / 5 – The available specs are generally excellent, especially for the price.
Design: 3.5 / 5 – Aesthetically, the 14 Plus looks more premium than it is, but its keyboard, trackpad, speakers, and display could be better.
Performance: 4 / 5 – General computing and productivity performance are very good, but it falters under medium-intensity workloads, much less heavy-duty ones like gaming.
Battery Life: 4 / 5 – Not the longest-lasting battery life on the market, but still capable of many hours of use before you need to recharge.
Final Score: 4.1 / 5 – The Dell 14 Plus is a solid general-use and productivity notebook that’s great for work or school, but it makes some compromises to keep its price affordable. The trade-off is generally worth it, in the end.

Buy the Dell 14 Plus if...

You want solid productivity and general computing performance
For everyday use, school work, and productivity, the Dell 14 Plus is very good, especially for its price.

You want a laptop that doesn’t look too cheap
Aesthetically, the 14 Plus is a pretty great-looking device for the price, though if you look closely, you can spot its shortcomings.

Don't buy it if...

You need a high-performance laptop
If you’re looking to game or do resource-intensive work like video editing, this laptop won’t get the job done.

You want a really good-looking laptop
While the 14 Plus doesn’t look bad, it can't hold a candle to the most recent MacBook Air or Surface Laptop models.

Also consider

If my Dell 14 Plus review has you looking at other options, here are three other laptops you should consider instead...

Apple MacBook Air 13-inch (M4)
The most recent Apple MacBook Air 13-inch offers much better performance, battery life, and aesthetics than the Dell 14 Plus, though you’ll pay more for it.

Read our full Apple MacBook Air 13-inch (M4) review

Acer Swift 14 AI
For roughly the same price as the 14 Plus, the Acer Swift 14 AI with its Qualcomm Snapdragon X Elite chip offers similar performance to the Dell 14 Plus with better battery life, but it still has Windows app compatibility struggles.

Read our full Acer Swift 14 AI review

Asus Zenbook A14
While its performance lags behind the Dell 14 Plus, the battery life on this thing is unreal, making it a great pick for those who need a laptop that can go the distance.

Read the full Asus Zenbook A14 review

How I tested the Dell 14 Plus

  • I spent about two weeks with the Dell 14 Plus
  • I used it mostly for general computing and work tasks
  • I used our standard laptop benchmark suite for testing, along with other productivity and creative apps

I used the Dell 14 Plus for about two weeks both as an everyday laptop and as a dedicated work device. This involved a lot of writing, general productivity work (like Google Sheets), and some light creative work like photo editing in Adobe Photoshop.

I also put it through our standard benchmark testing suite, which includes industry-standard tools like Geekbench 6, 3DMark, and Shadow of the Tomb Raider's built-in gaming benchmark.

I’ve been testing laptops for TechRadar for more than five years, with dozens of reviews under my belt, so I know what a laptop should be capable of at this price point. As a media professional and former student, I’m also the target audience for this kind of laptop, so I’m well positioned to assess the quality of this device.

  • First reviewed May 2025
MediaTek unveils Helio G200 with slightly faster GPU and better HDR for videos 
5:51 pm | May 12, 2025

MediaTek is keeping its 4G series of chipsets alive, but the new Helio G200 is not a huge upgrade over the Helio G100, despite what the numbers suggest. The new chip brings higher GPU clock speeds, better HDR video quality for the camera, and new tech that improves the experience of using social networking apps in areas with spotty reception. At heart, this is still mostly the same 6nm chipset that the Helio G99 was. The G100 added 200MP camera support; the G200 improves on that with 12-bit DCG support for better HDR in videos. The CPU features 2x Cortex-A76 cores (2.2GHz) and 6x A55 (2.0GHz), same...

The AMD Radeon RX 9070 is too close to the RX 9070 XT to really stand out, but it’s a better value given the market
11:09 am | May 1, 2025

AMD Radeon RX 9070: Two-minute review

The AMD Radeon RX 9070 is a card that just might be saved by the economic chaos engulfing the GPU market right now.

With 'normal' price inflation pretty much rampant with every current-gen GPU, the price proposition for the RX 9070 might actually make it an appealing pick for gamers who're experiencing sticker shock when looking for the best graphics card for their next GPU upgrade.

That doesn't mean, unfortunately, that the AMD RX 9070 is going to be one of the best cheap graphics cards going, even by comparison with everything else that's launched since the end of 2024. With an MSRP of $549 / £529.99 / AU$1,229, the RX 9070 is still an expensive card, even if it's theoretically in line with your typical 'midrange' offering.

And, with the lack of an AMD reference card that might have helped anchor the RX 9070's price at Team Red's MSRP, you're going to pretty much be at the mercy of third-party manufacturers and retailers who can charge whatever they want for this card.

Comparatively speaking, though, even with price inflation, this is going to be one of the cheaper midrange GPUs of this generation, so if you're comparing a bunch of different GPUs, this one is likely to be the cheapest current-gen graphics card from either AMD or Nvidia in terms of real-world pricing right now (yes, that's even counting the RTX 5060 Ti, which is already selling for well above 150% of MSRP in many places).

The radeon logo on the AMD Radeon RX 9070

(Image credit: Future / John Loeffler)

Does that make this card worth the purchase? Well, that's going to depend on what you're being asked to pay for it. While it's possible to find RX 9070 cards at MSRP, they are rare, and so you're going to have to make a back-of-the-envelope calculation to see if this card is going to offer you the best value in your particular circumstance.
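
If you want to put numbers on that back-of-the-envelope calculation, a simple frames-per-dollar comparison does the job. In the sketch below, the 1440p averages come from the benchmark results discussed later in this review, while the street prices are hypothetical placeholders you would swap for whatever listings you actually find:

```python
# Frames-per-dollar value check. The 1440p averages are the figures discussed
# later in this review; the street prices are hypothetical placeholders only.
cards = {
    # name:      (avg 1440p fps, assumed street price in USD)
    "RX 9070":  (114, 599),
    "RTX 5070": (115, 679),
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price:.3f} fps per dollar at ${price}")

# Whichever card delivers more fps per dollar at the prices actually in front
# of you is the better value for your particular circumstance.
```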

I'm fairly confident, however, that it will. Had I the time to review this card when it first launched in March, I might have scored it lower based on its performance and price proximity to the beefier AMD Radeon RX 9070 XT.

Looking at both of those cards based on their MSRPs, there's no question that the RX 9070 XT is the much better graphics card, so I'd have recommended you spend the extra cash to get that card instead of this one.

Unfortunately, contrary to my hopes, the RX 9070 XT has been scalped almost as badly as the best Nvidia graphics cards of this generation, so that relatively small price difference on paper can be quite large in practice.

Given that reality, for most gamers, the RX 9070 is the best 1440p graphics card going, and can even get you some solid 4K gaming performance for a lot less than you're likely to find the RX 9070 XT or competing Nvidia card, even from the last generation.

If you're looking at this card and the market has returned to sanity and MSRP pricing, then definitely consider going for the RX 9070 XT instead of this card. But barring that happy contingency, given where everything is right now with the GPU market, the RX 9070 is the best AMD graphics card for 1440p gaming, and offers some of the best bang for your (inflationary) buck as you're likely to find today.

AMD Radeon RX 9070: Price & availability

An AMD Radeon RX 9070 sitting on its retail packaging with its fans visible

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $549 / £529.99 / AU$1,229, but retail price will likely be higher
  • When can you get it? The RX 9070 is available now
  • Where is it available? The RX 9070 is available in the US, UK, and Australia

The AMD Radeon RX 9070 is available now in the US, UK, and Australia for an MSRP of $549 / £529.99 / AU$1,229, respectively, but the price you'll pay for this card from third-party partners and retailers will likely be higher.

Giving credit where it's due, the RX 9070 has the exact same MSRP as the AMD Radeon RX 7900 GRE, which you can argue the RX 9070 is replacing. It also comes in at the same price as the RTX 5070's MSRP, and as I'll get into in a bit, for gaming performance the RX 9070 offers better value at MSRP.

Given how the RTX 5070 can rarely be found at MSRP, the RX 9070 is in an even stronger position compared to its competition.

  • Value: 4 / 5

AMD Radeon RX 9070: Specs

The power connector ports on an AMD Radeon RX 9070

(Image credit: Future / John Loeffler)
  • PCIe 5.0
  • 16GB VRAM
  • Specs & features: 4 / 5

AMD Radeon RX 9070: Design & features

  • No AMD reference card
  • Will be good for SFF cases

In terms of design, the RX 9070 doesn't have a reference card, so the card I reviewed is the Sapphire Pulse Radeon RX 9070.

This card, in particular, is fairly straightforward with few frills, but for those who don't want a whole lot of RGB lighting in their PC, this is more of a positive than a negative. RGB fans, however, will have to look at other AMD partner cards for their fix.

The card is a noticeably shorter dual-fan design compared to the longer triple-fan RX 9070 XT cards. That makes the RX 9070 a great option for small form factor PC cases.

  • Design: 3.5 / 5

AMD Radeon RX 9070: Performance

  • About 13% slower than RX 9070 XT
  • Outstanding 1440p gaming performance
  • Decent 4K performance
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

When it comes down to performance, the RX 9070 is a very strong graphics card that is somewhat overshadowed by its beefier 9070 XT sibling, but goes toe-to-toe against the RTX 5070 where it counts for most users, which is gaming.

On the synthetic side, the RX 9070 puts up some incredibly solid numbers, especially in pure rasterization workloads like 3DMark Steel Nomad, beating out the RTX 5070 by 13%. In ray tracing-heavy workloads like 3DMark Speed Way, meanwhile, the RX 9070 manages to come within 95% of the RTX 5070's performance.

As expected though, the RX 9070's creative performance isn't able to keep up with Nvidia's competing RTX 5070, especially in 3D modeling workloads like Blender. If you're looking for a cheap creative workstation GPU, you're going to want to go for the RTX 5070, no question.

But that's not really what this card is about. AMD cards are gaming cards through and through, and as you can see above, at 1440p, the RX 9070 goes blow for blow with Nvidia's midrange card so that the overall average FPS at 1440p is 114 against Nvidia's 115 FPS average (72 FPS to 76 FPS average minimums/1%, respectively).

Likewise, at 4K, the two cards are effectively tied, with the RX 9070 holding a slight 2 FPS edge over the RTX 5070, on average (50 FPS to 51 FPS minimum/1%, respectively).

Putting it all together, one thing in the Nvidia RTX 5070's favor is that it is able to tie things up with the RX 9070 at about 26 fewer watts under load (284W maximum power draw to the RTX 5070's 258W).

That's not the biggest difference, but even 26W of extra power can mean the difference between needing to replace your PSU and sticking with the one you have.
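
To put that efficiency gap in concrete terms, here is the same comparison expressed as frames per watt, using the 1440p averages and maximum power draw figures quoted above (maximum draw is only a rough proxy for sustained gaming consumption, so treat these as ballpark numbers):

```python
# Frames-per-watt comparison using the figures quoted above. Maximum power
# draw is a rough proxy for gaming consumption, so these are ballpark values.
cards = {
    # name:      (avg 1440p fps, max power draw in watts)
    "RX 9070":  (114, 284),
    "RTX 5070": (115, 258),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.2f} fps per watt")

# The RTX 5070 comes out ahead on efficiency (~0.45 vs ~0.40 fps/W), even
# though the two cards trade blows on raw frame rate.
```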

Under normal conditions, I'd argue that this would swing things in favor of Nvidia's GPU, but the GPU market is hardly normal right now, and so what you really need to look at is how much you're being asked to pay for either of these cards. Chances are, you're going to be able to find an RX 9070 for a good bit cheaper than the RTX 5070, and so its value to you in the end is likely going to be higher.

  • Performance: 4.5 / 5

Should you buy the AMD Radeon RX 9070?

A masculine hand holding an AMD Radeon RX 9070

(Image credit: Future / John Loeffler)

Buy the AMD Radeon RX 9070 if...

You want a fantastic 1440p graphics card
The RX 9070 absolutely chews through 1440p gaming with frame rates that can fully saturate most 1440p gaming monitors' refresh rates.

You don't want to spend a fortune on a midrange GPU
While the RX 9070 isn't cheap, necessarily, it's among the cheapest midrange cards you can get, even after factoring in scalping and price inflation.

Don't buy it if...

You want great creative performance
While the RX 9070 is a fantastic gaming graphics card, its creative performance (especially for 3D modeling work) lags behind Nvidia midrange cards.

Also consider

AMD Radeon RX 9070 XT
The RX 9070 XT is an absolute barnburner of a gaming GPU, offering excellent 4K performance and even better 1440p performance, especially if you can get it close to MSRP.

Read the full AMD Radeon RX 9070 XT review

Nvidia GeForce RTX 5070
The RTX 5070 essentially ties the RX 9070 in gaming performance in 1440p and 4K gaming, but has better power efficiency and creative performance.

Read the full Nvidia GeForce RTX 5070 review

How I tested the AMD Radeon RX 9070

  • I spent about two weeks with the RX 9070
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Samsung 9100 Pro 4TB SSD
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about two weeks with the AMD RX 9070, using it as my primary workstation GPU for creative work and gaming after hours.

I used my updated benchmarking process, which includes using built-in benchmarks on the latest PC games like Black Myth: Wukong, Cyberpunk 2077, and Civilization VII. I also used industry-standard benchmark tools like 3DMark for synthetic testing, while using tools like PugetBench for Creators and Blender Benchmark for creative workload testing.

I've reviewed more than three dozen graphics cards for TechRadar over the past three years, which has included hundreds of hours of dedicated GPU testing, so you can trust that I'm giving you the fullest picture of a graphics card's performance in my reviews.

  • Originally reviewed May 2025
I reviewed Lenovo’s answer to the Mac Studio – but can this mini desktop survive in the business world?
9:02 pm | April 11, 2025

The Apple Mac Studio made a huge splash when it entered the market a few years back. The form factor with that kind of power was nearly too good to be true. Now, the best mini PC manufacturers are replicating that style of desktop powerhouse.

The Lenovo ThinkCentre Neo Ultra is an excellent example of that. Lenovo took the exact size of the popular Mac Studio, built its own machine into that footprint, and pitched it as the business version of a Mac Studio.

For the most part, it delivers: excellent ports, support for up to eight displays (beating out the Mac Studio), an RTX 4060 GPU, and even a discrete AI NPU. But can this machine match the performance of the Mac Studio at its best?

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: Price and Availability

The Lenovo ThinkCentre Neo Ultra starts at around $3,000, but it's frequently discounted to under $2,000. If you spec this thing out, it can run over $5,000. The Lenovo ThinkCentre Neo Ultra is available for purchase through Lenovo.com and enterprise partners, so if you're looking to pick one up, I'd check Lenovo first to snag one of those great deals on this machine.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: Unboxing & first impressions

The Lenovo ThinkCentre Neo Ultra is nearly the exact dimensions of the Apple Mac Studio. It comes in a compact box with the cable and paperwork you'd expect. Unlike the popular silver on Macs, the ThinkCentre Neo Ultra comes in a Luna Gray chassis that looks more like what I'd expect a Lenovo device to look like.

Much like other compact desktops, the ThinkCentre Neo Ultra fits easily under a monitor, even one that isn't on a monitor arm. If you wanted to, you could also tuck it off to the side, keep it front and center to show off, or mount it behind the monitor or under the desk.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: Design & build quality

Specs

CPU: Up to Intel Core i9-14900 vPro
GPU: NVIDIA RTX 4060 8GB
RAM: Up to 64GB DDR5
Storage: Up to 2x 2TB M.2 PCIe 4.0 NVMe SSDs
Ports: 1x USB-C 20Gbps, 1x 3.5mm combo jack, 2x USB-A 5Gbps, 4x USB-A 10Gbps, 2x HDMI 2.1, 4x DisplayPort 1.4a, 2.5GbE LAN
Optional: Configurable punch-out ports (HDMI, VGA, USB-C, LAN, etc.)
Connectivity: Wi-Fi 6E, Bluetooth 5.3
Dimensions: 7.68” x 7.52” x 4.25” (3.6L), 7.7 lbs

The Lenovo ThinkCentre Neo Ultra is a very professional and simple-looking machine. Lenovo has done a great job of making a machine that doesn't stand out and isn't overly flashy, but still looks professional and top-tier. It's got a solid frame with edges rounded off just enough that they aren't sharp, without making the chassis look round. The top panel looks like the roof of a building with a row of windows, leaving plenty of room for ventilation to keep this powerhouse from overheating.

For those who like being able to upgrade RAM and SSD on their own, it’s great to see that the bottom panel can easily be removed. This is something that I see less and less in computers in general. But it’s a vital component for some users.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: In use

I work predominantly from a laptop. It has always appealed to me to have a single computer that I can easily take from place to place. However, having used this computer in my rotation of devices for the last several weeks, I can say there is something fantastic about a desktop that is always set up and ready to rock: no dock needed, no charging needed, already plugged into multiple displays and waiting whenever you sit down at your desk.

As you can see in the desk shots, I usually have this on a single monitor setup. However, I ran five displays on this at one time simply because that was the number I had with me at the time of testing. I can confidently say that this is an excellent desktop if you are working primarily on business tasks and want to use multiple displays.

There's no need for an external graphics card or a DisplayLink dock like I need with my M2-series MacBook Pro, and there were no issues running different types of monitors together, something I'd seen questions about. I was running a 49-inch ultrawide, a 32-inch, a 27-inch, a portable monitor, and a TV, all without any issues.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

During my testing, I used this desktop for a few virtual meetings, a lot of writing and admin work, some basic photo editing, some video rendering, a lot (40+) of heavy Chrome tabs (multiple extensive project management tools), Slack, Asana, Jira, Basecamp, ZenDesk, Hubspot, Postman, VS Code, WhatsApp, email, and more. I worked on some web design, system automation, large Google Docs with 40+ pages of 11pt font and many comments, and so on. I tried to crash this computer, but it handled everything while easily outputting to an abundance of screen real estate.

I wouldn’t use this machine for heavy video editing, because I don’t think it's one of the best video editing computers available, but it is one of the best business computers in this form factor, ideal for administrative or more standard business tasks like project management, documents, emails, virtual meetings, and so on.

After testing, I also see a lot of advantages to using this if you're a project manager or supervisor. It would allow for ample displays to show everything that kind of role needs to see all at once, without compromise.

Lenovo ThinkCentre Neo Ultra: Final verdict

The ThinkCentre Neo Ultra is a powerhouse of a machine. I’d still choose a Mac Studio for creative tasks, but this machine is a genuine contender for classic business performance. It’s got better video outputs, is just as compact, and has leading enterprise security and great software.

For business professionals, developer teams, or even things like conference rooms, command centers, or other setups that need a lot of screens, this machine is a fantastic one to consider. Just know that it doesn’t have Thunderbolt, so file transfers will be quite a bit slower than on something that does support a version of Thunderbolt.


For extra power, we reviewed the best workstations you can get right now.

Qualcomm Snapdragon 8s Gen 4 announced with Kryo CPU, Adreno 825 GPU
4:54 pm | April 2, 2025

Qualcomm just announced its latest premium mobile chipset – the Snapdragon 8s Gen 4. The SM8735, as it is known internally, is fabbed on TSMC’s N4P 4nm process node and uses older Kryo CPU cores instead of the Oryon cores found in the flagship Snapdragon 8 Elite chip. The CPU configuration consists of 1x Cortex-X4 prime core clocked at up to 3.2GHz, 3x Cortex-A720 cores at 3.0GHz, 2x Cortex-A720 cores at 2.8GHz, and 2x Cortex-A720 cores at 2.0GHz. According to Qualcomm, the new chip is 31% faster than last year’s Snapdragon 8s Gen 3 while drawing 39% less power. The chip also gets a new Adreno 825...

The AMD RX 9070 XT delivers exactly what the market needs with stunning performance at an unbeatable price
5:00 pm | March 5, 2025

AMD Radeon RX 9070 XT: Two-minute review

AMD had one job to do with the launch of its RDNA 4 graphics cards, spearheaded by the AMD Radeon RX 9070 XT, and that was to not get run over by Blackwell too badly this generation.

With the RX 9070 XT, AMD not only manages to hold its own against the GeForce RTX monolith, it also perfectly positions Team Red to take advantage of the growing discontent among gamers upset over Nvidia's latest GPUs, with one of the best graphics cards I've ever tested.

The RX 9070 XT is without question the most powerful consumer graphics card AMD's put out, beating the AMD Radeon RX 7900 XTX overall and coming within inches of the Nvidia GeForce RTX 4080 in 4K and 1440p gaming performance.

It does so with an MSRP of just $599 (about £510 / AU$870), which is substantially lower than those two cards' MSRPs, much less their asking prices online right now. This matters because AMD traditionally hasn't faced the kind of scalping and price inflation that Nvidia's GPUs experience (it does happen, obviously, but not nearly to the same extent as with Nvidia's RTX cards).

That means, ultimately, that gamers who look at the GPU market and find empty shelves, extremely distorted prices, and uninspiring performance for the price they're being asked to pay have an alternative that will likely stay within reach, even if price inflation keeps it above AMD's MSRP.

The RX 9070 XT's performance comes at a bit of a cost though, such as the 309W maximum power draw I saw during my testing, but at this tier of performance, this actually isn't that bad.

This card also isn't too great when it comes to non-raster creative performance and AI compute, but no one is looking to buy this card for its creative or AI performance, as Nvidia already has those categories on lock. No, this is a card for the gamers out there, and for that, you just won't find a better one at this price. Even if the price does get hit with inflation, it'll still likely be way lower than what you'd have to pay for an RX 7900 XTX or RTX 4080 (assuming you can find them at this point), making the AMD Radeon RX 9070 XT a gaming GPU that everyone can appreciate and maybe even buy.

AMD Radeon RX 9070 XT: Price & availability

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $599 (about £510 / AU$870)
  • When can you get it? The RX 9070 XT goes on sale March 6, 2025
  • Where is it available? The RX 9070 XT will be available in the US, UK, and Australia at launch

The AMD Radeon RX 9070 XT is available as of March 6, 2025, starting at $599 (about £510 / AU$870) for reference-spec third-party cards from manufacturers like Asus, Sapphire, Gigabyte, and others, with OC versions and those with added accoutrements like fancy cooling and RGB lighting likely selling for higher than MSRP.

At this price, the RX 9070 XT comes in about $150 cheaper than the RTX 5070 Ti, and about $50 more expensive than the RTX 5070 and the AMD Radeon RX 9070, which also launches alongside the RX 9070 XT. This price also puts the RX 9070 XT on par with the MSRP of the RTX 4070 Super, though this card is getting harder to find nowadays.

While I'll dig into performance in a bit, given the MSRP (and the reasonable hope that this card will be findable at MSRP in some capacity), the RX 9070 XT's value proposition is second only to the RTX 5070 Ti's if you're going by MSRPs. And since price inflation on the RTX 5070 Ti will persist for some time at least, in many cases you'll likely find the RX 9070 XT offers the best performance for the price of any enthusiast card on the market right now.

  • Value: 5 / 5

AMD Radeon RX 9070 XT: Specs

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • PCIe 5.0, but still just GDDR6
  • Hefty power draw

The AMD Radeon RX 9070 XT is the first RDNA 4 card to hit the market, so it's worth digging into its architecture for a bit.

The new architecture is built on TSMC's N4P node, the same as Nvidia Blackwell, and in a move away from AMD's MCM push with the last generation, the RDNA 4 GPU is a monolithic die.

As there's no direct predecessor for this card (or for the RX 9070, for that matter), there isn't much we can compare the RX 9070 XT against apples-to-apples, but if it had a last-gen equivalent, it would sit roughly between the RX 7800 XT and the RX 7900 GRE.

The Navi 48 GPU in the RX 9070 XT sports 64 compute units, breaking down into 64 ray accelerators, 128 AI accelerators, and 64MB of L3 cache. Its cores are clocked at 1,600MHz to start, but can run as fast as 2,970MHz, just shy of the 3GHz mark.

It uses the same GDDR6 memory as the last-gen AMD cards, with a 256-bit bus and a 644.6GB/s memory bandwidth, which is definitely helpful in pushing out 4K frames quickly.
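
As a quick sanity check on that figure, peak memory bandwidth is simply the bus width (in bytes) multiplied by the per-pin data rate. The sketch below infers the per-pin rate from the quoted 644.6GB/s rather than taking it from an official spec sheet:

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# The ~20.1 Gbps per-pin rate here is inferred from the quoted 644.6GB/s,
# not taken from an official spec sheet.
BUS_WIDTH_BITS = 256
DATA_RATE_GBPS = 20.1  # effective GDDR6 data rate per pin (inferred)

bandwidth_gbs = (BUS_WIDTH_BITS / 8) * DATA_RATE_GBPS
print(f"Peak bandwidth: ~{bandwidth_gbs:.0f} GB/s")  # ~643 GB/s, in line with the quoted 644.6GB/s
```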

The TGP of the RX 9070 XT is 304W, which is a good bit higher than the RX 7900 GRE, though for that extra power, you do get a commensurate bump up in performance.

  • Specs: 4 / 5

AMD Radeon RX 9070 XT: Design

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • No AMD reference card
  • High TGP means bigger coolers and more cables

There's no AMD reference card for the Radeon RX 9070 XT, but the unit I got to test was the Sapphire Pulse Radeon RX 9070 XT, which I imagine is pretty indicative of what we can expect from the designs of the various third-party cards.

The 304W TGP all but ensures that any version of this card you find will be a triple-fan cooler over a pretty hefty heatsink, so it's not going to be a great option for small form factor cases.

Likewise, that TGP puts it just over the line where it needs a third 8-pin PCIe power connector, something you may or may not have available in your rig, so keep that in mind. Even if you do have three spare power connectors, cable management will almost certainly be a hassle.

After that, it's really just about aesthetics, as the RX 9070 XT (so far) doesn't have anything like the dual pass-through cooling solution of the RTX 5090 and RTX 5080, so it's really up to personal taste.

As for the card I reviewed, the Sapphire Pulse shroud and cooling setup on the RX 9070 XT was pretty plain, as far as desktop GPUs go, but if you're looking for a non-flashy look for your PC, it's a great-looking card.

  • Design: 4 / 5

AMD Radeon RX 9070 XT: Performance

An AMD Radeon RX 9070 XT in a test bench

(Image credit: Future / John Loeffler)
  • Near-RTX 4080 levels of gaming performance, even with ray tracing
  • Non-raster creative and AI performance lags behind Nvidia, as expected
  • Likely the best value you're going to find anywhere near this price point
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

Simply put, the AMD Radeon RX 9070 XT is the gaming graphics card that we've been clamoring for this entire generation. While it shows some strong performance in synthetics and raster-heavy creative tasks, gaming is where this card really shines, managing to come within 7% of the RTX 4080 overall and within 4% of its gaming performance specifically. For a card launching at half the RTX 4080's launch price, this is a fantastic showing.

The RX 9070 XT is squaring up against the RTX 5070 Ti, however, and here the RTX 5070 Ti does manage to pull well ahead of the RX 9070 XT, but it's much closer than I thought it would be starting out.

On the synthetics side, the RX 9070 XT excels at rasterization workloads like 3DMark Steel Nomad, while the RTX 5070 Ti wins out in ray-traced workloads like 3DMark Speed Way, as expected, but AMD's 3rd generation ray accelerators have definitely come a long way in catching up with Nvidia's more sophisticated hardware.

Also, as expected, when it comes to creative workloads, the RX 9070 XT performs very well in raster-based tasks like photo editing, and worse at 3D modeling in Blender, which is heavily reliant on Nvidia's CUDA instruction set, giving Nvidia an all but permanent advantage there.

In video editing, the RX 9070 XT likewise lags behind, though it's still close enough to Nvidia's RTX 5070 Ti that video editors won't notice much difference, even if the difference is there on paper.

Gaming performance is what we're on about though, and here the sub-$600 GPU holds its own against heavy hitters like the RTX 4080, RTX 5070 Ti, and Radeon RX 7900 XTX.

In 1440p gaming, the RX 9070 XT is about 8.4% faster than the RTX 4070 Ti and RX 7900 XTX, just under 4% slower than the RTX 4080, and about 7% slower than the RTX 5070 Ti.

This strong performance carries over into 4K gaming as well, thanks to the RX 9070 XT's 16GB VRAM. Here, it's about 15.5% faster than the RTX 4070 Ti and about 2.5% faster than the RX 7900 XTX. Against the RTX 4080, the RX 9070 XT is just 3.5% slower, while it comes within 8% of the RTX 5070 Ti's 4K gaming performance.
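
For reference, the "X% faster" and "X% slower" figures above are simple relative differences in average frame rate. The frame rates in this small sketch are made-up placeholders purely to show the arithmetic, not actual benchmark results from this review:

```python
# Relative performance difference = (card_fps - baseline_fps) / baseline_fps.
# The numbers below are made-up placeholders to illustrate the arithmetic only.
def relative_percent(card_fps: float, baseline_fps: float) -> float:
    return (card_fps - baseline_fps) / baseline_fps * 100

# Example: a card averaging 83 fps against a baseline averaging 80 fps
print(f"{relative_percent(83, 80):+.1f}% vs baseline")  # prints +3.8%, i.e. ~3.8% faster
```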

When all is said and done, the RX 9070 XT doesn't quite overpower one of the best Nvidia graphics cards of the last gen (and definitely doesn't topple the RTX 5070 Ti), but given its performance class, its power draw, its heat output (which wasn't nearly as bad as the power draw might indicate), and most of all its price, the RX 9070 XT is easily the best value of any graphics card playing at 4K.

And given Nvidia's position with gamers right now, AMD has a real chance to win over some converts with this graphics card, and anyone looking for an outstanding 4K GPU absolutely needs to consider it before making their next upgrade.

  • Performance: 5 / 5

Should you buy the AMD Radeon RX 9070 XT?

Buy the AMD Radeon RX 9070 XT if...

You want the best value proposition for a high-end graphics card
The performance of the RX 9070 XT punches way above its price point.

You don't want to pay inflated prices for an Nvidia GPU
Price inflation is wreaking havoc on the GPU market right now, but this card might fare better than Nvidia's RTX offerings.

Don't buy it if...

You're on a tight budget
If you don't have a lot of money to spend, this card is likely more than you need.

You need strong creative or AI performance
While AMD is getting better at creative and AI workloads, it still lags far behind Nvidia's competing offerings.

How I tested the AMD Radeon RX 9070 XT

  • I spent about a week with the AMD Radeon RX 9070 XT
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week with the AMD Radeon RX 9070 XT, benchmarking it, using it day to day, and digging into the card's hardware to come to my assessment.

I used industry-standard benchmark tools like 3DMark, Cyberpunk 2077, and PugetBench for Creators to get results that are directly comparable with competing graphics cards, all of which have been tested using the same test bench setup listed above.

I've reviewed more than 30 graphics cards in the last three years, and so I've got the experience and insight to help you find the best graphics card for your needs and budget.

  • Originally reviewed March 2025
The featherweight Tecno Megabook S14 has a Snapdragon X Elite chip, optional GPU dock
3:23 pm |

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

Tecno is on a roll – after unveiling the Camon 40 series and AI smart glasses, the company now turns its attention to laptops. Meet the new Tecno Megabook S14, an ultra-portable 14” laptop that weighs only 899g (1.98lbs). That’s not the only interesting fact about the S14 – there is a version of it powered by the Snapdragon X Elite (X1E-80-100). That’s a 12-core chip (with custom Qualcomm CPU design) that runs at up to 3.4GHz with multiple cores active or up to 4.0GHz with two cores active. The Adreno GPU offers up to 3.8 TFLOPS of performance, while the Hexagon NPU can go up to 45 TOPS...

I really wanted to like the Nvidia GeForce RTX 5070, but it broke my heart and it shouldn’t have to break yours, too
5:00 pm | March 4, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , , | Comments: Off

Nvidia GeForce RTX 5070: Two-minute review

A lot of promises were made about the Nvidia GeForce RTX 5070, and in some narrow sense, those promises are fulfilled with Nvidia's mainstream GPU. But the gulf between what was expected and what the RTX 5070 actually delivers is simply too wide to bridge for me and the legion of gamers and enthusiasts out there who won't be able to afford—or even find, frankly—Nvidia's best graphics cards from this generation.

Launching on March 5, 2025, at an MSRP of $549 / £549 / AU$1,109 in the US, UK, and Australia, respectively, this might be one of the few Nvidia Blackwell GPUs you'll find at MSRP (along with available stock), but only for lack of substantial demand. As the middle-tier GPU in Nvidia's lineup, the RTX 5070 is meant to have broader appeal and more accessible pricing and specs than the enthusiast-grade Nvidia GeForce RTX 5090, Nvidia GeForce RTX 5080, and Nvidia GeForce RTX 5070 Ti, but of all the cards this generation, this is the one that seems to have the least to offer prospective buyers over what's already on the market at this price point.

That's not to say there is nothing to commend this card. The RTX 5070 does get up to native Nvidia GeForce RTX 4090 performance in some games thanks to Nvidia Blackwell's exclusive Multi-Frame Generation (MFG) technology. And, to be fair, the RTX 5070 is a substantial improvement over the Nvidia GeForce RTX 4070, so at least in direct gen-on-gen uplift, there is a roughly 20-25% performance gain.

But this card is a far, far cry from the promise of RTX 4090 performance that Nvidia CEO Jensen Huang presented on stage at CES 2025, even with the qualifier that such an achievement would be "impossible without artificial intelligence," which implies a heavy reliance on DLSS 4 and MFG to get this card over the line.

If we're just talking framerates, then in some very narrow cases this card can do that, but at 4K with ray tracing and cranked-up settings, the input latency for the RTX 5070 with MFG can be noticeable depending on your settings, and it can become distracting. Nvidia Reflex helps, but if you take RTX 4090 performance to mean the same experience as the RTX 4090, you simply won't get that with MFG, even in the 80 or so games that support it currently.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

Add to all this the fact that the RTX 5070 barely outpaces the Nvidia GeForce RTX 4070 Super when you take MFG off the table (which will be the case for the vast majority of games played on this card) and you really don't have anything to show for the extra 30W of power this card pulls down over the RTX 4070 Super.

The RTX 5070 comes in at less than 4% faster than the non-OC RTX 4070 Super in gaming without MFG, and roughly 5% faster overall, which means the RTX 5070 is essentially a stock-overclocked RTX 4070 Super, performance-wise, with the added feature of MFG. An overclocked RTX 4070 Super might even match or exceed the RTX 5070's overall performance in all but a handful of games, and that doesn't even touch on AMD's various offerings in this price range, like the AMD Radeon RX 7900 GRE or AMD's upcoming RX 9070 XT and RX 9070 cards.

Given that the RTX 4070 Super is still generally available (at least for the time being), likely at a lower price than the RTX 5070 cards you'll actually find in stock, and that competing AMD cards are often cheaper, easier to find, and offer roughly the same level of performance, I really struggle to find any reason to recommend the RTX 5070, even before its questionable-at-best marketing sours my feelings about it.

I caught a lot of flak from enthusiasts for praising the RTX 5080 despite its 8-10% performance uplift over the Nvidia GeForce RTX 4080 Super, but at the RTX 5080's level there is no real competition, and you're still getting the third-best graphics card on the market with a noticeable performance boost over the RTX 4080 Super for the same MSRP. Was it what enthusiasts wanted? No, but it's still a fantastic card with few peers, and the base performance of the RTX 5080 was so good that the latency problem of MFG just wasn't an issue, making it a strong value-add for the card.

You just can't claim that for the RTX 5070. There are simply too many other options for gamers to consider at this price point, and MFG just isn't a strong enough selling point at this performance level to move the needle. If the RTX 5070 is the only card you have available to you for purchase and you need a great 1440p graphics card and can't wait for something better (and you're only paying MSRP), then you'll ultimately be happy with this card. But the Nvidia GeForce RTX 5070 could have and should have been so much better than it ultimately is.

Nvidia GeForce RTX 5070: Price & availability

An Nvidia GeForce RTX 5070 sitting on top of its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP/RRP starting at $549 / £549 / AU$1,109
  • When can you get it? The RTX 5070 goes on sale on March 5, 2025
  • Where is it available? The RTX 5070 will be available in the US, UK, and Australia at launch

The Nvidia GeForce RTX 5070 is available starting March 5, 2025, with an MSRP of $549 / £549 / AU$1,109 in the US, UK, and Australia, respectively.

This puts it at the same price as the current RTX 4070 MSRP, and slightly below that of the RTX 4070 Super. It's also the same MSRP as AMD's RX 7900 GRE and the upcoming RX 9070, and slightly cheaper than the AMD RX 9070 XT's MSRP.

The relatively low MSRP for the RTX 5070 is one of the bright spots for this card, as well as the existence of the RTX 5070 Founders Edition card, which Nvidia will sell directly at MSRP. This will at least put something of an anchor on the card's price in the face of scalping and general price inflation.

  • Value: 4 / 5

Nvidia GeForce RTX 5070: Specs

  • GDDR7 VRAM and PCIe 5.0
  • Higher power consumption
  • Still just 12GB VRAM, and fewer compute units

The Nvidia GeForce RTX 5070 is a mixed bag when it comes to specs. On the one hand, you have advanced technology like the new PCIe 5.0 interface and new GDDR7 VRAM, both of which appear great on paper.

On the other hand, it feels like every other spec was tweaked to offset whatever performance benefit those technologies would impart, keeping the overall package more or less in line with the previous generation's GPUs.

For instance, while the RTX 5070 sports faster GDDR7 memory, it doesn't expand the VRAM pool beyond 12GB, unlike its competitors. And if Nvidia was hoping the faster memory would make up for keeping the VRAM capacity the same, the RTX 5070 only manages a modest increase in compute units over the RTX 4070 (48 versus 46), which is a noticeable decrease from the RTX 4070 Super's 56.

Whatever performance gains the RTX 5070 makes with its faster memory, then, are completely neutralized by the larger number of compute units (along with the requisite CUDA cores, RT cores, and Tensor cores) in the RTX 4070 Super.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

The base clock on the RTX 5070 is notably higher, but its boost clock is only slightly increased, and the boost clock is ultimately what counts when playing games or running intensive workloads.

Likewise, whatever gains the more advanced TSMC N4P node offers the RTX 5070's GPU over the TSMC N4 node of its predecessors seem to be eaten up by the cutting down of the die. Whether there was a power or cost reason for this, I have no idea, but I think this decision is what ultimately sinks the RTX 5070.

It seems like every decision was made to keep things right where they are rather than move things forward. That would be acceptable, honestly, if there was some other major benefit like a greatly reduced power draw or much lower price (I've argued for both rather than pushing for more performance every gen), but somehow the RTX 5070 manages to pull down an extra 30W of power over the RTX 4070 Super and a full 50W over the RTX 4070, and the price is only slightly lower than the RTX 4070 was at launch.

Finally, this is a PCIe 5.0 x16 GPU, which means that if your motherboard has 16 or fewer PCIe 5.0 lanes and you're also using a PCIe 5.0 SSD, one of those two components is going to get knocked down to PCIe 4.0, and most motherboards default to prioritizing the GPU.

You might be able to set your PCIe 5.0 priority to your SSD in your motherboard's BIOS settings and put the RTX 5070 into PCIe 4.0, but I haven't tested how this would affect the performance of the RTX 5070, so be mindful that this might be an issue with this card.
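If you do end up juggling lane priority in the BIOS, it's worth confirming what link the card actually negotiated afterward. Here's a quick sketch of one way to check, assuming you have an Nvidia driver installed that exposes the standard nvidia-smi PCIe query fields; a tool like GPU-Z will show the same information.

import subprocess

# Quick check of the PCIe link the GPU has actually negotiated, via nvidia-smi.
# Assumes a recent Nvidia driver; takes the first GPU if more than one is present.
fields = "pcie.link.gen.current,pcie.link.gen.max,pcie.link.width.current,pcie.link.width.max"
result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
gen_now, gen_max, width_now, width_max = [v.strip() for v in result.stdout.splitlines()[0].split(",")]
print(f"PCIe Gen {gen_now} x{width_now} (card maximum: Gen {gen_max} x{width_max})")

Keep in mind that the 'current' values are a point-in-time reading, and the card will drop its link speed at idle to save power, so run the check while a 3D workload is active to see the real negotiated link.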

  • Specs: 2.5 / 5

Nvidia GeForce RTX 5070: Design

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)
  • No dual-pass-through cooling
  • FE card is the same size as the RTX 4070 and RTX 4070 Super FE cards

The Nvidia GeForce RTX 5070 Founders Edition looks identical to the RTX 5090 and RTX 5080 that preceded it, but with some very key differences, both inside and out.

One of the best things about the RTX 5090 and RTX 5080 FE cards was the innovative dual pass-through cooling solution on those cards, which improved thermals so much that Nvidia was able to shrink the size of those cards from the gargantuan bricks of the last generation to something far more manageable and practical.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

It would have been nice to see what such a solution could have done for the RTX 5070, but maybe it just wasn't possible to engineer it so it made any sense. Regardless, it's unfortunate that it wasn't an option here, even though the RTX 5070 is hardly unwieldy (at least for the Founders Edition card).

Otherwise, it sports the same 16-pin power connector placement as the RTX 5090 and RTX 5080, so 90-degree power connectors won't fit the Founders Edition, though you'll have better luck with most, if not all, AIB partner cards, which will likely stick to the same power connector placement as the RTX 40 series.

The RTX 5070 FE will fit inside even an SFF case with ease, and its lighter power draw means that even if you have to rely on the included two-to-one adapter to plug in two free 8-pin cables from your power supply, it will still be a fairly manageable affair.

Lastly, like all the Founders Edition cards before it, the RTX 5070 has no RGB, with only the white backlit GeForce RTX logo on the top edge of the card providing any 'flair' of that sort.

  • Design: 3.5 / 5

Nvidia GeForce RTX 5070: Performance

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)
  • Almost no difference in performance over the RTX 4070 Super without MFG
  • Using MFG can get you native RTX 4090 framerates in some games
  • Significantly faster performance over the RTX 4070
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards omitted from these charts for readability.

Boy howdy, here we go.

The best thing I can say about the performance of this card is that it is just barely the best 1440p graphics card on the market as of this review, and that DLSS 4's Multi Frame Generation can deliver the kind of framerates Nvidia promises in those games where the technology is available, either natively or through the Nvidia App's DLSS override feature.

Both of those statements come with a lot of caveats, though, and the RTX 5070 doesn't make enough progress from the last gen to make a compelling case for itself performance-wise, especially since its signature feature is only available in a smattering of games at the moment.

On the synthetic side of things, the RTX 5070 looks strong against the card it's replacing, the RTX 4070, generally offering about 25% better performance in synthetic benchmarks like 3DMark Steel Nomad and Speed Way. It also posts higher compute performance in Geekbench 6 than its direct predecessor, though not by as drastic a margin (about 10% better).

Compared to the RTX 4070 Super, however, the RTX 5070's performance is only about 6% better overall, and only about 12% better than the AMD RX 7900 GRE's overall synthetic performance.

Again, a win is a win, but it's much closer than it should be gen-on-gen.

The RTX 5070 runs into similar issues on the creative side, where it only outperforms the RTX 4070 Super by about 3% overall, with its best performance coming in PugetBench for Creators' Adobe Premiere benchmark (~13% better than the RTX 4070 Super), but faltering somewhat with Blender Benchmark 4.3.0.

This isn't too surprising, as the RTX 5070 hasn't been released yet, and GPUs tend to perform better in Blender several weeks or months after a card's release, once the devs have had time to optimize for the new hardware.

All in all, for this class of card, the RTX 5070 is a solid choice for those who want to dabble in creative work without much of a financial commitment, but real pros looking to upgrade without spending a fortune are better off with the Nvidia GeForce RTX 5070 Ti.

It's in gaming, though, where the real heartbreak comes with this card.

Technically, with just 12GB VRAM, this isn't a 4K graphics card, but both the RTX 4070 Super and RTX 5070 are strong enough that you can get playable native 4K in pretty much every game, so long as you never, ever touch ray tracing, global illumination, or the like. Unfortunately, both cards perform roughly the same under these conditions at 4K, with the RTX 5070 pulling into a slight lead of just over 5 fps in a few games like Returnal and Dying Light 2.

However, in some titles like F1 2024, the RTX 4070 Super actually outperforms the RTX 5070 when ray tracing is turned on, or when DLSS is set to balanced and without any Frame Generation. Overall and across different setting configurations, the RTX 5070 only musters a roughly 4.5% better average FPS at 4K than the RTX 4070 Super.

It's pretty much the same story at 1440p, as well, with the RTX 5070 outperforming the RTX 4070 Super by about 2.7% across configurations at 1440p. We're really in the realm of what a good overclock can get you on an RTX 4070 Super rather than a generational leap, despite all the next-gen specs that the RTX 5070 brings to bear.

OK, but what about the RTX 4090? Can the RTX 5070 with DLSS 4 Multi Frame Generation match the native 4K performance of the RTX 4090?

Yes, it can, at least if you're only concerned with average FPS. The only game with an in-game benchmark that I can use to measure the RTX 5070's MFG performance is Cyberpunk 2077, and I've included those results here, but in Indiana Jones and the Great Circle and Dragon Age: Veilguard (using the Nvidia App's override function) I pretty much found MFG to perform consistently as promised, delivering substantially faster FPS than DLSS 4 alone and landing in the ballpark of where the RTX 4090's native 4K performance ends up.

And so long as you stay far away from ray tracing, the base framerate at 4K will be high enough on the RTX 5070 that you won't notice much, if any, latency in many games. But when you turn ray tracing on, even the RTX 5090's native frame rate tanks, and it's those baseline rendered frames that respond to your input; the three AI-generated frames built from each rendered frame don't factor in any input changes you've made at all.

As such, even though you can get up to 129 FPS at 4K with Psycho RT and the Ultra preset in Cyberpunk 2077 on the RTX 5070 (blowing way past the RTX 5090's native 51 FPS average on the Ultra preset with Psycho RT), only 44 of the RTX 5070's 129 frames per second reflect active input. This leads to a situation where your game looks like it's flying by at 129 FPS, but feels like it's still a sluggish 44 FPS.
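A rough back-of-the-envelope model makes the 'looks fast, feels slow' problem clearer: input is only sampled on the rendered frames, so the gap between input-responsive frames follows the base frame rate, not the displayed one. The sketch below runs that math on the Cyberpunk 2077 numbers above; it's a simplification that ignores Nvidia Reflex, render queuing, and display latency.

# Simplified model of why MFG can look fast but feel slow: only rendered
# (base) frames reflect new input, so the input-responsive cadence follows
# the base frame rate. Ignores Nvidia Reflex, render queuing, and display latency.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

displayed_fps = 129.0  # 4K Ultra + Psycho RT in Cyberpunk 2077 with MFG enabled
base_fps = 44.0        # rendered frames per second that actually respond to input

print(f"Displayed frame time:        {frame_time_ms(displayed_fps):.1f} ms")  # ~7.8 ms
print(f"Input-responsive frame time: {frame_time_ms(base_fps):.1f} ms")       # ~22.7 ms

In other words, motion updates on screen roughly every 8ms, but the game only reacts to you roughly every 23ms, which is why it can feel closer to a 44 FPS experience.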

For most games, this isn't going to be a deal breaker. While I haven't tried the RTX 5070 with 4x MFG on Satisfactory, I'm absolutely positive I will not feel the difference, as it's not the kind of game where you need fast reflexes (other than dealing with the effing Stingers), but Marvel Rivals? You're going to feel it.

Nvidia Reflex definitely helps take the edge off MFG's latency, but it doesn't completely eliminate it, and for some games (and gamers) that is going to matter, leaving the RTX 5070's MFG experience too much of a mixed bag to be a categorical selling point. I think the hate directed at 'fake frames' is wildly overblown, but in the case of the RTX 5070, it's not entirely without merit.

So where does that leave the RTX 5070? Overall, it's the best 1440p card on the market right now, and its relatively low MSRP makes it the best value proposition in its class. It's also much more likely that you'll actually be able to find this card at MSRP, making the question of value more than just academic.

For most gamers out there, Multi Frame Generation is going to be great, and so long as you go easy on the ray tracing, you'll probably never run into any practical latency in your games, so in those instances, the RTX 5070 might feel like black magic in a circuit board.

But my problem with the RTX 5070 is that it is absolutely not the RTX 4090, and for the vast majority of the games you're going to be playing, it never will be, and that's essentially what was promised when the RTX 5070 was announced. Instead, the RTX 5070 is an RTX 4070 Super with a handful of MFG-enabled games slapped to its side that look like they're running on an RTX 4090 but may or may not feel like they are, and that's just not good enough.

It's not what we were promised, not by a long shot.

  • Performance: 3 / 5

Should you buy the Nvidia GeForce RTX 5070?

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

Buy the Nvidia GeForce RTX 5070 if...

You don't have the money for (or cannot find) an RTX 5070 Ti or RTX 4070 Super
This isn't a bad graphics card; it's just that there are better-value and better-performing cards in its price range, so it makes the most sense when those aren't available to you.

You want to dabble in creative or AI work without investing a lot of money
The creative and AI performance of this card is great for the price.

Don't buy it if...

You can afford to wait for better
Whether it's this generation or the next, this card offers very little that you won't be able to find elsewhere within the next two years.

Also consider

Nvidia GeForce RTX 5070 Ti
The RTX 5070 Ti is a good bit more expensive, especially with price inflation, but if you can get it at a reasonable price, it is a much better card than the RTX 5070.

Read the full Nvidia GeForce RTX 5070 Ti review

Nvidia GeForce RTX 4070 Super
With Nvidia RTX 50 series cards getting scalped to heck, if you can find an RTX 4070 Super for a good price, it offers pretty much identical performance to the RTX 5070, minus the Multi Frame Generation.

Read the full Nvidia GeForce RTX 4070 Super review

How I tested the Nvidia GeForce RTX 5070

  • I spent about a week with the RTX 5070
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week testing the Nvidia GeForce RTX 5070, using it as my main workstation GPU for creative content work, gaming, and other testing.

I used my updated testing suite including industry standard tools like 3DMark and PugetBench for Creators, as well as built-in game benchmarks like Cyberpunk 2077, Civilization VII, and others.

I've reviewed more than 30 graphics cards for TechRadar in the last two and a half years, and I extensively test and retest graphics cards throughout the year for features, analysis, and other content, so you can trust that my reviews are based on experience and data, as well as my desire to make sure you get the best GPU for your hard-earned money.

  • Originally reviewed March 2025