The AMD Radeon RX 9070 is too close to the RX 9070 XT to really stand out, but it’s a better value given the market
11:09 am | May 1, 2025

Author: admin | Category: Computers, Computing, Computing Components, Gadgets | Comments: Off

AMD Radeon RX 9070: Two-minute review

The AMD Radeon RX 9070 is a card that just might be saved by the economic chaos engulfing the GPU market right now.

With 'normal' price inflation rampant across pretty much every current-gen GPU, the RX 9070's price proposition might actually make it an appealing pick for gamers experiencing sticker shock while hunting for the best graphics card for their next GPU upgrade.

That doesn't mean, unfortunately, that the AMD RX 9070 is going to be one of the best cheap graphics cards going, even by comparison with everything else that's launched since the end of 2024. With an MSRP of $549 / £529.99 / AU$1,229, the RX 9070 is still an expensive card, even if it's theoretically in line with your typical 'midrange' offering.

And, with the lack of an AMD reference card that might have helped anchor the RX 9070's price at Team Red's MSRP, you're going to pretty much be at the mercy of third-party manufacturers and retailers who can charge whatever they want for this card.

Comparatively speaking, though, even with price inflation, this is going to be one of the cheaper midrange GPUs of this generation. If you're weighing up a bunch of different GPUs, this one is likely to be the cheapest current graphics card made by either AMD or Nvidia (yes, even counting the RTX 5060 Ti, which is already selling for well above 150% of MSRP in many places).

The radeon logo on the AMD Radeon RX 9070

(Image credit: Future / John Loeffler)

Does that make this card worth the purchase? Well, that's going to depend on what you're being asked to pay for it. While it's possible to find RX 9070 cards at MSRP, they are rare, and so you're going to have to make a back-of-the-envelope calculation to see if this card is going to offer you the best value in your particular circumstance.

I'm fairly confident, however, that it will. Had I the time to review this card when it first launched in March, I might have scored it lower based on its performance and price proximity to the beefier AMD Radeon RX 9070 XT.

Looking at both of those cards based on their MSRPs, there's no question that the RX 9070 XT is the much better graphics card, so I'd have recommended you spend the extra cash to get that card instead of this one.

Unfortunately, contrary to my hopes, the RX 9070 XT has been scalped almost as badly as the best Nvidia graphics cards of this generation, so that relatively small price difference on paper can be quite large in practice.

Given that reality, for most gamers, the RX 9070 is the best 1440p graphics card going, and can even get you some solid 4K gaming performance for a lot less than you're likely to find the RX 9070 XT or competing Nvidia card, even from the last generation.

If you're looking at this card and the market has returned to sanity and MSRP pricing, then definitely consider going for the RX 9070 XT instead of this card. But barring that happy contingency, given where everything is right now with the GPU market, the RX 9070 is the best AMD graphics card for 1440p gaming, and offers some of the best bang for your (inflationary) buck as you're likely to find today.

AMD Radeon RX 9070: Price & availability

An AMD Radeon RX 9070 sitting on its retail packaging with its fans visible

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $549 / £529.99 / AU$1,229, but retail price will likely be higher
  • When can you get it? The RX 9070 is available now
  • Where is it available? The RX 9070 is available in the US, UK, and Australia

The AMD Radeon RX 9070 is available now in the US, UK, and Australia for an MSRP of $549 / £529.99 / AU$1,229, but the price you'll pay for this card from third-party partners and retailers will likely be higher.

Giving credit where it's due, the RX 9070 carries the exact same MSRP as the AMD Radeon RX 7900 GRE, the card you could argue it's replacing. It also comes in at the same price as the RTX 5070's MSRP, and as I'll get into in a bit, the RX 9070 offers better gaming value at that price.

Given that the RTX 5070 can rarely be found at MSRP, the RX 9070 is in an even stronger position against its competition.

  • Value: 4 / 5

AMD Radeon RX 9070: Specs

The power connector ports on an AMD Radeon RX 9070

(Image credit: Future / John Loeffler)
  • PCIe 5.0
  • 16GB VRAM
  • Specs & features: 4 / 5

AMD Radeon RX 9070: Design & features

  • No AMD reference card
  • Will be good for SFF cases

In terms of design, the RX 9070 doesn't have a reference card, so the card I reviewed is the Sapphire Pulse Radeon RX 9070.

This card, in particular, is fairly straightforward with few frills, but for those who don't want a whole lot of RGB lighting in their PC, this is more of a positive than a negative. RGB fans, however, will have to look at other AMD partner cards for their fix.

The card is a noticeably shorter dual-fan design compared to the longer triple-fan RX 9070 XT cards. That makes the RX 9070 a great option for small form factor PC cases.

  • Design: 3.5 / 5

AMD Radeon RX 9070: Performance

  • About 13% slower than RX 9070 XT
  • Outstanding 1440p gaming performance
  • Decent 4K performance
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

When it comes down to performance, the RX 9070 is a very strong graphics card that is somewhat overshadowed by its beefier RX 9070 XT sibling, but it goes toe-to-toe with the RTX 5070 where it counts for most users: gaming.

On the synthetic side, the RX 9070 puts up some incredibly solid numbers, especially in pure rasterization workloads like 3DMark Steel Nomad, where it beats the RTX 5070 by 13%. In ray tracing-heavy workloads like 3DMark Speed Way, meanwhile, the RX 9070 delivers 95% of the RTX 5070's performance.

As expected though, the RX 9070's creative performance isn't able to keep up with Nvidia's competing RTX 5070, especially in 3D modeling workloads like Blender. If you're looking for a cheap creative workstation GPU, you're going to want to go for the RTX 5070, no question.

But that's not really what this card is about. AMD cards are gaming cards through and through, and as you can see above, at 1440p the RX 9070 goes blow for blow with Nvidia's midrange card, averaging 114 fps overall against Nvidia's 115 fps (with average minimum/1% lows of 72 fps and 76 fps, respectively).

Likewise, at 4K the two cards are effectively tied, with the RX 9070 holding a slight 2 fps edge over the RTX 5070 on average (with minimum/1% lows of 50 fps and 51 fps, respectively).

Putting it all together, one point in the Nvidia RTX 5070's favor is that it matches the RX 9070 while drawing about 26 fewer watts under load (the RX 9070's 284W maximum power draw against the RTX 5070's 258W).

That's not the biggest difference, but even an extra 26W can mean the difference between needing to replace your PSU and sticking with the one you have.
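To gauge whether that extra draw actually pushes you into a PSU upgrade, you can run the headroom math yourself: add the card's measured peak draw to a rough estimate for the rest of the system, then check the total against your PSU's rating. A minimal Python sketch, where the GPU figures come from my testing but the 450W rest-of-system estimate and the 80% sustained-load guideline are my own assumptions, not review measurements:

```python
# PSU headroom check: the GPU peak draws are from this review;
# the rest-of-system estimate and the 80% load guideline are assumptions.
REST_OF_SYSTEM_W = 450   # assumed: CPU, motherboard, drives, fans under load
LOAD_GUIDELINE = 0.80    # common rule of thumb: keep sustained load <= 80% of rating

def psu_ok(gpu_watts: int, psu_rating: int) -> bool:
    """True if GPU draw plus the estimated system draw stays within the guideline."""
    return (gpu_watts + REST_OF_SYSTEM_W) <= psu_rating * LOAD_GUIDELINE

for gpu, watts in (("RX 9070", 284), ("RTX 5070", 258)):
    verdict = "fine" if psu_ok(watts, 900) else "cutting it close"
    print(f"{gpu} on a 900W PSU: {verdict}")
```

On an assumed 900W unit, the RTX 5070 build squeaks under the guideline while the RX 9070 build doesn't, which is exactly the kind of margin call a 26W difference can create.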

Under normal conditions, I'd argue that this would swing things in favor of Nvidia's GPU, but the GPU market is hardly normal right now, and so what you really need to look at is how much you're being asked to pay for either of these cards. Chances are, you're going to be able to find an RX 9070 for a good bit cheaper than the RTX 5070, and so its value to you in the end is likely going to be higher.
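That back-of-the-envelope value calculation can be as simple as frames per dollar at the resolution you play at. A quick Python sketch using the 1440p averages from my testing; the street prices below are hypothetical placeholders for whatever you're actually quoted, not real market data:

```python
# Frames-per-dollar value check: plug in the street prices you're actually quoted.
# The fps figures are the 1440p overall averages from this review; the prices
# are assumed placeholders, not real market data.
def fps_per_dollar(avg_fps: float, street_price: float) -> float:
    """Return average frames per second per dollar spent."""
    return avg_fps / street_price

cards = {
    "RX 9070":  {"fps": 114, "price": 650},   # assumed street price
    "RTX 5070": {"fps": 115, "price": 750},   # assumed street price
}

for name, card in cards.items():
    print(f"{name}: {fps_per_dollar(card['fps'], card['price']):.3f} fps per dollar")
```

Swap in the live prices you see and the higher fps-per-dollar figure wins; with the gaming numbers this close, that is why the cheaper-to-actually-buy card usually comes out ahead.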

  • Performance: 4.5 / 5

Should you buy the AMD Radeon RX 9070?

A masculine hand holding an AMD Radeon RX 9070

(Image credit: Future / John Loeffler)

Buy the AMD Radeon RX 9070 if...

You want a fantastic 1440p graphics card
The RX 9070 absolutely chews through 1440p gaming with frame rates that can fully saturate most 1440p gaming monitors' refresh rates.

You don't want to spend a fortune on a midrange GPU
While the RX 9070 isn't necessarily cheap, it's among the cheapest midrange cards you can get, even after factoring in scalping and price inflation.

Don't buy it if...

You want great creative performance
While the RX 9070 is a fantastic gaming graphics card, its creative performance (especially for 3D modeling work) lags behind Nvidia midrange cards.

Also consider

AMD Radeon RX 9070 XT
The RX 9070 XT is an absolute barnburner of a gaming GPU, offering excellent 4K performance and even better 1440p performance, especially if you can get it close to MSRP.

Read the full AMD Radeon RX 9070 XT review

Nvidia GeForce RTX 5070
The RTX 5070 essentially ties the RX 9070 in gaming performance in 1440p and 4K gaming, but has better power efficiency and creative performance.

Read the full Nvidia GeForce RTX 5070 review

How I tested the AMD Radeon RX 9070

  • I spent about two weeks with the RX 9070
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Samsung 9100 Pro 4TB
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about two weeks with the AMD RX 9070, using it as my primary workstation GPU for creative work and gaming after hours.

I used my updated benchmarking process, which includes using built-in benchmarks on the latest PC games like Black Myth: Wukong, Cyberpunk 2077, and Civilization VII. I also used industry-standard benchmark tools like 3DMark for synthetic testing, while using tools like PugetBench for Creators and Blender Benchmark for creative workload testing.

I've reviewed more than three dozen graphics cards for TechRadar over the past three years, which has included hundreds of hours of dedicated GPU testing, so you can trust that I'm giving you the fullest picture of a graphics card's performance in my reviews.

  • Originally reviewed May 2025
I reviewed Lenovo’s answer to the Mac Studio – but can this mini desktop survive in the business world?
9:02 pm | April 11, 2025

Author: admin | Category: Computers, Gadgets, Pro | Comments: Off

The Apple Mac Studio made a huge splash when it entered the market a few years back. The form factor with that kind of power was nearly too good to be true. Now, the best mini PC manufacturers are replicating that style of desktop powerhouse.

The Lenovo ThinkCentre Neo Ultra is an excellent example of that. Lenovo matched the footprint of the popular Mac Studio almost exactly and packed its own machine into it, pitching the result as the business version of a Mac Studio.

For the most part, it delivers: excellent ports, support for up to eight displays (beating out the Mac Studio there), an RTX 4060 GPU, and even a discrete AI NPU. But can this machine match the performance of the Mac Studio at its best?

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: Price and Availability

The Lenovo ThinkCentre Neo Ultra starts at around $3,000, but it's frequently discounted to under $2,000. If you spec this thing out, it can run over $5,000. The machine is available for purchase through Lenovo.com and enterprise partners, so if you're looking to pick one up, I'd check Lenovo first to snag one of those deals.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: Unboxing & first impressions

The Lenovo ThinkCentre Neo Ultra is nearly the exact dimensions of the Apple Mac Studio. It comes in a compact box with the cable and paperwork you'd expect. Unlike the popular silver on Macs, the ThinkCentre Neo Ultra comes in a Luna Gray chassis that looks more like what I'd expect a Lenovo device to look like.

Much like other compact desktops, the ThinkCentre Neo Ultra fits easily under a monitor, even one that isn't on a monitor arm. You could also tuck it off to the side, keep it front and center to show off, or mount it behind the monitor or under the desk.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: Design & build quality

Specs

CPU: Up to Intel Core i9-14900 vPro
GPU: NVIDIA RTX 4060 8GB
RAM: Up to 64GB DDR5
Storage: Up to 2x 2TB M.2 PCIe 4.0 NVMe SSDs
Ports: 1x USB-C 20Gbps, 1x 3.5mm combo jack, 2x USB-A 5Gbps, 4x USB-A 10Gbps, 2x HDMI 2.1, 4x DisplayPort 1.4a, 2.5GbE LAN
Optional: Configurable punch-out ports (HDMI, VGA, USB-C, LAN, etc.)
Connectivity: Wi-Fi 6E, Bluetooth 5.3
Dimensions: 7.68” x 7.52” x 4.25” (3.6L), 7.7 lbs

The Lenovo ThinkCentre Neo Ultra is a very professional, simple-looking machine. Lenovo has done a great job of making a machine that doesn't stand out and isn't overly flashy, but still looks professional and top-tier. It's got a solid frame with rounded-off edges, though not so rounded that it looks soft; the corners are simply not sharp. The top panel looks like the roof of a building with a row of windows, leaving plenty of room for ventilation to keep this powerhouse from overheating.

For those who like being able to upgrade the RAM and SSD on their own, it's great to see that the bottom panel can easily be removed. This is something I see less and less of in computers generally, but it's a vital feature for some users.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)

Lenovo ThinkCentre Neo Ultra: In use

I work predominantly from a laptop; having a single computer I can easily take from place to place has always appealed to me. However, having used this computer in my rotation of devices for the last several weeks, I can say there is something fantastic about a desktop that is always set up and ready to rock: no dock needed, no charging needed, already plugged into multiple displays and waiting whenever you sit down at your desk.

As you can see in the desk shots, I usually have this on a single monitor setup. However, I ran five displays on this at one time simply because that was the number I had with me at the time of testing. I can confidently say that this is an excellent desktop if you are working primarily on business tasks and want to use multiple displays.

There is no need for an external graphics card or a dock with DisplayLink like I need with my M2 Series MacBook Pro, and there are no issues when running different types of monitors, as I have seen questions about. I was running a 49-inch ultrawide, a 32-inch, a 27-inch, a portable monitor, and a TV, all without any issues.

Lenovo ThinkCentre Neo Ultra

(Image credit: Collin Probst // Future)


During my testing, I used this machine for a few virtual meetings, a lot of writing and admin work, some basic photo editing, some video rendering, 40+ heavy Chrome tabs (multiple extensive project management tools), Slack, Asana, Jira, Basecamp, ZenDesk, HubSpot, Postman, VS Code, WhatsApp, email, and more. I worked on some web design, system automation, and large Google Docs with 40+ pages of 11pt text and many comments. I tried to crash this computer, but it handled everything while easily outputting to an abundance of screen real estate.

I wouldn't use this machine for heavy video editing, because I don't think it's one of the best video editing computers available, but it is one of the best business computers in this form factor, ideal for administrative or more standard business tasks like project management, documents, emails, and virtual meetings.

After testing, I also see a lot of advantages to using this if you're a project manager or supervisor. It would allow for ample displays to show everything that kind of role needs to see all at once, without compromise.

Lenovo ThinkCentre Neo Ultra: Final verdict

The ThinkCentre Neo Ultra is a powerhouse of a machine. I’d still choose a Mac Studio for creative tasks, but this machine is a genuine contender for classic business performance. It’s got better video outputs, is just as compact, and has leading enterprise security and great software.

For business professionals, developer teams, or even things like conference rooms, command centers, or other setups that need a lot of screens, this machine is a fantastic one to consider. Just know that it doesn’t have Thunderbolt, so file transfers will be quite a bit slower than on something that does support a version of Thunderbolt.


For extra power, we reviewed the best workstations you can get right now.

MediaTek Dimensity 9400+ is here with a minor prime CPU core frequency bump
10:01 pm | April 10, 2025

Author: admin | Category: Mobile phones news | Comments: Off

The much-anticipated MediaTek Dimensity 9400+ is now official, and as expected, it's a minor improvement over the 9400 launched last year. The Dimensity 9400+'s CPU has one ARM Cortex-X925 prime core clocked at up to 3.73 GHz, up from 3.63 GHz on the 9400, so the 9400+ basically looks like a higher-binned 9400. The NPU 890 remains, but it now offers 20% faster agentic AI performance with Speculative Decoding+ and lets developers easily turn traditional AI applications into "sophisticated agentic AI applications" because "agentic AI" is the buzzword of early 2025...
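For scale, the prime-core clock increase from 3.63GHz to 3.73GHz works out to under 3%; the arithmetic, sketched in Python:

```python
# Relative frequency bump of the Dimensity 9400+ prime core over the 9400.
# The clock figures come from the announcement; this is just the arithmetic.
old_clock_ghz = 3.63  # Dimensity 9400 Cortex-X925 peak clock
new_clock_ghz = 3.73  # Dimensity 9400+ Cortex-X925 peak clock

bump_pct = (new_clock_ghz / old_clock_ghz - 1) * 100
print(f"Prime core frequency bump: {bump_pct:.1f}%")  # prints 2.8%
```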

I tested the Armari Magnetar MC16R7 – see what I thought of this workstation
9:42 am | April 9, 2025

Author: admin | Category: Computers, Gadgets, Pro | Comments: Off

This review first appeared in issue 348 of PC Pro.

Armari’s lower-cost system is something of a technology showcase, exhibiting the latest options in processor, graphics and storage. The combination is one of the most powerful workstations you could buy for £4,500 inc VAT.

At the center of the Magnetar MC16R7 is AMD’s range-topping Ryzen 9 7950X. This potent 16-core processor uses AMD’s latest Zen 4 architecture and is manufactured on the 5nm process. This enables an incredible base clock of 4.5GHz, which is the boost clock for AMD Ryzen Threadripper Pro processors. The 7950X’s boost clock of 5.7GHz is only a few hundred megahertz behind the best Intel has to offer, and only with the latter’s P-cores, so it’s good to see that Armari makes the most out of the Ryzen 9 via its own customized CPU liquid cooling.

Armari has also taken full advantage of the fact that the AMD Ryzen 7000 series supports DDR5 memory by supplying 64GB of 6,000MHz RAM in two 32GB modules, leaving two DIMM slots free for upgrades. This is the fastest-clocked memory of any system this month.

Side view of the Armari Magnetar MC16R7

The Magnetar MC16R7 showcases the latest CPU, graphics and storage technology (Image credit: Future)

So the Magnetar MC16R7 has a cutting-edge processor, some of the fastest system memory available, and its graphics acceleration is bleeding edge, too. In the past, AMD professional GPUs were a sensible way to stay within budget, but they rarely beat the Nvidia alternative for performance. The AMD Radeon Pro W7800 is a different matter. It's in the same price category as the Nvidia RTX A5000 and offers 4,480 unified shaders (which aren't directly equivalent to CUDA cores) on AMD's latest RDNA 3 architecture. It also boasts 32GB of GDDR6 memory on a 256-bit bus, offering 576GB/sec of bandwidth.

Armari is notable in the UK market because it's one of the few local PC integrators that designs its own chassis. However, those cases come at a premium, so the Magnetar MC16R7 has been built into a Fractal Design Meshify 2 instead. This is still a great basis for a workstation, with plenty of room inside for airflow and storage upgrades. Six spaces for 3.5in or 2.5in drives are included, optionally expandable to up to 14. On top of this there are two 2.5in-only spaces as standard, with up to four possible.

You may want to build upon the single M.2 NVMe SSD Armari supplies, but what a great foundation it provides. It’s a 2TB Crucial T700 drive, which supports PCI Express 5, as does the Asus ProArt B650-Creator motherboard. The Crucial SSD delivers incredible throughput from a single drive. CrystalDiskMark recorded sustained reading at 12,373MB/sec and writing at 11,807MB/sec, which were close to twice as fast as some of the PCI Express 4 NVMe SSDs in other workstations this month.

Front and rear views of the Armari Magnetar MC16R7

The Fractal Design Meshify 2 case offers lots of room for airflow and upgrades (Image credit: Future)

Considering all the powerful components in the Magnetar MC16R7, it’s no surprise that it produced some stunning test results. Our media-focused benchmarks are the Intel Core i9’s forte, but the Armari system’s overall result of 772 is still incredible, significantly beating the other system this month based on an AMD Ryzen 9 7950X. Its Cinebench R23 multithread rendering result of 38,611 was the fastest in the £4,500 category, and the Blender rendering time of 265 seconds was also top in this class. The OpenCL-accelerated Adobe Media Encoder time of 105 seconds beat every other system this month.

The AMD Radeon Pro W7800 graphics may be around the same price as Nvidia’s RTX A5000, but its performance with SPECviewperf 2020 v3.1 is in a different league as well. The results of 235 in 3dsmax-07 and an unbelievable 846 in maya-06 imply this will be a consummate accelerator for 3D animation. Likewise, 155 in catia-06, 235 in creo-03, 622 in snx-04 and 460 in solidworks-07 show strong abilities with product development, CAD and engineering.

Its LuxMark 3.1 result of 14,919 is a little behind the RTX A5000, but GPU rendering in Blender took just 141 seconds, which is ahead.

Overall, the Armari Magnetar MC16R7 provides the best possible performance for the money in most areas. If you need a powerful all-round workstation, this system should be top of your list.

We've also rated the best business computers.

Samsung Galaxy Tab S10 FE benchmarked, showing 32% gains in CPU performance
1:49 pm | March 12, 2025

Author: admin | Category: Mobile phones news | Comments: Off

Geekbench confirms the reports that the upcoming Samsung Galaxy Tab S10 FE and Tab S10 FE+ will be powered by the Exynos 1580 (S5E8855), a chipset that is currently found only inside the Galaxy A56. This brings a massive performance boost over the old Exynos 1380 that powered 2023's Tab S9 FE and Tab S9 FE+. The tablet that ran the benchmark – the Samsung SM-X520, which should be the Tab S10 FE – was the base model with 8GB of RAM. That's another upgrade over its predecessor, which had 6GB as its base capacity. Both new FE models will have up to 12GB of RAM and up to 256GB of storage (128GB base). Here's...

The Apple MacBook Air 13-inch (M4) is the best ultraportable – and the new price makes it even more appealing
4:09 pm | March 11, 2025

Author: admin | Category: Computers, Computing, Gadgets, Laptops, Macbooks | Comments: Off

Apple MacBook Air 13-inch (M4): Two-minute review

How do you make the best MacBook, and arguably one of the best laptops on the market, better? You could redesign it, but that's a move fraught with potential downsides; if the current design is popular, you risk alienating fans. In that case, making small changes, especially under-the-hood ones, is probably the smart move, and it's clearly Apple's strategy.

The MacBook Air 13-inch (M4) is virtually indistinguishable from the M3 model. Apple has left the exquisite keyboard and responsive trackpad untouched, and the same goes for the brilliant Liquid Retina display. The 2.7lbs weight is unchanged, and even the two Thunderbolt 4 ports are essentially the same. Visually, the only change is the new Sky Blue color option, a subtle hue that can, depending on the light, look almost gray, though a second glance always reveals a pleasing, almost pastel-like azure. It's a color that should sell out fast.


The other two significant changes are to the hardware. Replacing the FaceTime camera is the new 12MP Center Stage Camera. It’s an ultra-wide lens in a screen notch that can keep you in the frame during video calls, and it’s a nice-to-have though not earth-shattering update.

There’s also the M4 chip, which adds cores and performance over the M3 Apple silicon it replaces. Like the M3, this is a fast, efficient, 3-nanometer chip with plenty of headroom for AAA gaming, video editing, music creation and, of course, Apple Intelligence.

Apple MacBook Air 13-inch (M4) REVIEW

(Image credit: Future / Lance Ulanoff)

From one perspective, the biggest upgrade might be in the value space. Apple doubled the base memory from 8GB of unified memory to 16GB while reducing the price to $999 / £999 / AU$1,699. That’s a shocking, and very welcome, turn of events. The best MacBook is now back to its pre–MacBook Air M3 price, and better value because of it.

It really is hard to find any fault with the MacBook Air 13-inch (M4). It’s lightweight, attractive, powerful, easy to use, and up for anything. I gamed, streamed video, browsed the web, answered email, texted friends, conducted FaceTime calls, edited video, practiced guitar, and wrote this review on it. I’m not concerned about the lack of design changes, and I like the new color, the Center Stage Camera, and especially the price. I would not be surprised to see the MacBook Air 13-inch (M4) rise to the very top of our best laptops list.

Apple MacBook Air 13-inch (M4) review: Price and availability

  • Starts at $999 / £999 / AU$1,699
  • Lower launch price than the discontinued M3 model
  • M2 and M3 models no longer on the Apple Store, but M2 MacBooks can be found at third-party retailers

Rarely do I get to write about a price drop for a new product that arrives with feature enhancements. Usually, we get the same or sometimes a little less for the money. That is not the case with the MacBook Air 13-inch M4.

Even though Apple hasn't radically refreshed its best MacBook, the updates in performance, memory, and video conferencing, plus a new color, hit all the right notes – and when paired with a now $100 (in the US) lower price, they have me singing a happy tune.

Funnily enough, the first 3lb MacBook Air – the one that slid out of a manila envelope in 2008 – cost $1,799. It would take a few years for it to hit that $999 sweet spot, which it maintained until recently.

Apple MacBook Air 13-inch (M4) REVIEW

(Image credit: Future / Lance Ulanoff)

Sometimes that $999 got you a lower-end Intel Core chip, but in the age of Apple silicon we're getting great performance and efficiency at an excellent price.

The MacBook Air 13-inch (M4) comes in three base configurations. If you upgrade to the $1,199 / £1,199 model the GPU gets a bump from eight to 10 cores, and the storage doubles to 512GB. Go for the $1,499 / £1,499 / AU$2,399 top-tier model and the base unified memory is increased from 16GB to 24GB, and you can get up to 2TB of storage. Whichever option you go for, you can upgrade the RAM to 32GB.

It’s available in the new Sky Blue (like my 256GB review unit), Midnight, Starlight, and Silver. Apple has discontinued Space Gray (for now).

Apple unveiled the MacBook Air 13-inch (M4) on March 5, 2025, and the laptop starts shipping on March 12.

  • Price score: 4.5/5

Apple MacBook Air 13-inch (M4) review: Specs

The Apple MacBook Air 13-inch (M4) comes in three pre-configured options.

Apple MacBook Air 13-inch (M4) review: Design

  • No major redesign
  • Sky Blue is subtle but attractive
  • Excellent construction, materials, keyboard, and trackpad

There are still some who mourn the passing of the original MacBook Air's wedge design, which started at more than half an inch (1.61cm) at one end and tapered to 0.16 inches (4.06mm) at the other. That design remains so popular that the M1 model featuring it is still a top seller at Walmart.

I’ve moved on. The MacBook Air M4 is just 2.7lbs / 1.24kg, and at 11.97 x 8.46 x 0.44 inches / 30.41 x 21.5 x 1.13cm, is thinner than the OG MacBook Air was at its thickest point. This is a laptop that's built for your backpack and, yes, it’s light enough that you might forget it’s there.

Everything about the MacBook Air M4 feels premium. The 100% recycled aluminum enclosure is light but solid and has all the exacting tolerances Apple is known for. It’s a finely machined, eye-catching piece of hardware, and few laptops can match its elegance.

Image gallery (8 images): Apple MacBook Air 13-inch (M4) review

(Image credit: Future / Lance Ulanoff)

The backlit keyboard is an absolute pleasure to type on, with remarkable travel and response for such a thin design. It includes all your function keys and a multipurpose power / sleep / Touch ID button that's useful for unlocking the MacBook Air and logging into various apps and services with your registered fingerprints.

I do prefer the Microsoft Surface Laptop's Windows Hello feature, which lets you log on with your face in much the way you do with Face ID on any of the best iPhones. In practice, though, I rarely have to touch anything anyway, because I set the MacBook Air to unlock automatically with my Apple Watch.

While Apple hasn't redesigned the keyboard, there is one small change that you might not notice at first glance: the mute key now features a speaker icon with a line through it, which matches what you see on-screen when you press the key. It's a small but clarifying change.

Apple MacBook Air 13-inch (M4) REVIEW

(Image credit: Future / Lance Ulanoff)

There’s ample room to rest your palms, and the glass-covered multi-touch trackpad is huge and responsive.

Ports and other elements are unchanged from the last two MacBook Air generations. There are two Thunderbolt 4 ports on the left side with up to 40Gbps of throughput, capable of driving two external screens, even with the MacBook Air's lid open. Next to those is the MagSafe charging port, and on the right side is the 3.5mm headphone jack.

The four-speaker stereo sound system is hidden in the hinge below the display. It can fill a room with bright, crisp audio, although it mostly lacks bass (the 15-inch model offers a six-speaker sound system with force-cancelling woofers).

  • Design score: 4.5/5

Apple MacBook Air 13-inch (M4) review: Display and Center Stage

With one exception, the 13-inch M4 MacBook Air’s display is identical to the last generation. It’s still a 13.6-inch Liquid Retina panel with 2560 x 1664 resolution and 500 nits of sustained brightness, which in my experience is viewable in direct sunlight, and support for one billion colors. It’s a fantastic display for everything from gaming to streaming to content creation.

There is a notch at the top for the camera, but most apps do not wrap around that cutout, and it’s not distracting on the desktop.

Image gallery (3 images): Apple MacBook Air 13-inch (M4) review

(Image credit: Future / Lance Ulanoff)

The notch also contains the new 12MP Center Stage camera. The idea here is that the lens is an ultra-wide camera, but for the purposes of video conferencing it crops to an undistorted rectangle. Then, as you move around, the crop shifts to keep you in frame. If you like to get up and walk around, or people walk in and out of the video conversation, this can be tremendously useful, and it worked well for me as long as I didn't stray too far out of frame. If you need the camera to stay still (as I do when I use the 1080p camera to go on TV), you can easily turn Center Stage off.

Apple MacBook Air 13-inch (M4)

(Image credit: Future)

Compared to Microsoft’s excellent Surface Laptop 7, the screen is missing one feature: touch. I used Surface laptops for years, and I did enjoy being able to touch and even draw on the display with a dedicated Bluetooth pen. Apple has steadfastly resisted introducing touch on its MacBook line – and Apple co-founder Steve Jobs didn’t think it made sense. If you require that kind of multipurpose device, you may want to consider the M4 iPad Pro 13-inch plus a Magic Keyboard.

  • Display score: 4.5/5

Apple MacBook Air 13-inch (M4) review: macOS and Apple Intelligence

  • macOS Sequoia is a rich, deep, and well-organized platform
  • Everything is well integrated into Apple's wider ecosystem
  • Apple Intelligence can be useful, but it's not yet compelling

With macOS Sequoia, Apple has built one of the most consistent and stable desktop platforms on the planet. It virtually never crashes, and it’s full of useful features.

The latest version is mostly a refinement of the platform, but if it’s been a while since you’ve upgraded you will notice feature enhancements like better widgets and window-management tools, the excellent new Passwords app, and audio transcription in Notes.

Apple MacBook Air 13-inch (M4) Review

(Image credit: Future)

What’s more, macOS makes excellent use of the M4’s power.

At one point I ran Garage Band, and I was pleased to discover that not only could I use the MacBook Air to tune my guitar, but it could also tell me if I was playing my chords correctly. I also used Pixelmator Pro image and video editor (now owned by Apple) to effortlessly apply complex masks.

Apple MacBook 13-inch M4

(Image credit: Future)

Of course, the big news on the software side is Apple Intelligence, Apple’s own brand of AI, which is supported by the M4’s 16-core neural engine.

It enables features like Image Playground, which lets you imagine wild scenes that can include representations of you and others from your Photos library. It’s good fun, but I still struggle to see the utility, and I wonder when Apple will offer a more open-range image-generation platform, one that enables me to describe a complex scene in a prompt and get a result. Most Windows laptops running Copilot can do this.

Apple MacBook Air 13-inch (M4)

(Image credit: Future)

Writing Tools, which is available in Apple's native text composition apps like Notes and Mail, is useful, especially if you struggle to write clear, cogent sentences. It's of limited utility to me.

Similarly, Siri got a few nice upgrades, like the ability to respond to text prompts and better handle broken speech patterns, but it's still unable to carry on longer conversations or learn anything about you, and you still can't use it to comprehensively control your MacBook. What’s worse is that promised updates to Siri that would have made it a more able competitor to ChatGPT and Gemini have failed to materialize. At least Siri can now tap into ChatGPT (if you allow it) for more complex queries.

Safari is an excellent browser, but I still find myself using Chrome.

  • Software score: 4/5

Apple MacBook Air 13-inch (M4) review: Performance

  • M4 has more CPU cores than the M3 that preceded it
  • Ample power
  • Decent but not massive performance upgrade
  • Excellent platform and increasing Apple Intelligence capabilities
Benchmarks

Here’s how the MacBook Air 13-inch (M4) performed in our suite of benchmark tests:

Geekbench 6.2.2 Single-Core: 3,679; Multi-Core: 14,430
Geekbench Metal score (8-core GPU): 48,515
Cinebench 2024 Single-core: 165; Multi-core: 652
Battery life (web surfing): 14 hours, 51 minutes, and 59 seconds

For comparison, here’s how the MacBook Air 13-inch (M3) performed in our suite of benchmark tests:

Geekbench 6.2.2 Single-Core: 3,148; Multi-Core: 11,893
Geekbench Metal score (10-core GPU): 49,090
Cinebench 2024 Single-core: 141; Multi-core: 615
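The generational uplift implied by those two sets of scores works out as follows (a quick sketch using the numbers above; the `pct_gain` helper is mine, not part of any benchmark suite):

```python
def pct_gain(new: float, old: float) -> float:
    """Percentage improvement of a new score over an old one."""
    return (new - old) / old * 100

# M4 vs M3 scores from the benchmark runs listed above
print(f"Geekbench single-core: +{pct_gain(3679, 3148):.0f}%")    # +17%
print(f"Geekbench multi-core:  +{pct_gain(14430, 11893):.0f}%")  # +21%
print(f"Cinebench single-core: +{pct_gain(165, 141):.0f}%")      # +17%
print(f"Cinebench multi-core:  +{pct_gain(652, 615):.0f}%")      # +6%
```

In other words, the M4's biggest win over the M3 is in Geekbench multi-core throughput, consistent with its extra CPU cores.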

Ever since Apple switched from Intel to Apple silicon we’ve seen significant gains in performance and efficiency. The power of these lightweight laptops and the M-class chips can appear limitless, and all-day battery life is now usually a given.

Of course, the world has not stood still. Some Windows laptops are now arriving with the Qualcomm Snapdragon X Elite, and these ultraportables often nearly match Apple silicon for performance and battery life.

The M4's 10-core CPU and 8-core GPU, backed by 16GB of unified memory in my test system, generally outperformed the X Elite on single-core scores, but the two are now matched in multi-core performance.

These are just numbers, of course, and I prefer to rely on real-world performance. In my tests, the MacBook Air 13 and its M4 chip handled everything I threw at it. It can be difficult to stress out the system – I played the AAA game Lies of P at maximum settings and it was smooth as butter, thanks no doubt in part to the new Game Mode that optimizes performance for gaming.

I highly recommend getting a controller (I use one designed for the Xbox), but regardless, the new MacBook Air offers a great gaming experience with thrilling, smooth graphics, and excellent sound.

Apple MacBook 13-inch M4

(Image credit: Future)

I often ran the game alongside multiple background apps, including Final Cut Pro. I had no trouble editing four 4K 30fps streams at once, but when I loaded up four 4K 120fps clips I did notice some stuttering on video playback; since this isn't a considerably more expensive MacBook Pro, though, that doesn't concern me.

I noticed in my benchmarking that the Metal score on the MacBook Air M3 was slightly higher than that of the M4 system, but that's because the older MacBook had a 10-core GPU while my new M4 system has an eight-core GPU. You can, as I noted earlier in the price section, pay a bit more for the two extra cores. It's worth noting, though, that the differences in performance between the M3's 10-core and M4's eight-core GPUs were minimal.

The system supports Wi-Fi 6E and Bluetooth 5.3, which is good, if not entirely forward-leaning – I'd like to see Wi-Fi 7 and Bluetooth 5.4.

  • Performance score: 4.5/5

Apple MacBook Air 13-inch (M4) review: Battery life

  • 14 hours battery life (web activities)
  • Effectively lasts all day (mixed use)
  • Charges to 50% in 90 minutes; 100% in three-and-a-half hours

Apple is promising up to 18 hours of battery life from the MacBook Air 13-inch (M4), though that's mostly a measure of how long the laptop can play 1080p video; for comparison, Microsoft promises 20 hours from its Surface Laptop 7 on a similar task. The MacBook Air 13 M4's real-world battery life will vary significantly when you're performing a mix of sometimes CPU-intensive tasks.

Apple MacBook Air 13-inch (M4) REVIEW

(Image credit: Future / Lance Ulanoff)

In my tests, which included playing games (which made the base of the laptop quite warm), editing video, opening multiple browser windows and streaming video, battery life came in around eight hours. That’s quite good for a hard day of work, and especially for such a thin and light laptop. In our Future Labs test, which is primarily web browsing, the MacBook Air 13-inch (M4) managed 14 hours, 51 minutes, which is about 30 minutes longer than the M3 but for slightly different tasks.

Overall, you're getting good, all-day battery life, but your experience will vary based on the tasks you perform.

After I drained the laptop to zero, I recharged it using the included 30W charger (the more expensive 24GB model comes with a 35W charger) and the color-matched Sky Blue woven MagSafe cable, reaching 50% in 90 minutes and 100% in three-and-a-half hours.

  • Battery score: 5/5

Should you buy the Apple MacBook Air 13-inch (M4)?

Buy it if...

You want the best ultraportable experience
The MacBook Air 13-inch (M4) might look the same as last year's model, but it's a definite upgrade – and that price makes it a winner.

You like your laptops thin and light
At 0.44 inches / 1.13cm thick and just 2.7lbs / 1.24kg, the new 13-inch Air is a perfect backpack companion.

You need a good blend of power and efficiency
The MacBook Air 13-inch (M4) packs more than enough power for most users, and you can bank on all-day battery life.

Don't buy it if...

You want a touchscreen
Apple may never introduce a touchscreen MacBook. For that, look to the Surface Laptop, or an iPad Pro paired with a Magic Keyboard.

You want more AI
Apple Intelligence is showing promise, but it still pales in comparison to what you'll find on some Windows laptops with the Qualcomm Snapdragon X Elite.

Apple MacBook Air 13-inch (M4) review: Also consider

If our Apple MacBook Air 13-inch (M4) review has you considering other options, here are two laptops to consider...

Apple MacBook Air 15-inch (M4)
The MacBook Air 15-inch (M4) is virtually the same as the 13-inch model in every aspect except size (and screen size), but the base model does start with two extra GPU cores. It also gets a price reduction compared to the M3 model, so if screen real estate matters to you, this is the MacBook Air to go for.

Check out our MacBook Air 15-inch (M4) review

Dell XPS 13 Plus
Its thin and light design, stunning OLED screen, great sound quality, and comfortable keyboard make this a premium Windows 11 laptop that in many ways rivals the MacBook Air. However, it’s prone to overheating, and the touch bar is divisive.

Read more: Dell XPS 13 Plus review

How I tested the Apple MacBook Air 13-inch (M4)

Apple MacBook Air 13-inch (M4)

(Image credit: Future / Lance Ulanoff)
  • I used the Apple MacBook Air 13-inch (M4) for five days
  • I worked, played, listened, edited, and wrote this review on it
  • I usually ran multiple apps at once

After receiving my MacBook Air 13-inch (M4) review unit I immediately unboxed it and began testing, and it did not leave my side for much of the next five days.

I ran benchmarks, installed multiple apps, and then began using it to edit images and video, play AAA games, listen to music, stream movies and shows, answer email, browse the web, and generate words and images with Apple Intelligence.

I've been reviewing technology for over 30 years, and I've tested everything from DOS-based word processors to Apple's Vision Pro. I've reviewed laptops of all stripes, including traditional clamshells and convertibles. I regularly work on macOS but also use the Windows platform almost every day – I like to keep my hands in all the ecosystems.

Read more about how we test

First reviewed March 2025

I’ve reviewed three generations of 3D V-cache processors, and the AMD Ryzen 9 9950X3D is the best there is
4:00 pm |

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , | Comments: Off

AMD Ryzen 9 9950X3D: Two-minute review

The AMD Ryzen 9 9950X3D has something of a high bar to clear given the strength of AMD's first Zen 5 3D V-Cache chip, the Ryzen 7 9800X3D, but having spent a week testing this chip, I can say unequivocally that AMD has produced the best processor ever made for the consumer market.

Whether it's gaming, creating, or general productivity work, the Ryzen 9 9950X3D doesn't suffer from the same hang-ups that kept its predecessor, the AMD Ryzen 9 7950X3D, from completely dominating its competition among the previous generation of processors.

Like its predecessor, the Ryzen 9 9950X3D will sell for $699 / £699 / AU$1,349 when it goes on sale on March 12, 2025. This makes it the most expensive consumer processor on the market, so be prepared to invest quite a bit in this chip, especially if you're upgrading from an Intel or AMD AM4 system; as an AM5 chip, it will require you to upgrade some major components, including the motherboard and possibly your RAM.

Unlike nearly all other X3D chips besides the 9800X3D and 9900X3D, however, the Ryzen 9 9950X3D is fully overclockable thanks to AMD rearchitecting the way the 3D V-cache sits on the compute die, so there's a lot more that this chip can do that other X3D chips can't.

That includes beating out the current champ for the best gaming CPU, the 9800X3D, in most games while also offering substantially better general and creative performance thanks to twice as many processing cores.

That doesn't mean that the AMD Ryzen 9 9950X3D is flawless, as there are some things to caveat here (which I'll get into in more depth below), but as an overall package, you simply won't find a better CPU on the market right now that will let you do just about anything you want exceptionally well while still letting you run a more reasonable cooling solution. Just be prepared to pay a premium for all that performance.

AMD Ryzen 9 9950X3D: Price & availability

An AMD Ryzen 9 9950X3D leaning against its retail packaging

(Image credit: Future / John Loeffler)
  • How much will it cost? US MSRP is $699 / £699 / AU$1,349
  • When is it available? It goes on sale on March 12, 2025
  • Where is it available? It will be available in the US, UK, and Australia at launch

The Ryzen 9 9950X3D goes on sale March 12, 2025, for a US MSRP of $699 / £699 / AU$1,349 in the US, UK, and Australia, respectively, making it the most expensive processor on the market.

It comes in at the same price its predecessor, the Ryzen 9 7950X3D, launched at, and costs $100 more than the Ryzen 9 9900X3D, which launches on the same day.

This is also just over $200 more expensive than the Ryzen 7 9800X3D, which offers nearly the same level of gaming performance (and in some cases surpasses the 9950X3D), so if you're strictly looking for a gaming CPU, the 9800X3D might be the better value.

Compared to Intel's latest flagship processor, meanwhile, the Ryzen 9 9950X3D is just over $100 more expensive than the Intel Core Ultra 9 285K, though that chip requires a whole new motherboard chipset if you're coming from an Intel LGA 1700 chip like the Intel Core i9-12900K, so it might represent a much larger investment overall.

  • Value: 3.5 / 5

AMD Ryzen 9 9950X3D: Specs

  • 128MB L3 Cache (96MB + 32MB)
  • Fully overclockable
  • Not all processing cores have access to 3D V-cache

Compared to the Ryzen 9 7950X3D, there don't seem to be too many changes spec wise, but there's a lot going on under the hood here.

First, the way the 3D V-cache is seated on the CCX in the 9950X3D differs considerably from the 7950X3D: it's now seated underneath the processing die rather than above it.

This means the processing cores are now in 'direct' contact with the lid and the chip's cooling solution, allowing the 9950X3D to be fully overclocked. In the 7950X3D, by contrast, the V-cache sat between the lid and the processing cores, making careful thermal design and power limiting necessary and ruling out overclocking.

The 9950X3D does keep the same two-module split in its L3 cache as the 7950X3D, so only one of the chip's eight-core CCXs actually has access to the added V-cache (32MB native plus 64MB stacked), while the other has just its native 32MB.

In theory, this gives the cores on the V-cache CCX more dedicated, direct access to a larger pool of cache. In the last generation, though, this honestly produced somewhat mixed results compared to the 7800X3D, which didn't split the V-cache up this way, and which ultimately delivered higher gaming performance.

Whatever issue there was with the 7950X3D looks to have been largely fixed with the 9950X3D, but some hiccups remain, which I'll get to in the performance section.

Beyond that, the 9950X3D has slightly higher base and boost clock speeds, as well as a 50W higher TDP, but its 170W TDP isn't completely unmanageable, especially next to Intel's competing chips.

  • Specs: 4.5 / 5

AMD Ryzen 9 9950X3D: Performance

An AMD Ryzen 9 9950X3D in a motherboard

(Image credit: Future / John Loeffler)
  • Almost best-in-class gaming performance
  • Strong overall performance

While the Ryzen 7 7800X3D was indisputably a better gaming chip than the Ryzen 9 7950X3D by the numbers, I was very curious going into my testing how this chip would fare against the 9800X3D. I'm happy to report that not only is it better on the whole when it comes to gaming, it's a powerhouse for general computing and creative work as well, making it the best all-around processor on the market right now.

On the synthetic side, the Ryzen 9 9950X3D goes toe-to-toe with the Intel Core Ultra 9 285K in multi-core performance, coming within 2% of Intel's best on average, and chalking up a 10% stronger single-core result than the 285K.

Compared to its predecessor, the 7950X3D, the 9950X3D is about 15% faster in multi-core and single-core performance, while also barely edging out the Ryzen 9 9950X in multi-core performance.

Compared to the Ryzen 7 9800X3D, the eight-core difference between the two really shows up in the results, with the 9950X3D posting a 61% better multi-core performance, and a roughly 5% better single core score compared to the 9800X3D.

On the creative front, the 9950X3D outclasses Intel's best and anything else in the AMD Ryzen lineup that I've tested overall (we'll see how it fares against the 9900X3D once I've had a chance to test that chip), though it is worth noting that the Intel Core Ultra 9 285K is still the better processor for video editing work.

The AMD Ryzen X3D line is all about gaming though, and here, the Ryzen 9 9950X3D posts the best gaming performance of all the chips tested, with one caveat.

In the Total War: Warhammer III Mirrors of Madness benchmark, the Ryzen 9 9950X3D scores only a few fps higher than the non-X3D Ryzen 9 9950X (331 fps to 318 fps, respectively), while also scoring substantially lower than the 9800X3D's 506 fps in that same benchmark. That's a roughly 35% slower showing for the 9950X3D, and given that it's roughly where the non-X3D chip scored, it's clear that Total War: Warhammer III was running on cores that didn't have access to the extra V-cache.
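Those deltas check out if you run the arithmetic on the fps figures quoted above (a quick sketch; the numbers are from my testing, and the dictionary is just for illustration):

```python
# Total War: Warhammer III Mirrors of Madness results from my testing
warhammer_fps = {"9950X3D": 331, "9950X": 318, "9800X3D": 506}

# How far the 9950X3D trails the 9800X3D in this benchmark
deficit = (warhammer_fps["9800X3D"] - warhammer_fps["9950X3D"]) / warhammer_fps["9800X3D"] * 100
print(f"9950X3D trails 9800X3D by {deficit:.1f}%")  # 34.6%, i.e. roughly 35% slower

# How close it sits to the non-X3D 9950X
lead = (warhammer_fps["9950X3D"] - warhammer_fps["9950X"]) / warhammer_fps["9950X"] * 100
print(f"9950X3D leads 9950X by only {lead:.1f}%")   # 4.1%, squarely in non-X3D territory
```

A single-digit lead over the non-X3D part, next to a 35% gap to the 9800X3D, is the tell that the game landed on the wrong CCX.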

This is an issue with the Windows process scheduler that might be fixed in time so that games are run on the right cores to leverage the extra cache available, but that's not a guarantee the way it is with the 9800X3D, which gives all cores access to its added V-cache so there aren't similar issues.

It might be a fairly rare occurrence, but if your favorite game doesn't take advantage of the extra cache that you're paying a lot of money for, that could be an issue, and it might not be something you'll ever know unless you have a non-X3D 9950X handy to test the way I do.

With that in mind, if all you want is a gaming processor, and you really don't care about any of these other performance categories, you're probably going to be better served by the 9800X3D, as you will get guaranteed gaming performance increases, even if you don't get the same boost in other areas.

While that's a large caveat, it can't take away from the overall performance profile of this chip, which is just astounding pretty much across the board.

If you want the best processor on the market overall, this is it, even with its occasional blips, especially since it runs much cooler than Intel's chips and its power draw is much more acceptable for midrange PCs to manage.

  • Performance: 4.5 / 5

Should you buy the AMD Ryzen 9 9950X3D?

A masculine hand holding an AMD Ryzen 9 9950X3D processor

(Image credit: Future / John Loeffler)

Buy the AMD Ryzen 9 9950X3D if...

You want spectacular performance no matter the workload
While gamers will be especially interested in this chip, its real strength is that it's strong everywhere.

You want the best gaming performance
When using 3D V-cache, this processor's gaming chops are unbeatable.

Don't buy it if...

You want consistent top-tier gaming performance
When games run on one of this chip's 3D V-cache cores, you're going to get the best performance possible, but Windows might not assign a game to those cores, so you might miss out on this chip's signature feature.

You're on a budget
This chip is crazy expensive, so only buy it if you're flush with cash.

Also consider

AMD Ryzen 7 9800X3D
If you want consistent, top-tier gaming performance, the 9800X3D will get you performance nearly as good as this chip's, though more consistently.

Read the full AMD Ryzen 7 9800X3D review

How I tested the AMD Ryzen 9 9950X3D

  • I spent several days with the AMD Ryzen 9 9950X3D
  • I used the chip as my main workstation processor and used my updated battery of benchmarks to measure its performance
  • I used it for general productivity, creative, and gaming workloads

I spent about a week with the Ryzen 9 9950X3D as my main workstation CPU, where I ran basic computing workloads as well as extensive creative work, such as Adobe Photoshop.

I also spent as much time as I could gaming with the chip, including titles like Black Myth: Wukong and Civilization VII. I also used my updated suite of benchmark tools including industry standard utilities like Geekbench 6.2, Cyberpunk 2077, and PugetBench for Creators.

I've been reviewing components for TechRadar for three years now, including more than a dozen processor reviews in that time, so you can trust my testing process and recommendations if you're looking for the best processor for your needs and budget.

  • First reviewed March 2025
The AMD RX 9070 XT delivers exactly what the market needs with stunning performance at an unbeatable price
5:00 pm | March 5, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

AMD Radeon RX 9070 XT: Two-minute review

AMD had one job to do with the launch of its RDNA 4 graphics cards, spearheaded by the AMD Radeon RX 9070 XT, and that was to not get run over by Blackwell too badly this generation.

With the RX 9070 XT, not only did AMD manage to hold its own against the GeForce RTX monolith, it perfectly positions Team Red to take advantage of the growing discontent among gamers upset over Nvidia's latest GPUs with one of the best graphics cards I've ever tested.

The RX 9070 XT is without question the most powerful consumer graphics card AMD's put out, beating the AMD Radeon RX 7900 XTX overall and coming within inches of the Nvidia GeForce RTX 4080 in 4K and 1440p gaming performance.

It does so with an MSRP of just $599 (about £510 / AU$870), which is substantially lower than those two cards' MSRPs, much less their asking prices online right now. This matters because AMD traditionally hasn't faced the kind of scalping and price inflation that Nvidia's GPUs experience (it does happen, obviously, but not nearly to the same extent as with Nvidia's RTX cards).

That means, ultimately, that gamers who look at the GPU market and find empty shelves, extremely distorted prices, and uninspiring performance for the price they're being asked to pay have an alternative that will likely stay within reach, even if price inflation keeps it above AMD's MSRP.

The RX 9070 XT's performance comes at a bit of a cost though, such as the 309W maximum power draw I saw during my testing, but at this tier of performance, this actually isn't that bad.

This card also isn't too great when it comes to non-raster creative performance and AI compute, but no one is looking to buy this card for its creative or AI chops; Nvidia already has those categories on lock. No, this is a card for gamers, and for that, you just won't find a better one at this price. Even if the price does get hit with inflation, it'll still likely be way lower than what you'd have to pay for an RX 7900 XTX or RTX 4080 (assuming you can find them at this point), making the AMD Radeon RX 9070 XT a gaming GPU that everyone can appreciate and maybe even buy.

AMD Radeon RX 9070 XT: Price & availability

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $599 (about £510 / AU$870)
  • When can you get it? The RX 9070 XT goes on sale March 6, 2025
  • Where is it available? The RX 9070 XT will be available in the US, UK, and Australia at launch

The AMD Radeon RX 9070 XT is available as of March 6, 2025, starting at $599 (about £510 / AU$870) for reference-spec third-party cards from manufacturers like Asus, Sapphire, Gigabyte, and others, with OC versions and those with added accoutrements like fancy cooling and RGB lighting likely selling for higher than MSRP.

At this price, the RX 9070 XT comes in about $150 cheaper than the RTX 5070 Ti, and about $50 more expensive than the RTX 5070 and the AMD Radeon RX 9070, which also launches alongside the RX 9070 XT. This price also puts the RX 9070 XT on par with the MSRP of the RTX 4070 Super, though this card is getting harder to find nowadays.

While I'll dig into performance in a bit, given the MSRP (and the reasonable hope that this card will be findable at MSRP in some capacity) the RX 9070 XT's value proposition is second only to the RTX 5070 Ti's, if you're going by its MSRP. Since price inflation on the RTX 5070 Ti will persist for some time at least, in many cases you'll likely find the RX 9070 XT offers better performance per price paid of any enthusiast card on the market right now.

  • Value: 5 / 5

AMD Radeon RX 9070 XT: Specs

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • PCIe 5.0, but still just GDDR6
  • Hefty power draw

The AMD Radeon RX 9070 XT is the first RDNA 4 card to hit the market, so it's worth digging into its architecture for a bit.

The new architecture is built on TSMC's N4P node, the same as Nvidia Blackwell, and in a move away from AMD's MCM push with the last generation, the RDNA 4 GPU is a monolithic die.

As there's no direct predecessor for this card (or for the RX 9070, for that matter), there's not much we can compare the RX 9070 XT against apples-to-apples, but if it had a last-gen equivalent, it would sit roughly between the RX 7800 XT and the RX 7900 GRE.

The Navi 48 GPU in the RX 9070 XT sports 64 compute units, breaking down into 64 ray accelerators, 128 AI accelerators, and 64MB of L3 cache. Its cores are clocked at 1,600MHz to start, but can run as fast as 2,970MHz, just shy of the 3GHz mark.

It uses the same GDDR6 memory as the last-gen AMD cards, with a 256-bit bus and a 644.6GB/s memory bandwidth, which is definitely helpful in pushing out 4K frames quickly.

The TGP of the RX 9070 XT is 304W, which is a good bit higher than the RX 7900 GRE, though for that extra power, you do get a commensurate bump up in performance.

  • Specs: 4 / 5

AMD Radeon RX 9070 XT: Design

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • No AMD reference card
  • High TGP means bigger coolers and more cables

There's no AMD reference card for the Radeon RX 9070 XT, but the unit I got to test was the Sapphire Pulse Radeon RX 9070 XT, which I imagine is pretty indicative of what we can expect from the designs of the various third-party cards.

The 304W TGP all but ensures that any version of this card you find will be a triple-fan cooler over a pretty hefty heatsink, so it's not going to be a great option for small form factor cases.

Likewise, that TGP puts it just over the line where it needs a third 8-pin PCIe power connector, something you may or may not have available in your rig, so keep that in mind. Even if you do have three spare power connectors, cable management will almost certainly be a hassle.

After that, it's really just about aesthetics, as the RX 9070 XT (so far) doesn't have anything like the dual pass-through cooling solution of the RTX 5090 and RTX 5080, so it's really up to personal taste.

As for the card I reviewed, the Sapphire Pulse shroud and cooling setup on the RX 9070 XT was pretty plain, as far as desktop GPUs go, but if you're looking for a non-flashy look for your PC, it's a great-looking card.

  • Design: 4 / 5

AMD Radeon RX 9070 XT: Performance

An AMD Radeon RX 9070 XT in a test bench

(Image credit: Future / John Loeffler)
  • Near-RTX 4080 levels of gaming performance, even with ray tracing
  • Non-raster creative and AI performance lags behind Nvidia, as expected
  • Likely the best value you're going to find anywhere near this price point
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

Simply put, the AMD Radeon RX 9070 XT is the gaming graphics card we've been clamoring for this entire generation. While it shows some strong performance in synthetics and raster-heavy creative tasks, gaming is where this card really shines, coming within 7% of the RTX 4080 overall and within 4% of its overall gaming performance specifically. For a card launching at half the RTX 4080's launch price, this is a fantastic showing.

The RX 9070 XT is squaring up against the RTX 5070 Ti, however, and here the RTX 5070 Ti does manage to pull well ahead of the RX 9070 XT, but it's much closer than I thought it would be starting out.

On the synthetics side, the RX 9070 XT excels at rasterization workloads like 3DMark Steel Nomad, while the RTX 5070 Ti wins out in ray-traced workloads like 3DMark Speed Way, as expected, but AMD's 3rd generation ray accelerators have definitely come a long way in catching up with Nvidia's more sophisticated hardware.

Also, as expected, when it comes to creative workloads, the RX 9070 XT performs very well in raster-based tasks like photo editing, and worse at 3D modeling in Blender, which is heavily reliant on Nvidia's CUDA instruction set, giving Nvidia an all but permanent advantage there.

In video editing, the RX 9070 XT likewise lags behind, though it's still close enough to Nvidia's RTX 5070 Ti that video editors won't notice much difference, even if the difference is there on paper.

Gaming performance is what we're on about though, and here the sub-$600 GPU holds its own against heavy hitters like the RTX 4080, RTX 5070 Ti, and Radeon RX 7900 XTX.

In 1440p gaming, the RX 9070 XT is about 8.4% faster than the RTX 4070 Ti and RX 7900 XTX, just under 4% slower than the RTX 4080, and about 7% slower than the RTX 5070 Ti.

This strong performance carries over into 4K gaming as well, thanks to the RX 9070 XT's 16GB VRAM. Here, it's about 15.5% faster than the RTX 4070 Ti and about 2.5% faster than the RX 7900 XTX. Against the RTX 4080, the RX 9070 XT is just 3.5% slower, while it comes within 8% of the RTX 5070 Ti's 4K gaming performance.

When all is said and done, the RX 9070 XT doesn't quite overpower one of the best Nvidia graphics cards of the last generation (and definitely doesn't topple the RTX 5070 Ti), but given its performance class, its power draw, its heat output (which wasn't nearly as bad as the power draw might suggest), and most of all its price, the RX 9070 XT is easily the best value of any graphics card playing at 4K.

And given Nvidia's position with gamers right now, AMD has a real chance to win over some converts with this graphics card, and anyone looking for an outstanding 4K GPU absolutely needs to consider it before making their next upgrade.

  • Performance: 5 / 5

Should you buy the AMD Radeon RX 9070 XT?

Buy the AMD Radeon RX 9070 XT if...

You want the best value proposition for a high-end graphics card
The performance of the RX 9070 XT punches way above its price point.

You don't want to pay inflated prices for an Nvidia GPU
Price inflation is wreaking havoc on the GPU market right now, but this card might fare better than Nvidia's RTX offerings.

Don't buy it if...

You're on a tight budget
If you don't have a lot of money to spend, this card is likely more than you need.

You need strong creative or AI performance
While AMD is getting better at creative and AI workloads, it still lags far behind Nvidia's competing offerings.

How I tested the AMD Radeon RX 9070 XT

  • I spent about a week with the AMD Radeon RX 9070 XT
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week with the AMD Radeon RX 9070 XT, benchmarking it, using it day to day, and digging into the card's hardware to come to my assessment.

I used industry-standard benchmark tools like 3DMark, Cyberpunk 2077, and PugetBench for Creators to get results comparable with other competing graphics cards, all of which have been tested using the same testbench setup listed above.

I've reviewed more than 30 graphics cards in the last three years, and so I've got the experience and insight to help you find the best graphics card for your needs and budget.

  • Originally reviewed March 2025
Huawei Mate 70 Pro gets a Premium Edition with CPU downgrade
1:01 pm | February 28, 2025

Author: admin | Category: Mobile phones news | Tags: , | Comments: Off

Huawei introduced the Mate 70 Pro back in November with a Kirin 9020 chipset, and today the company introduced a somewhat confusing alternative version. Dubbed the Premium Edition, it will go on sale on March 5. The name suggests an improved version, but in reality it has an underclocked CPU and a lower price tag on the company's Vmall store. The listing doesn't actually mention the chip downgrade – we caught that in a separate Geekbench listing. The Huawei Mate 70 Pro Premium Edition has the model number PLR-AL50 and scores 1,450 in single-core and 3,793 in multi-core tests. These are...

I tested the iPhone 16e for a week and found it’s a good phone that stretches the definition of ‘budget’
5:00 am | February 27, 2025

Author: admin | Category: Computers Gadgets iPhone Phones | Tags: , , , | Comments: Off

Apple iPhone 16e: Two-Minute Review

The iPhone 16e is a good phone. It has a pleasing design, and it feels like a true member of the iPhone 16 family. It is not a great phone, though – how could it be with a retro notch in the Super Retina XDR display and just a single 48MP camera?

There are 'budget' phones that cost far less and which have larger screens and multiple rear cameras. They're not iOS handsets, and that counts for something – any new iPhone joins an expansive and well-designed ecosystem offering connective tissue between excellent Apple services and other Apple hardware. I mostly live in that world now, and I appreciate how well my iPhone 16 Pro Max works with, for instance, my Mac, and how all my cloud-connected services know it's me on the line.

It's been a while since I've had such conflicting feelings about an iPhone. I appreciate that Apple thought it was time to move away from the iPhone SE design language, one that owed most of its look and feel to 2017's iPhone 8. I'm sure Apple couldn't wait to do away with the Lightning port and the Home button with Touch ID (which lives on in Macs and some iPads). But instead of giving us something fresh, Apple took a bit of this and a bit of that to cobble together the iPhone 16e.

The display is almost the best Apple has to offer if you can ignore the notch, aren't bothered by larger bezels, and don't miss the Dynamic Island too much. The main 48MP Fusion camera is very good and shoots high-quality stills and videos, but don't be fooled by the claims of 2x zoom, which is actually a 12MP crop on the middle of the 48MP sensor. I worry that people paying $599 / £599 / AU$999 for this phone will be a little frustrated that they're not at least getting a dedicated ultra-wide camera at that price.

Conversely, there is one bit of this iPhone 16e that's not only new but is, for the moment, unique among iPhone 16 devices: the C1 chip. I don't know why Apple's cheapest iPhone got this brand-new bit of Apple silicon, but it does a good job of delivering 5G and even satellite connectivity. Plus, it starts moving Apple out from under the yoke of Qualcomm, Apple's cellular modem chip frenemy. That relationship has been fraught for years, and I wonder if Apple had originally hoped to put the C1 in all iPhone 16 models but the development schedule slipped.

Apple iPhone 16e REVIEW

The iPhone 16e (center) with the iPhone 16 (right) and iPhone SE 3 (left). (Image credit: Future / Lance Ulanoff)

In any case, while it's hard to measure the connectivity benefits (it's another good 5G modem), Apple says this is the most efficient cellular modem it's ever put in an iPhone (that seems like a swipe at Qualcomm), and helps to deliver stellar battery life: a claimed 26 hours of video streaming. Battery life in real-world use will, naturally, be a different story.

On balance, I like this phone's performance (courtesy of the A18 chip and 8GB of RAM), its looks, and how it feels in the hand (a matte glass back and Ceramic Shield front), and I think iOS 18 with Apple Intelligence is well-thought-out and increasingly intelligent (though Siri remains a bit of a disappointment); but if you're shopping for a sub-$600 phone, there may be other even better choices from the likes of Google (Pixel 8a), OnePlus (OnePlus 13R) and the anticipated Samsung Galaxy S25 FE. You just have to be willing to leave the Apple bubble.

Apple iPhone 16e: Price and availability

Apple unveiled the iPhone 16e on February 19, 2025. It joins the iPhone 16 lineup, and starts at $599 / £599 / AU$999 with 128GB of storage, making it the most affordable smartphone of the bunch. It's available in black or white.

While some might consider the iPhone 16e to be the successor to the iPhone SE 3, it has little in common with that device. In particular, that was a $429 phone. At $599, Apple might be stretching the definition of budget, but it is $200 cheaper than the base iPhone 16. The phone's price compares somewhat less favorably outside the iOS sphere. The OnePlus 13R for instance is a 6.7-inch handset with three cameras, and the Google Pixel 8a matches the iPhone 16e's 6.1-inch screen size (though at a lower resolution), but also includes two rear cameras.

You won't find more affordable new phones in the iOS space. The iPhone 15 has the main and ultra-wide camera and the Dynamic Island, but it costs $699 / £699 / AU$1,249. A refurbished iPhone 14 costs $529, but neither it nor the iPhone 15 supports Apple Intelligence.

  • Value score: 4/5

Apple iPhone 16e: Specs

Apple iPhone 16e: Design

  • No trace of the iPhone SE design remains
  • Hybrid iPhone 14/15 design
  • Sharper edges than the current iPhone 16 design
Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

There's no question that the iPhone 16e is a part of the iPhone 16 family. At a glance, especially when the screen is off, it's almost a dead ringer for the base model; the aerospace aluminum frame is only slightly smaller.

Upon closer examination, those similarities recede, and I can see the myriad differences that make this a true hybrid design. This is now the only iPhone with a single camera, which almost looks a little lonely on the matte glass back. The edges of the metal band that wraps around the body are noticeably sharper than those of any other iPhone 16, but the phone still feels good in the hand.

The button configuration is essentially what you'd find on an iPhone 15. There's the power / sleep / Siri button on the right, and on the left are the two volume buttons and the Action button. Unlike the rest of the iPhone 16 lineup, the 16e doesn't get the Camera Control, but at least the Action button is configurable, so you can set it to activate the camera or toggle the Flashlight, Silent Mode, Voice Memo, and more. I set mine to launch Visual Intelligence, an Apple Intelligence feature: you press and hold the Action button to open it, press again to grab a photo, and then select on-screen whether you want ChatGPT or Google Search to handle the query. Apple Intelligence can also analyze the image directly and identify the subject.

The phone is IP68 rated against water and dust, and can survive a dunk in six meters of water for 30 minutes. The screen is protected by Ceramic Shield to guard against drops, though I'm not sure it does much to prevent scratches.

I put a case on the phone, never dropped it, and handled it gingerly, and yet within a day I noticed a long scratch on the screen, although I have no recollection of brushing the display against anything. I had a similar situation with the Samsung Galaxy S25 Ultra; I await the phone that can handle life in my pocket (empty other than the phone) without sustaining a scratch.

Overall, if you like the looks of the iPhone 16 lineup (or even the iPhone 14 and 15 lineups) the iPhone 16e will not disappoint.

  • Design score: 4 / 5

Apple iPhone 16e: Display

  • Almost Apple's best smartphone display
  • The notch is back
  • The bezels are a little bigger

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

If you're coming from the iPhone SE to the iPhone 16e, you're in for quite a shock. This 6.1-inch Super Retina XDR OLED screen is nothing like the 4.7-inch LCD display on that now-retired design.

The iPhone 16e features a lovely edge-to-edge design – with slightly larger bezels than you'll find on other iPhone 16 phones – that leaves no room for the dearly departed Touch ID Home button. Instead, this phone adopts Face ID biometric security, which is, as far as I'm concerned, probably the best smartphone face recognition in the business. Face ID lives in the TrueDepth camera system notch, which also accommodates, among other things, the 12MP front-facing camera, microphone, and proximity sensor.

While I never had a big problem with the notch, I can't say I'm thrilled to see it return here. The rest of the iPhone 16 lineup features the versatile Dynamic Island, which I think most would agree is preferable to this cutout.

Apple iPhone 16e REVIEW

The iPhone 16e (left) next to the iPhone SE 3 (middle), and the iPhone 16. (Image credit: Future / Lance Ulanoff)

The iPhone 16e shares the iPhone 16's 460ppi pixel density, but it does lose a few pixels (2532 x 1170 versus 2556 x 1179 for the iPhone 16). It still supports True Tone, Wide color (P3), and a 2,000,000:1 contrast ratio. The only area where it loses a bit of oomph is brightness: peak brightness for HDR content is 1,200 nits, and 800 nits for everything else, while the iPhone 16's peak outdoor brightness is 2,000 nits. As with other non-Pro models, the refresh rate on the iPhone 16e sits at a fixed 60Hz.
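As a quick sanity check on the quoted 460ppi figure, pixel density follows directly from the resolution and the screen diagonal. The 6.06in diagonal used below is an assumption (Apple markets the screen as 6.1 inches but measures it as a standard rectangle, which comes out slightly smaller):

```python
import math

# Pixel density = diagonal in pixels / diagonal in inches.
w_px, h_px = 2532, 1170          # iPhone 16e resolution, from the text
diag_px = math.hypot(w_px, h_px)  # ~2789 px across the diagonal

for diag_in in (6.1, 6.06):       # marketed vs. assumed measured diagonal
    print(f'{diag_in}" diagonal -> {diag_px / diag_in:.0f} ppi')
```

With the rounded 6.1in figure the math lands at roughly 457ppi, and with the slightly smaller assumed diagonal it lands on the quoted 460ppi, so the spec is internally consistent.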

Even so, I had no trouble viewing the iPhone 16e screen in a wide variety of lighting situations, and any shortcomings are only evident in the brightest, direct sunlight.

In day-to-day use, everything from photos and video to AAA games, apps, and websites looks great on this display. Colors are bright and punchy, and the blacks are inky. I'm not distracted by the notch on games, where it can cut a bit into the gameplay view, and most video streaming defaults to a letterbox format that steers clear of it, with black bars on the left and right sides of the screen.

  • Display score: 4 / 5

Apple iPhone 16e: Software and Apple Intelligence

  • iOS 18 is a rich and well-thought-out platform
  • Apple Intelligence has some impressive features, but we await the Siri of our dreams
  • Mail and photo redesigns leave something to be desired

iOS 18 is now smarter, more proactive, and more customizable than ever before. I can transform every app icon from 'Light' to 'Tinted' (monochromatic), fill my home screen with widgets, and expand them until they almost fill the screen. This customizability carries through to the Control Center, which is now a multi-page affair that I can leave alone, or completely reorganize so the tools I care about are available with a quick swipe down from the upper-right corner.

Apple iPhone 16e REVIEW

(Image credit: Future)

Apple Intelligence, which Apple unveiled last June, is growing in prominence and utility. It lives across apps like Messages and Mail in Writing Tools, which is buried deep enough that I often forget it exists. It's in notification summaries that can be useful for at-a-glance action but are sometimes a bit confusing, and in image-generation tools like Image Playground and Genmojis.

It's also in Visual Intelligence, which, as I have it set up, gives me one-button access to ChatGPT and Google Search.

Apple iPhone 16e review

Apple Intelligence Clean Up does an excellent job of removing those big lights (Image credit: Future / Lance Ulanoff)

See? (Image credit: Future / Lance Ulanoff)

I think I prefer the more utilitarian features of Apple Intelligence like Clean Up. It lets you quickly remove people and objects from photos as if they were never there in the first place.

I'm also a fan of Audio Mix, which is not a part of Apple Intelligence, but uses machine learning to clean up the messiest audio to make it usable in social media, podcasts, or just for sharing with friends.

iOS 18 also features updated Photos and Mail apps with Apple Intelligence. I've struggled a bit with how Photos reorganized my images, and I've had similar issues with how Mail is now reorganizing my emails. I hope Apple takes another run at these apps in iOS 19.

Siri is smarter and more aware of iPhone features than before. It can handle my vocal missteps and still know what I want, but it remains mostly unaware of my on-device information, and it feels far less conversational and powerful as a chatbot than Google Gemini or ChatGPT.

  • Software score: 4.5 / 5

Apple iPhone 16e: Camera

  • 48MP Fusion is a good camera
  • The front-facing camera shines as well
  • A single rear camera at this price is disappointing

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

With a more powerful CPU, a bigger screen, and the new C1 chip, I can almost understand why Apple set the iPhone 16e price as high as it did. Almost… until I consider the single, rear 48MP Fusion camera. Most smartphones in this price range feature at least two lenses, and usually the second one is an ultra-wide – without that lens you miss out on not only dramatic ultra-wide shots but also macro photography capabilities. Had Apple priced this phone at $499, I might understand.

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

Still, I like this camera. It defaults to shooting at 24MP, a pixel-binned output from the 48MP sensor, with two sensor pixels contributing to each image pixel to double the image information. There's a useful 2x zoom option, but it shoots at 12MP because it uses only the central 12 megapixels of the full 48MP frame. These images are still good, just not at the resolution of the default output or a full-sensor shot.
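The binning and crop arithmetic behind those modes can be sketched in a few lines. The 8064 x 6048 layout below is an assumption for illustration, since it's a plausible 4:3 arrangement for a nominal 48MP sensor, but Apple doesn't publish the exact active-array dimensions:

```python
# Rough arithmetic for the sensor modes described above.
full_w, full_h = 8064, 6048                 # assumed 4:3 layout of a 48MP sensor
full_mp = full_w * full_h / 1e6             # ~48.8 MP total

# Default 24MP output: two sensor pixels contribute to each image pixel.
default_mp = full_mp / 2                    # ~24.4 MP

# "2x zoom": crop the central quarter of the frame (half width, half height).
crop_w, crop_h = full_w // 2, full_h // 2   # 4032 x 3024
crop_mp = crop_w * crop_h / 1e6             # ~12.2 MP

print(f"full: {full_mp:.1f} MP, default: {default_mp:.1f} MP, 2x crop: {crop_mp:.1f} MP")
```

This is why the 2x shots come out at 12MP: halving both dimensions quarters the pixel count, with no optical magnification involved.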

Overall, the camera shoots lovely photos with exquisite detail and the kind of color fidelity I appreciate (in people and skies especially) in a wide variety of scenarios. I captured excellent still lifes, portraits, and night-mode shots. I was also impressed with the front camera, which is especially good for portrait-mode selfies. Much of this image quality is thanks to the work Apple has done on its Photonic Engine. Apple's computational image pipeline pulls out extraordinary detail and nuance in most photographic situations, even if it is for just these two cameras.

iPhone 16e camera samples

All camera samples credit: Future / Lance Ulanoff.

  • Rear camera, 2x
  • Rear camera, 1x
  • Rear camera
  • Rear camera portrait mode
  • Rear camera, 1x
  • Rear camera, 2x
  • Rear camera, 2x
  • Rear camera, 2x, night mode
  • Rear camera, 2x, night mode
  • Rear camera, 1x
  • Rear camera, 2x
  • Rear camera, 1x
  • Rear camera, 2x
  • Rear camera, 1x
  • Rear camera, 2x
  • Camera score: 4 / 5

Apple iPhone 16e: Performance

  • The A18 is an excellent and powerful CPU
  • It's ready for Apple Intelligence
  • C1, Apple's first cellular modem, is effective for 5G and satellite connectivity

If you're wondering why the successor to the iPhone SE is not a $429 smartphone, you might look at the processing combo of the powerful A18 and the new C1.

The A18 is the same chip you'll find in the iPhone 16, except that it has one fewer GPU core. I promise you'll never notice the difference.

Performance scores are excellent, and in line with the numbers we got for other A18 chips (and slightly lower than what you get from the A18 Pro in the iPhone 16 Pro and 16 Pro Max).

The A18 has more than enough power not just for day-to-day tasks like email and web browsing, but for 4K video editing (which I did in CapCut) and AAA gaming (game mode turns on automatically to divert more resources toward gaming). I played Asphalt 9 United, Resident Evil 4, and Call of Duty Mobile, and made things easier for myself by connecting my Xbox controller. My only criticism would be that a 6.1-inch screen is a little tight for these games. The audio from the stereo speakers, by the way, is excellent – I get an impressive spatial audio experience with Resident Evil 4.

Apple iPhone 16e review

(Image credit: Future / Lance Ulanoff)

There's also the new C1 chip, which is notable because it's Apple's first custom cellular modem. Previously, Apple relied on Qualcomm, among other partners, for this silicon. I didn't notice any difference in connectivity with the new chip, which is a good thing – and I was impressed that I could text via satellite.

Apple iPhone 16e REVIEW

(Image credit: Future)

I didn't think I'd get to test this feature, but AT&T connectivity is so bad in my New York neighborhood that the SOS icon appeared at the top of my iPhone 16e screen, and next to it I noticed the satellite icon. I opened Messages, and the phone asked if I wanted to use the satellite texting feature. I held the phone near my screen door to get a clear view of the sky, and followed the on-display guide that told me which way to point the phone. I got a 'Connected' notification, and then sent a few SMS texts over satellite. It's a nifty feature, and it was a nice little test of the C1's capabilities.

  • Performance score: 5 / 5

Apple iPhone 16e: Battery

  • Long lasting
  • Wireless charging
  • No MagSafe

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

It's clear that Apple has prioritized battery life on the iPhone 16e over some other features. That would likely explain, for instance, why we have wireless charging but not MagSafe support – adding that magnetic ring might have eaten into battery space. The C1 chip is apparently smaller than the modem chip in other iPhone 16 models, and even the decision to include one camera instead of two probably helped make room for what is a larger battery than even the one in the iPhone 16.

Apple rates the iPhone 16e for 26 hours of video-rundown battery life – that's about four hours more than the iPhone 16. In my real-world testing the battery life has been very good, but varied use can run the battery down in far fewer than 26 hours.

On one day when I did everything from email and web browsing to social media consumption and then a lot of gaming, battery life was about 12 hours – gaming in particular really chewed through the battery and made the phone pretty warm.

My own video rundown test (I played through episodes of Better Call Saul on Netflix) returned about 24 hours of battery life.

I used a 65W USB-C charger to charge the phone to 57% in 30 minutes, with a full charge taking about one hour and 50 minutes. I also tried a 20W charger, which charged the phone to 50% in 30 minutes.
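Reduced to average rates, those charging figures look like this (a rough sketch; real lithium-ion charge curves taper as the battery fills, so these interval averages understate the early rate and overstate the late one):

```python
# Average charging rates from the figures quoted above:
# (percent charged, minutes elapsed) per test run.
runs = {
    "65W, 0-57%":  (57, 30),
    "65W, 0-100%": (100, 110),   # "about one hour and 50 minutes"
    "20W, 0-50%":  (50, 30),
}

for label, (pct, minutes) in runs.items():
    print(f"{label}: {pct / minutes:.2f} %/min average")
```

The gap between the 65W and 20W half-hour results (~1.9 vs ~1.7 percent per minute) suggests the phone's intake is the bottleneck well below 65W, which is typical for a phone this size.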

  • Battery score: 5 / 5

Should you buy the Apple iPhone 16e?

iPhone 16e score card

Buy it if...

You want an affordable, smaller iPhone

This is now your only brand-new 'budget' iPhone choice.

You want sub-$600 access to Apple Intelligence

Apple squeezed an A18 chip inside this affordable iPhone to give you access to Apple's own brand of AI.

Don’t buy it if...

You're a photographer

A single, albeit excellent, rear lens won't be enough for people who like to shoot wide-angle and macros.

You never liked the notch

Apple bringing back a none-too-loved display feature doesn't make a lot of sense. If you want the Dynamic Island at a more affordable price than the iPhone 16, take a look at the iPhone 15.

You want a real zoom lens

The 2x zoom on the iPhone 16e is not a true optical zoom; instead, it's a full-frame sensor crop. If a big optical zoom is your thing, look elsewhere.

Apple iPhone 16e: Also consider

iPhone 15

For $100 more you get two cameras, the Dynamic Island, and the Camera Control.

Read TechRadar's iPhone 15 review.

Google Pixel 8a

As soon as you step outside the Apple ecosystem you'll find more affordable phones with more features. The Pixel 8a is not as powerful as the iPhone 16e, but it has a nice build, two cameras, excellent Google services integration, and affordable access to Gemini AI features.

Read TechRadar's Google Pixel 8a review.

Apple iPhone 16e: How I tested

I've reviewed countless smartphones ranging from the most affordable models to flagships and foldables. I put every phone through as many rigorous tests and everyday tasks as possible.

I had the iPhone 16e for just under a week, and after receiving it I immediately started taking photos, running benchmarks, and using it as an everyday device for photos, videos, email, social media, messaging, streaming video, and gaming.

Correction 2-27-2025: A previous version of this review listed Audio Mix as part of Apple Intelligence.

First reviewed February 26, 2025
