Organizer
Gadget news
Samsung Galaxy Tab S10 FE benchmarked, showing 32% gains in CPU performance
1:49 pm | March 12, 2025

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

Geekbench confirms the reports that the upcoming Samsung Galaxy Tab S10 FE and Tab S10 FE+ will be powered by the Exynos 1580 (S5E8855), a chipset that is currently found only inside the Galaxy A56. This brings a massive performance boost over the old Exynos 1380 that powered 2023’s Tab S9 FE and Tab S9 FE+. The tablet that ran the benchmark – the Samsung SM-X520, which should be the Tab S10 FE – was the base model with 8GB of RAM. That’s another upgrade over its predecessor, which had 6GB as its base capacity. Both new FE models will have up to 12GB of RAM and up to 256GB of storage (128GB base). Here’s...

The Apple MacBook Air 13-inch (M4) is the best ultraportable – and the new price makes it even more appealing
4:09 pm | March 11, 2025

Author: admin | Category: Computers Computing Gadgets Laptops Macbooks | Tags: , , | Comments: Off

Apple MacBook Air 13-inch (M4): Two-minute review

How do you make the best MacBook, and arguably one of the best laptops on the market, better? You could redesign it, but that’s a move fraught with potential downsides; if the current design is popular, you risk alienating fans. In that case, making small changes, especially under-the-hood ones, is probably the smart move, and it’s clearly Apple’s strategy.

The MacBook Air 13-inch (M4) is virtually indistinguishable from the M3 model. Apple has left the exquisite keyboard and responsive trackpad untouched, and the same goes for the brilliant Liquid Retina display. The 2.7lbs weight is unchanged, and even the two Thunderbolt 4 ports are essentially the same. Visually, the only addition is the Sky Blue color option, a subtle hue that can, depending on the light, look almost gray, but a second glance always reveals that pleasing, almost pastel-like azure. It’s a color that should sell out fast.


The other two significant changes are to the hardware. Replacing the FaceTime camera is the new 12MP Center Stage Camera. It’s an ultra-wide lens in a screen notch that can keep you in the frame during video calls, and it’s a nice-to-have though not earth-shattering update.

There’s also the M4 chip, which adds cores and performance over the M3 Apple silicon it replaces. Like the M3, this is a fast, efficient, 3-nanometer chip with plenty of headroom for AAA gaming, video editing, music creation and, of course, Apple Intelligence.

Apple MacBook Air 13-inch (M4) REVIEW

(Image credit: Future / Lance Ulanoff)

From one perspective, the biggest upgrade might be in the value space. Apple doubled the base memory from 8GB of unified memory to 16GB while reducing the price to $999 / £999 / AU$1,699. That’s a shocking, and very welcome, turn of events. The best MacBook is now back to its pre–MacBook Air M3 price, and better value because of it.

It really is hard to find any fault with the MacBook Air 13-inch (M4). It’s lightweight, attractive, powerful, easy to use, and up for anything. I gamed, streamed video, browsed the web, answered email, texted friends, conducted FaceTime calls, edited video, practiced guitar, and wrote this review on it. I’m not concerned about the lack of design changes, and I like the new color, the Center Stage Camera, and especially the price. I would not be surprised to see the MacBook Air 13-inch (M4) rise to the very top of our best laptops list.

Apple MacBook Air 13-inch (M4) review: Price and availability

  • Starts at $999 / £999 / AU$1,699
  • Lower launch price than the discontinued M3 model
  • M2 and M3 models no longer on the Apple Store, but M2 MacBooks can be found at third-party retailers

Rarely do I get to write about a price drop for a new product that arrives with feature enhancements. Usually, we get the same or sometimes a little less for the money. That is not the case with the MacBook Air 13-inch M4.

Even though Apple hasn't radically refreshed its best MacBook, the updates in performance, memory, and video conferencing, plus a new color, hit all the right notes – and when paired with a now $100 (in the US) lower price, they have me singing a happy tune.

Funnily enough, the first 3lb MacBook Air – the one that slid out of a manila envelope in 2008 – cost $1,799. It would take a few years for it to hit that $999 sweet spot, which it maintained until recently.

Apple MacBook Air 13-inch (M4) REVIEW

(Image credit: Future / Lance Ulanoff)

Sometimes that $999 got you a lower-end Intel Core chip, but in the age of Apple silicon we’re getting great performance and efficiency at an excellent price.

The MacBook Air 13-inch (M4) comes in three base configurations. If you upgrade to the $1,199 / £1,199 model the GPU gets a bump from eight to 10 cores, and the storage doubles to 512GB. Go for the $1,499 / £1,499 / AU$2,399 top-tier model and the base unified memory is increased from 16GB to 24GB, and you can get up to 2TB of storage. Whichever option you go for, you can upgrade the RAM to 32GB.

It’s available in the new Sky Blue (like my 256GB review unit), Midnight, Starlight, and Silver. Apple has discontinued Space Gray (for now).

Apple unveiled the MacBook Air 13-inch (M4) on March 5, 2025, and the laptop starts shipping on March 12.

  • Price score: 4.5/5

Apple MacBook Air 13-inch (M4) review: Specs

The Apple MacBook Air 13-inch (M4) comes in three pre-configured options.

Apple MacBook Air 13-inch (M4) review: Design

  • No major redesign
  • Sky Blue is subtle but attractive
  • Excellent construction, materials, keyboard, and trackpad

There are still some who mourn the passing of the original MacBook Air’s wedge design, which started at more than half an inch (1.61cm) at one end and tapered to 0.16 inches (4.06mm) at the other. That design remains so popular that the M1 model featuring it is still a top seller at Walmart.

I’ve moved on. The MacBook Air M4 is just 2.7lbs / 1.24kg, and at 11.97 x 8.46 x 0.44 inches / 30.41 x 21.5 x 1.13cm, is thinner than the OG MacBook Air was at its thickest point. This is a laptop that's built for your backpack and, yes, it’s light enough that you might forget it’s there.

Everything about the MacBook Air M4 feels premium. The 100% recycled aluminum enclosure is light but solid and has all the exacting tolerances Apple is known for. It’s a finely machined, eye-catching piece of hardware, and few laptops can match its elegance.

(Image gallery: 8 images. Image credit: Future / Lance Ulanoff)

The backlit keyboard is an absolute pleasure to type on, and has remarkable travel and response for such a thin design. It includes all your function keys and a multipurpose power / sleep / Touch ID button that’s useful for unlocking the MacBook Air and logging into various apps and services with your registered fingerprints.

I do prefer the Microsoft Surface Laptop’s Windows Hello feature, which lets you log in with your face in much the way you do with Face ID on any of the best iPhones. That said, I rarely have to touch anything anyway, because I set the MacBook Air to unlock automatically with my Apple Watch.

While Apple hasn't redesigned the keyboard, there is one small change that you might not notice at first glance: the mute key now features a speaker icon with a line through it, which matches what you see on-screen when you press the key. It's a small but clarifying change.

Apple MacBook Air 13-inch (M4) REVIEW

(Image credit: Future / Lance Ulanoff)

There’s ample room to rest your palms, and the glass-covered multi-touch trackpad is huge and responsive.

Ports and other elements are unchanged from the last two MacBook Air generations. There are two Thunderbolt 4 ports on the left side with up to 40Gbps of throughput, capable of driving two external screens even with the MacBook Air’s lid open. Next to those is the MagSafe charging port, and on the right side is the 3.5mm headphone jack.

The four-speaker stereo sound system is hidden in the hinge below the display. It can fill a room with bright, crisp audio, although it mostly lacks bass (the 15-inch model offers a six-speaker sound system with force-cancelling woofers).

  • Design score: 4.5/5

Apple MacBook Air 13-inch (M4) review: Display and Center Stage

With one exception, the 13-inch M4 MacBook Air’s display is identical to the last generation. It’s still a 13.6-inch Liquid Retina panel with 2560 x 1664 resolution and 500 nits of sustained brightness, which in my experience is viewable in direct sunlight, and support for one billion colors. It’s a fantastic display for everything from gaming to streaming to content creation.

There is a notch at the top for the camera, but most apps do not wrap around that cutout, and it’s not distracting on the desktop.

(Image gallery: 3 images. Image credit: Future / Lance Ulanoff)

The notch also contains the new 12MP Center Stage Camera. The idea is that the lens is an ultra-wide camera, but for the purposes of video conferencing it crops to an undistorted rectangle. Then, as you move around, the crop shifts to keep you in the frame. If you like to get up and walk around, or people walk in and out of the video conversation, this can be tremendously useful, and it worked well for me as long as I didn't stray too far out of frame. If you need the camera to stay still (as I do when I use the 1080p camera to go on TV), you can easily turn Center Stage off.
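Apple doesn't publish how Center Stage decides where to crop, but the basic mechanics are easy to picture: detect the subject, then slide a fixed-size crop window across the larger ultra-wide frame so the subject stays centered. Here's a minimal sketch of that idea in Python; the frame size, crop size, and face-detection step are illustrative assumptions, not Apple's implementation:

# Sketch of a Center Stage-style crop: keep the subject centered in a
# fixed-size window cut from a larger ultra-wide frame.
FRAME_W, FRAME_H = 4032, 3024   # assumed ultra-wide sensor output
CROP_W, CROP_H = 1920, 1080     # assumed video-call output window

def center_stage_crop(subject_x, subject_y):
    """Return the top-left corner of a crop centered on the subject,
    clamped so the window never leaves the full frame."""
    left = min(max(subject_x - CROP_W // 2, 0), FRAME_W - CROP_W)
    top = min(max(subject_y - CROP_H // 2, 0), FRAME_H - CROP_H)
    return left, top

# If a (hypothetical) face detector reports you at (3000, 1500),
# the crop window slides right to follow you:
print(center_stage_crop(3000, 1500))  # -> (2040, 960)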

Apple MacBook Air 13-inch (M4)

(Image credit: Future)

Compared to Microsoft’s excellent Surface Laptop 7, the screen is missing one feature: touch. I used Surface laptops for years, and I did enjoy being able to touch and even draw on the display with a dedicated Bluetooth pen. Apple has steadfastly resisted introducing touch on its MacBook line – and Apple co-founder Steve Jobs didn’t think it made sense. If you require that kind of multipurpose device, you may want to consider the M4 iPad Pro 13-inch plus a Magic Keyboard.

  • Display score: 4.5/5

Apple MacBook Air 13-inch (M4) review: macOS and Apple Intelligence

  • macOS Sequoia is a rich, deep, and well-organized platform
  • Everything is well integrated into Apple's wider ecosystem
  • Apple Intelligence can be useful, but it's not yet compelling

With macOS Sequoia, Apple has built one of the most consistent and stable desktop platforms on the planet. It virtually never crashes, and it’s full of useful features.

The latest version is mostly a refinement of the platform, but if it’s been a while since you’ve upgraded you will notice feature enhancements like better widgets and window-management tools, the excellent new Passwords app, and audio transcription in Notes.

(Image gallery: 3 images. Image credit: Future)

What’s more, macOS makes excellent use of the M4’s power.

At one point I ran Garage Band, and I was pleased to discover that not only could I use the MacBook Air to tune my guitar, but it could also tell me if I was playing my chords correctly. I also used Pixelmator Pro image and video editor (now owned by Apple) to effortlessly apply complex masks.

(Image gallery: 2 images. Image credit: Future)

Of course, the big news on the software side is Apple Intelligence, Apple’s own brand of AI, which is supported by the M4’s 16-core neural engine.

It enables features like Image Playground, which lets you imagine wild scenes that can include representations of you and others from your Photos library. It’s good fun, but I still struggle to see the utility, and I wonder when Apple will offer a more open-ended image-generation platform, one that lets me describe a complex scene in a prompt and get a result. Most Windows laptops running Copilot can do this.

(Image gallery: 4 images. Image credit: Future)

Writing Tools, which is available in Apple's native text composition apps like Notes and Mail, is useful, especially if you struggle to write clear, cogent sentences. It's of limited utility to me.

Similarly, Siri got a few nice upgrades, like the ability to respond to text prompts and better handle broken speech patterns, but it's still unable to carry on longer conversations or learn anything about you, and you still can't use it to comprehensively control your MacBook. What’s worse is that promised updates to Siri that would have made it a more able competitor to ChatGPT and Gemini have failed to materialize. At least Siri can now tap into ChatGPT (if you allow it) for more complex queries.

Safari is an excellent browser, but I still find myself using Chrome.

  • Software score: 4/5

Apple MacBook Air 13-inch (M4) review: Performance

  • M4 has more CPU cores than the M3 that preceded it
  • Ample power
  • Decent but not massive performance upgrade
  • Excellent platform and increasing Apple Intelligence capabilities
Benchmarks

Here’s how the MacBook Air 13-inch (M4) performed in our suite of benchmark tests:

Geekbench 6.2.2 Single-Core: 3,679; Multi-Core: 14,430
Geekbench Metal score (8-core GPU): 48,515
Cinebench 2024 Single-core: 165; Multi-core: 652
Battery life (web surfing): 14 hours, 51 minutes, and 59 seconds

For comparison, here’s how the MacBook Air 13-inch (M3) performed in our suite of benchmark tests:

Geekbench 6.2.2 Single-Core: 3,148; Multi-Core: 11,893
Geekbench Metal score (10-core GPU): 49,090
Cinebench 2024 Single-core: 141; Multi-core: 615
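To put those raw numbers in perspective, here's a quick sketch that works out the generation-over-generation percentage gains, using only the scores listed above:

# Quick arithmetic on the benchmark scores listed above (M4 vs M3).
m4 = {"GB6 single": 3679, "GB6 multi": 14430, "CB24 single": 165, "CB24 multi": 652}
m3 = {"GB6 single": 3148, "GB6 multi": 11893, "CB24 single": 141, "CB24 multi": 615}

for test in m4:
    gain = (m4[test] - m3[test]) / m3[test] * 100
    print(f"{test}: +{gain:.1f}%")

# GB6 single: +16.9%, GB6 multi: +21.3%, CB24 single: +17.0%, CB24 multi: +6.0%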

Ever since Apple switched from Intel to Apple silicon we’ve seen significant gains in performance and efficiency. The power of these lightweight laptops and the M-class chips can appear limitless, and all-day battery life is now usually a given.

Of course, the world has not stood still. Some Windows laptops are now arriving with the Qualcomm Snapdragon X Elite, and these ultraportables often nearly match Apple silicon for performance and battery life.

The M4 in my test system, with its 10-core CPU and 8-core GPU backed by 16GB of unified memory, generally outperformed the X Elite on single-core scores but is now matched on multi-core performance.

These are just numbers of course, and I prefer to rely on real-world performance. In my tests, the MacBook Air 13 and its M4 chip handled everything I threw at them. It can be difficult to stress out the system – I played the AAA game Lies of P at maximum settings and it was smooth as butter, thanks no doubt in part to the new Game Mode that optimizes performance for gaming.

I highly recommend getting a controller (I use one designed for the Xbox), but regardless, the new MacBook Air offers a great gaming experience with thrilling, smooth graphics, and excellent sound.

(Image gallery: 2 images. Image credit: Future)

I often ran the game alongside multiple background apps, including Final Cut Pro. I had no trouble editing four 4K 30fps streams at once, but when I loaded up four 4K 120fps clips, I did notice some stuttering on video playback. Given that this is not a considerably more expensive MacBook Pro, that doesn’t concern me.

I noticed in my benchmarking that the Metal score on the MacBook Air M3 was slightly higher than that of the M4 system, but that’s because I had a 10-core GPU on the older MacBook and just an eight-core GPU on the new M4 system. You can, as I noted earlier in the price section, pay a bit more for the two extra cores. It’s worth noting, though, that the differences in performance between the M3’s 10-core GPU and the M4’s eight-core GPU were minimal.

The system supports Wi-Fi 6E and Bluetooth 5.3, which is good, if not entirely forward-leaning – I'd like to see Wi-Fi 7 and Bluetooth 5.4.

  • Performance score: 4.5/5

Apple MacBook Air 13-inch (M4) review: Battery life

  • Nearly 15 hours of battery life (web activities)
  • Effectively lasts all day (mixed use)
  • Charges to 50% in 90 minutes; 100% in three-and-a-half hours

Apple is promising up to 18 hours of battery life from the MacBook Air 13-inch (M4), which is mostly a test of how long the laptop can play 1080p video for; for comparison, Microsoft promises 20 hours from its Surface Laptop 7 for a similar task. The MacBook Air 13 M4’s real-world battery life numbers will vary significantly when performing a mix of sometimes CPU-intensive tasks.

(Image gallery: 2 images. Image credit: Future / Lance Ulanoff)

In my tests, which included playing games (which made the base of the laptop quite warm), editing video, opening multiple browser windows and streaming video, battery life came in around eight hours. That’s quite good for a hard day of work, and especially for such a thin and light laptop. In our Future Labs test, which is primarily web browsing, the MacBook Air 13-inch (M4) managed 14 hours, 51 minutes, which is about 30 minutes longer than the M3 but for slightly different tasks.

Overall, you're getting good, all-day battery life, but your experience will vary based on the tasks you perform.

After I drained the laptop to zero, I recharged it with the included 30W charger (the more expensive 24GB model comes with a 35W charger) and the matching Sky Blue woven MagSafe cable; it reached 50% in 90 minutes and 100% in three-and-a-half hours.

  • Battery score: 5/5

Should you buy the Apple MacBook Air 13-inch (M4)?

Buy it if...

You want the best ultraportable experience
The MacBook Air 13-inch (M4) might look the same as last year's model, but it's a definite upgrade – and that price makes it a winner.

You like your laptops thin and light
At 0.44 inches / 1.13cm thick and just 2.7lbs /1.24kg, the new 13-inch Air is a perfect backpack companion.

You need a good blend of power and efficiency
The MacBook Air 13-inch (M4) packs more than enough power for most users and you can bank on all-day battery life.

Don't buy it if...

You want a touchscreen
Apple may never introduce a touchscreen MacBook. For that, look to the Surface Laptop, or an iPad Pro paired with a Magic Keyboard.

You want more AI
Apple Intelligence is showing promise, but it still pales in comparison to what you'll find on some Windows Laptops with the Qualcomm Snapdragon X Elite.

Apple MacBook Air 13-inch (M4) review: Also consider

If our Apple MacBook Air 13-inch (M4) review has you considering other options, here are two laptops to consider...

Apple MacBook Air 15-inch (M4)
The MacBook Air 15-inch (M4) is virtually the same as the 13-inch model in every aspect except size (including screen size), but the base model does start with two extra GPU cores. It also gets a price reduction compared to the M3 model, so if screen real estate matters to you, this is the MacBook Air to go for.

Check out our MacBook Air 15-inch (M4) review

Dell XPS 13 Plus
Its thin and light design, stunning OLED screen, great sound quality, and comfortable keyboard make this a premium Windows 11 laptop that in many ways rivals the MacBook Air. However, it’s prone to overheating, and the touch bar is divisive.

Read more: Dell XPS 13 Plus review

How I tested the Apple MacBook Air 13-inch (M4)

Apple MacBook Air 13-inch (M4)

(Image credit: Future / Lance Ulanoff)
  • I used the Apple MacBook Air 13-inch (M4) for five days
  • I worked, played, listened, edited, and wrote this review on it
  • I usually ran multiple apps at once

After receiving my MacBook Air 13-inch (M4) review unit I immediately unboxed it and began testing, and it did not leave my side for much of the next five days.

I ran benchmarks, installed multiple apps, and then began using it to edit images and video, play AAA games, listen to music, stream movies and shows, answer email, browse the web, and generate words and images with Apple Intelligence.

I've been reviewing technology for over 30 years, and I've tested everything from DOS-based word processors to Apple's Vision Pro. I've reviewed laptops of all stripes, including traditional clamshells and convertibles. I regularly work on macOS but also use the Windows platform almost every day – I like to keep my hands in all the ecosystems.

Read more about how we test

First reviewed March 2025

I’ve reviewed three generations of 3D V-cache processors, and the AMD Ryzen 9 9950X3D is the best there is
4:00 pm |

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , | Comments: Off

AMD Ryzen 9 9950X3D: Two-minute review

The AMD Ryzen 9 9950X3D has something of a high bar to clear given the strength of AMD's first Zen 5 3D V-Cache chip, the Ryzen 7 9800X3D. But having spent a week testing this chip, I can say unequivocally that AMD has produced the best processor ever made for the consumer market.

Whether it's gaming, creating, or general productivity work, the Ryzen 9 9950X3D doesn't suffer from the hang-ups that kept its predecessor, the AMD Ryzen 9 7950X3D, from completely dominating its competition among the previous generation of processors.

Like its predecessor, the Ryzen 9 9950X3D will sell for $699 / £699 / AU$1,349 when it goes on sale on March 12, 2025. This makes it the most expensive consumer processor on the market, so be prepared to invest quite a bit in this chip, especially if you're upgrading from an Intel or AMD AM4 system; as an AM5 chip, it will require you to upgrade some major components, including the motherboard and possibly the RAM.

Unlike nearly all other X3D chips besides the 9800X3D and 9900X3D, however, the Ryzen 9 9950X3D is fully overclockable thanks to AMD rearchitecting the way the 3D V-cache sits on the compute die, so there's a lot more that this chip can do that other X3D chips can't.

That includes beating out the current champ for the best gaming CPU, the 9800X3D, in most games while also offering substantially better general and creative performance thanks to twice as many processing cores.

That doesn't mean that the AMD Ryzen 9 9950X3D is flawless, as there are some things to caveat here (which I'll get into in more depth below), but as an overall package, you simply won't find a better CPU on the market right now that will let you do just about anything you want exceptionally well while still letting you run a more reasonable cooling solution. Just be prepared to pay a premium for all that performance.

AMD Ryzen 9 9950X3D: Price & availability

An AMD Ryzen 9 9950X3D leaning against its retail packaging

(Image credit: Future / John Loeffler)
  • How much will it cost? US MSRP is $699 / £699 / AU$1,349
  • When is it available? It goes on sale on March 12, 2025
  • Where is it available? It will be available in the US, UK, and Australia at launch

The Ryzen 9 9950X3D goes on sale March 12, 2025, for $699 / £699 / AU$1,349 in the US, UK, and Australia respectively, making it the most expensive consumer processor on the market.

It comes in at the same price as its predecessor, the Ryzen 9 7950X3D, when that chip launched, and costs $100 more than the Ryzen 9 9900X3D, which launches on the same day.

This is also just over $200 more expensive than the Ryzen 7 9800X3D, which delivers nearly the same level of gaming performance (and in some cases surpasses the 9950X3D), so if you are strictly looking for a gaming CPU, the 9800X3D might be the better value.

Compared to Intel's latest flagship processor, meanwhile, the Ryzen 9 9950X3D is just over $100 more expensive than the Intel Core Ultra 9 285K, though that chip requires a whole new motherboard chipset if you're coming from an Intel LGA 1700 chip like the Intel Core i9-12900K, so it might represent a much larger investment overall.

  • Value: 3.5 / 5

AMD Ryzen 9 9950X3D: Specs

  • 128MB L3 Cache (96MB + 32MB)
  • Fully overclockable
  • Not all processing cores have access to 3D V-cache

Compared to the Ryzen 9 7950X3D, there don't seem to be too many changes spec-wise, but there's a lot going on under the hood here.

First, the way the 3D V-cache is seated on the CCX differs considerably from the 7950X3D: on the 9950X3D it sits underneath the processing die, rather than above it.

This means that the processing cores are now in 'direct' contact with the lid and cooling solution for the chip, allowing the 9950X3D to be fully overclocked, whereas the V-cache in the 7950X3D sat between the lid and the processing cores, making careful thermal design and clock limits necessary and ruling out overclocking.

The 9950X3D does keep the same two-module split in its L3 cache as the 7950X3D, so that only one of the eight-core CCXs in the chip actually has access to the added V-cache (32MB + 64MB), while the other just has access to 32MB.

The idea is to give cache-hungry workloads more dedicated, direct access to the extra cache on one set of cores. In the last generation, this honestly produced somewhat mixed results compared to the 7800X3D, which didn't split the V-cache up this way and ultimately delivered higher levels of gaming performance.

Whatever issue there was with the 7950X3D looks to have been largely fixed with the 9950X3D, but some hiccups remain, which I'll get to in the performance section.

Beyond that, the 9950X3D has slightly higher base and boost clock speeds, as well as a 50W higher TDP, but its 170W TDP isn't completely unmanageable, especially next to Intel's competing chips.

  • Specs: 4.5 / 5

AMD Ryzen 9 9950X3D: Performance

An AMD Ryzen 9 9950X3D in a motherboard

(Image credit: Future / John Loeffler)
  • Almost best-in-class gaming performance
  • Strong overall performance

While the Ryzen 7 7800X3D was indisputably a better gaming chip than the Ryzen 9 7950X3D by the numbers, I was very curious going into my testing how this chip would fare against the 9800X3D. I'm happy to report that not only is it better on the whole when it comes to gaming, it's a powerhouse for general computing and creative work as well, making it the best all-around processor on the market right now.

On the synthetic side, the Ryzen 9 9950X3D goes toe-to-toe with the Intel Core Ultra 9 285K in multi-core performance, coming within 2% of Intel's best on average, and chalking up a 10% stronger single-core result than the 285K.

Compared to its predecessor, the 7950X3D, the 9950X3D is about 15% faster in multi-core and single-core performance, while also barely edging out the Ryzen 9 9950X in multi-core performance.

Compared to the Ryzen 7 9800X3D, the eight-core difference between the two really shows up in the results, with the 9950X3D posting 61% better multi-core performance and a roughly 5% better single-core score.

On the creative front, the 9950X3D outclasses Intel's best and anything else in the AMD Ryzen lineup that I've tested overall (we'll see how it fares against the 9900X3D once I've had a chance to test that chip), though it is worth noting that the Intel Core Ultra 9 285K is still the better processor for video editing work.

The AMD Ryzen X3D line is all about gaming though, and here, the Ryzen 9 9950X3D posts the best gaming performance of all the chips tested, with one caveat.

In the Total War: Warhammer III Mirrors of Madness benchmark, the Ryzen 9 9950X3D only scores a few fps higher than the non-X3D Ryzen 9 9950X (331 fps to 318 fps, respectively), while also scoring substantially lower than the 9800X3D's 506 fps in that same benchmark. That's a roughly 35% slower showing for the 9950X3D, and given that it's roughly where the non-X3D chip scored, it's clear that Total War: Warhammer III was running on cores that didn't have access to the extra V-cache.

This is an issue with the Windows process scheduler that might be fixed in time so that games are run on the right cores to leverage the extra cache available, but that's not a guarantee the way it is with the 9800X3D, which gives all cores access to its added V-cache so there aren't similar issues.
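Until the scheduler behavior is sorted out, one manual workaround enthusiasts sometimes use is to pin a misbehaving game to the cache-equipped cores themselves. Here's a minimal sketch using Python's psutil; the core numbering is an assumption (which logical CPUs map to the V-cache CCD varies by system and BIOS, so verify it with a hardware monitoring tool first), and the process name is hypothetical:

# Sketch: restrict a running game to the 3D V-cache CCD so it can use the extra cache.
import psutil

VCACHE_CPUS = list(range(16))  # assumed mapping: logical CPUs 0-15 = V-cache CCD

def pin_to_vcache(process_name):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(VCACHE_CPUS)  # only schedule this process on these CPUs
            print(f"Pinned PID {proc.pid} to CPUs {VCACHE_CPUS}")

pin_to_vcache("Warhammer3.exe")  # hypothetical executable name, for illustration only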

It might be a fairly rare occurrence, but if your favorite game doesn't take advantage of the extra cache that you're paying a lot of money for, that could be an issue, and it might not be something you'll ever know unless you have a non-X3D 9950X handy to test against the way I do.

With that in mind, if all you want is a gaming processor, and you really don't care about any of these other performance categories, you're probably going to be better served by the 9800X3D, as you will get guaranteed gaming performance increases, even if you don't get the same boost in other areas.

While that's a large caveat, it can't take away from the overall performance profile of this chip, which is just astounding pretty much across the board.

If you want the best processor on the market overall, this is it, even with its occasional blips, especially since it runs much cooler than Intel's chips and its power draw is much more acceptable for midrange PCs to manage.

  • Performance: 4.5 / 5

Should you buy the AMD Ryzen 9 9950X3D?

A masculine hand holding an AMD Ryzen 9 9950X3D processor

(Image credit: Future / John Loeffler)

Buy the AMD Ryzen 9 9950X3D if...

You want spectacular performance no matter the workload
While gamers will be especially interested in this chip, its real strength is that it's strong everywhere.

You want the best gaming performance
When games land on its 3D V-cache cores, this processor's gaming chops are unbeatable.

Don't buy it if...

You want consistent top-tier gaming performance
When games run on one of this chip's 3D V-cache cores, you're going to get the best performance possible, but Windows might not assign a game to those cores, so you might miss out on this chip's signature feature.

You're on a budget
This chip is crazy expensive, so only buy it if you're flush with cash.

Also consider

AMD Ryzen 7 9800X3D
If you want consistent, top-tier gaming performance, the 9800X3D will get you performance nearly as good as this chip's, though more consistently.

Read the full AMD Ryzen 7 9800X3D review

How I tested the AMD Ryzen 9 9950X3D

  • I spent several days with the AMD Ryzen 9 9950X3D
  • I used the chip as my main workstation processor and used my updated battery of benchmarks to measure its performance
  • I used it for general productivity, creative, and gaming workloads

I spent about a week with the Ryzen 9 9950X3D as my main workstation CPU, where I ran basic computing workloads as well as extensive creative work in apps such as Adobe Photoshop.

I also spent as much time as I could gaming with the chip, including titles like Black Myth: Wukong and Civilization VII. I also used my updated suite of benchmark tools including industry standard utilities like Geekbench 6.2, Cyberpunk 2077, and PugetBench for Creators.

I've been reviewing components for TechRadar for three years now, including more than a dozen processor reviews in that time, so you can trust my testing process and recommendations if you're looking for the best processor for your needs and budget.

  • First reviewed March 2025
The AMD RX 9070 XT delivers exactly what the market needs with stunning performance at an unbeatable price
5:00 pm | March 5, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

AMD Radeon RX 9070 XT: Two-minute review

AMD had one job to do with the launch of its RDNA 4 graphics cards, spearheaded by the AMD Radeon RX 9070 XT, and that was to not get run over by Blackwell too badly this generation.

With the RX 9070 XT, not only did AMD manage to hold its own against the GeForce RTX monolith, it has perfectly positioned Team Red to take advantage of the growing discontent among gamers over Nvidia's latest GPUs, with one of the best graphics cards I've ever tested.

The RX 9070 XT is without question the most powerful consumer graphics card AMD's put out, beating the AMD Radeon RX 7900 XTX overall and coming within inches of the Nvidia GeForce RTX 4080 in 4K and 1440p gaming performance.

It does so with an MSRP of just $599 (about £510 / AU$870), which is substantially lower than those two cards' MSRPs, much less their asking prices online right now. This matters because AMD traditionally hasn't faced the kind of scalping and price inflation that Nvidia's GPUs experience (it does happen, obviously, but not nearly to the same extent as with Nvidia's RTX cards).

That means, ultimately, that gamers who look at the GPU market and find empty shelves, extremely distorted prices, and uninspiring performance for the price they're being asked to pay have an alternative that will likely stay within reach, even if price inflation keeps it above AMD's MSRP.

The RX 9070 XT's performance comes at a bit of a cost though, such as the 309W maximum power draw I saw during my testing, but at this tier of performance, this actually isn't that bad.

This card also isn't too great when it comes to non-raster creative performance and AI compute, but no one is looking to buy this card for its creative or AI chops, as Nvidia already has those categories on lock. No, this is a card for gamers, and for that, you just won't find a better one at this price. Even if the price does get hit with inflation, it'll still likely be way lower than what you'd have to pay for an RX 7900 XTX or RTX 4080 (assuming you can find them at this point), making the AMD Radeon RX 9070 XT a gaming GPU that everyone can appreciate and maybe even buy.

AMD Radeon RX 9070 XT: Price & availability

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $599 (about £510 / AU$870)
  • When can you get it? The RX 9070 XT goes on sale March 6, 2025
  • Where is it available? The RX 9070 XT will be available in the US, UK, and Australia at launch

The AMD Radeon RX 9070 XT is available as of March 6, 2025, starting at $599 (about £510 / AU$870) for reference-spec third-party cards from manufacturers like Asus, Sapphire, Gigabyte, and others, with OC versions and those with added accoutrements like fancy cooling and RGB lighting likely selling for higher than MSRP.

At this price, the RX 9070 XT comes in about $150 cheaper than the RTX 5070 Ti, and about $50 more expensive than the RTX 5070 and the AMD Radeon RX 9070, which also launches alongside the RX 9070 XT. This price also puts the RX 9070 XT on par with the MSRP of the RTX 4070 Super, though this card is getting harder to find nowadays.

While I'll dig into performance in a bit, given the MSRP (and the reasonable hope that this card will be findable at MSRP in some capacity), the RX 9070 XT's value proposition is second only to the RTX 5070 Ti's if you're going by MSRPs alone. Since price inflation on the RTX 5070 Ti will persist for some time at least, in many cases you'll likely find the RX 9070 XT offers the best performance for the price paid of any enthusiast card on the market right now.

  • Value: 5 / 5

AMD Radeon RX 9070 XT: Specs

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • PCIe 5.0, but still just GDDR6
  • Hefty power draw

The AMD Radeon RX 9070 XT is the first RDNA 4 card to hit the market, so it's worth digging into its architecture for a bit.

The new architecture is built on TSMC's N4P node, the same as Nvidia Blackwell, and in a move away from AMD's MCM push with the last generation, the RDNA 4 GPU is a monolithic die.

As there's no direct predecessor for this card (or for the RX 9070, for that matter), there's not much we can compare the RX 9070 XT against apples-to-apples, but if it had a last-gen equivalent, it would sit roughly between the RX 7800 XT and the RX 7900 GRE.

The Navi 48 GPU in the RX 9070 XT sports 64 compute units, breaking down into 64 ray accelerators, 128 AI accelerators, and 64MB of L3 cache. Its cores are clocked at 1,600MHz to start, but can run as fast as 2,970MHz, just shy of the 3GHz mark.

It uses the same GDDR6 memory as the last-gen AMD cards, with a 256-bit bus and a 644.6GB/s memory bandwidth, which is definitely helpful in pushing out 4K frames quickly.
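That bandwidth figure follows directly from the bus width and the memory's effective data rate. As a rough cross-check (the ~20Gbps GDDR6 speed per pin is an assumption on my part, not a figure quoted here):

# Back-of-envelope memory bandwidth: (bus width in bytes) x (effective data rate per pin).
bus_width_bits = 256
data_rate_gbps = 20.0  # assumed effective GDDR6 speed per pin

bandwidth_gbs = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gbs:.1f} GB/s")  # -> 640.0 GB/s, in the same ballpark as the quoted figure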

The TGP of the RX 9070 XT is 304W, which is a good bit higher than the RX 7900 GRE, though for that extra power, you do get a commensurate bump up in performance.

  • Specs: 4 / 5

AMD Radeon RX 9070 XT: Design

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • No AMD reference card
  • High TGP means bigger coolers and more cables

There's no AMD reference card for the Radeon RX 9070 XT, but the unit I got to test was the Sapphire Pulse Radeon RX 9070 XT, which I imagine is pretty indicative of what we can expect from the designs of the various third-party cards.

The 304W TGP all but ensures that any version of this card you find will be a triple-fan cooler over a pretty hefty heatsink, so it's not going to be a great option for small form factor cases.

Likewise, that TGP just puts it over the line where it needs a third 8-pin PCIe power connector, something that you may or may not have available in your rig, so keep that in mind. Even if you do have three spare power connectors, cable management will almost certainly be a hassle.

After that, it's really just about aesthetics, as the RX 9070 XT (so far) doesn't have anything like the dual pass-through cooling solution of the RTX 5090 and RTX 5080, so it's really up to personal taste.

As for the card I reviewed, the Sapphire Pulse shroud and cooling setup on the RX 9070 XT was pretty plain, as far as desktop GPUs go, but if you're looking for a non-flashy look for your PC, it's a great-looking card.

  • Design: 4 / 5

AMD Radeon RX 9070 XT: Performance

An AMD Radeon RX 9070 XT in a test bench

(Image credit: Future / John Loeffler)
  • Near-RTX 4080 levels of gaming performance, even with ray tracing
  • Non-raster creative and AI performance lags behind Nvidia, as expected
  • Likely the best value you're going to find anywhere near this price point
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

Simply put, the AMD Radeon RX 9070 XT is the gaming graphics card that we've been clamoring for this entire generation. While it shows some strong performance in synthetics and raster-heavy creative tasks, gaming is where this card really shines, coming within 7% of the RTX 4080 overall and within 4% of its gaming performance specifically. For a card launching at half the RTX 4080's launch price, this is a fantastic showing.

The RX 9070 XT is really squaring up against the RTX 5070 Ti, however, and here the RTX 5070 Ti does manage to pull well ahead, but it's much closer than I thought it would be going in.

On the synthetics side, the RX 9070 XT excels at rasterization workloads like 3DMark Steel Nomad, while the RTX 5070 Ti wins out in ray-traced workloads like 3DMark Speed Way, as expected, but AMD's 3rd generation ray accelerators have definitely come a long way in catching up with Nvidia's more sophisticated hardware.

Also, as expected, when it comes to creative workloads, the RX 9070 XT performs very well in raster-based tasks like photo editing, and worse at 3D modeling in Blender, which is heavily reliant on Nvidia's CUDA instruction set, giving Nvidia an all but permanent advantage there.

In video editing, the RX 9070 XT likewise lags behind, though it's still close enough to Nvidia's RTX 5070 Ti that video editors won't notice much difference, even if the difference is there on paper.

Gaming performance is what we're on about though, and here the sub-$600 GPU holds its own against heavy hitters like the RTX 4080, RTX 5070 Ti, and Radeon RX 7900 XTX.

In 1440p gaming, the RX 9070 XT is about 8.4% faster than the RTX 4070 Ti and RX 7900 XTX, just under 4% slower than the RTX 4080, and about 7% slower than the RTX 5070 Ti.
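If it helps to see those relationships on a single scale, here's a quick sketch that converts the stated 1440p percentages into a normalized index with the RX 9070 XT at 100 (the percentages come straight from the comparisons above):

# Normalize the 1440p comparisons above to an index where RX 9070 XT = 100.
# "X% faster than card" implies card = 100 / (1 + X/100); "X% slower" implies card = 100 / (1 - X/100).
rx_9070_xt = 100.0
rtx_4070_ti = rx_9070_xt / 1.084   # 9070 XT is ~8.4% faster -> ~92
rx_7900_xtx = rx_9070_xt / 1.084   # same relationship -> ~92
rtx_4080 = rx_9070_xt / 0.96       # 9070 XT is ~4% slower -> ~104
rtx_5070_ti = rx_9070_xt / 0.93    # 9070 XT is ~7% slower -> ~108

for name, score in [("RTX 4070 Ti", rtx_4070_ti), ("RX 7900 XTX", rx_7900_xtx),
                    ("RTX 4080", rtx_4080), ("RTX 5070 Ti", rtx_5070_ti)]:
    print(f"{name}: {score:.0f}")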

This strong performance carries over into 4K gaming as well, thanks to the RX 9070 XT's 16GB VRAM. Here, it's about 15.5% faster than the RTX 4070 Ti and about 2.5% faster than the RX 7900 XTX. Against the RTX 4080, the RX 9070 XT is just 3.5% slower, while it comes within 8% of the RTX 5070 Ti's 4K gaming performance.

When all is said and done, the RX 9070 XT doesn't quite overpower one of the best Nvidia graphics cards of the last generation (and definitely doesn't topple the RTX 5070 Ti), but given its performance class, its power draw, its heat output (which wasn't nearly as bad as the power draw might indicate), and most of all its price, the RX 9070 XT is easily the best value of any graphics card playing at 4K.

And given Nvidia's position with gamers right now, AMD has a real chance to win over some converts with this graphics card, and anyone looking for an outstanding 4K GPU absolutely needs to consider it before making their next upgrade.

  • Performance: 5 / 5

Should you buy the AMD Radeon RX 9070 XT?

Buy the AMD Radeon RX 9070 XT if...

You want the best value proposition for a high-end graphics card
The performance of the RX 9070 XT punches way above its price point.

You don't want to pay inflated prices for an Nvidia GPU
Price inflation is wreaking havoc on the GPU market right now, but this card might fare better than Nvidia's RTX offerings.

Don't buy it if...

You're on a tight budget
If you don't have a lot of money to spend, this card is likely more than you need.

You need strong creative or AI performance
While AMD is getting better at creative and AI workloads, it still lags far behind Nvidia's competing offerings.

How I tested the AMD Radeon RX 9070 XT

  • I spent about a week with the AMD Radeon RX 9070 XT
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week with the AMD Radeon RX 9070 XT, which was spent benchmarking, using, and digging into the card's hardware to come to my assessment.

I used industry standard benchmark tools like 3DMark, Cyberpunk 2077, and PugetBench for Creators to get results comparable with other competing graphics cards, all of which have been tested using the same test bench setup listed above.

I've reviewed more than 30 graphics cards in the last three years, and so I've got the experience and insight to help you find the best graphics card for your needs and budget.

  • Originally reviewed March 2025
Huawei Mate 70 Pro gets a Premium Edition with CPU downgrade
1:01 pm | February 28, 2025

Author: admin | Category: Mobile phones news | Tags: , | Comments: Off

Huawei introduced the Mate 70 Pro back in November with a Kirin 9020 chipset, and today the company introduced a somewhat confusing alternative version. Dubbed the Premium Edition, it will go on sale on March 5. The name suggests an improved version, but in reality it has an underclocked CPU and a lower price tag on the company's Vmall store. The listing did not actually mention the chip downgrade – we caught that in a separate Geekbench listing. The Huawei Mate 70 Pro Premium Edition carries the model number PLR-AL50 and scores 1,450 in single-core and 3,793 in multi-core tests. These are...

I tested the iPhone 16e for a week and found it’s a good phone that stretches the definition of ‘budget’
5:00 am | February 27, 2025

Author: admin | Category: Computers Gadgets iPhone Phones | Tags: , , , | Comments: Off

Apple iPhone 16e: Two-Minute Review

The iPhone 16e is a good phone. It has a pleasing design, and it feels like a true member of the iPhone 16 family. It is not a great phone, though – how could it be with a retro notch in the Super Retina XDR display and just a single 48MP camera?

There are 'budget' phones that cost far less and which have larger screens and multiple rear cameras. They're not iOS handsets, and that counts for something – any new iPhone joins an expansive and well-designed ecosystem offering connective tissue between excellent Apple services and other Apple hardware. I mostly live in that world now, and I appreciate how well my iPhone 16 Pro Max works with, for instance, my Mac, and how all my cloud-connected services know it's me on the line.

It's been a while since I've had such conflicting feelings about an iPhone. I appreciate that Apple thought it was time to move away from the iPhone SE design language, one that owed most of its look and feel to 2017's iPhone 8. I'm sure Apple couldn't wait to do away with the Lightning port and the Home button with Touch ID (which lives on in Macs and some iPads). But instead of giving us something fresh, Apple took a bit of this and a bit of that to cobble together the iPhone 16e.

The display is almost the best Apple has to offer if you can ignore the notch, aren't bothered by larger bezels, and don't miss the Dynamic Island too much. The main 48MP Fusion camera is very good and shoots high-quality stills and videos, but don't be fooled by the claims of 2x zoom, which is actually a 12MP crop on the middle of the 48MP sensor. I worry that people paying $599 / £599 / AU$999 for this phone will be a little frustrated that they're not at least getting a dedicated ultra-wide camera at that price.

Conversely, there is one bit of this iPhone 16e that's not only new but is, for the moment, unique among iPhone 16 devices: the C1 chip. I don't know why Apple's cheapest iPhone got this brand-new bit of Apple silicon, but it does a good job of delivering 5G and even satellite connectivity. Plus, it starts moving Apple out from under the yoke of Qualcomm, Apple's cellular modem chip frenemy. That relationship has been fraught for years, and I wonder if Apple had originally hoped to put the C1 in all iPhone 16 models but the development schedule slipped.

Apple iPhone 16e REVIEW

The iPhone 16e (center) with the iPhone 16 (right) and iPhone SE 3 (left). (Image credit: Future / Lance Ulanoff)

In any case, while it's hard to measure the connectivity benefits (it's another good 5G modem), Apple says this is the most efficient cellular modem it's ever put in an iPhone (that seems like a swipe at Qualcomm), and helps to deliver stellar battery life: a claimed 26 hours of video streaming. Battery life in real-world use will, naturally, be a different story.

On balance, I like this phone's performance (courtesy of the A18 chip and 8GB of RAM), its looks, and how it feels in the hand (a matte glass back and Ceramic Shield front), and I think iOS 18 with Apple Intelligence is well-thought-out and increasingly intelligent (though Siri remains a bit of a disappointment); but if you're shopping for a sub-$600 phone, there may be other even better choices from the likes of Google (Pixel 8a), OnePlus (OnePlus 13R) and the anticipated Samsung Galaxy S25 FE. You just have to be willing to leave the Apple bubble.

Apple iPhone 16e: Price and availability

Apple unveiled the iPhone 16e on February 19, 2025. It joins the iPhone 16 lineup, and starts at $599 / £599 / AU$999 with 128GB of storage, making it the most affordable smartphone of the bunch. It's available in black or white.

While some might consider the iPhone 16e to be the successor to the iPhone SE 3, it has little in common with that device. In particular, that was a $429 phone. At $599, Apple might be stretching the definition of budget, but it is $200 cheaper than the base iPhone 16. The phone's price compares somewhat less favorably outside the iOS sphere. The OnePlus 13R for instance is a 6.7-inch handset with three cameras, and the Google Pixel 8a matches the iPhone 16e's 6.1-inch screen size (though at a lower resolution), but also includes two rear cameras.

You won't find more affordable new phones in the iOS space. The iPhone 15 has the main and ultra-wide camera and the Dynamic Island, but it costs $699 / £699 / AU$1,249. A refurbished iPhone 14 costs $529, but neither it nor the iPhone 15 supports Apple Intelligence.

  • Value score: 4/5

Apple iPhone 16e: Specs

Apple iPhone 16e: Design

  • No trace of the iPhone SE design remains
  • Hybrid iPhone 14/15 design
  • Sharper edges than the current iPhone 16 design
(Image gallery: 5 images. Image credit: Future / Lance Ulanoff)

There's no question that the iPhone 16e is a part of the iPhone 16 family. At a glance, especially when the screen is off, it's almost a dead ringer for the base model; the aerospace aluminum frame is only slightly smaller.

Upon closer examination, those similarities recede, and I can see the myriad differences that make this a true hybrid design. This is now the only iPhone with a single camera, which almost looks a little lonely on the matte glass back. The edges of the metal band that wraps around the body are noticeably sharper than those of any other iPhone 16, but the phone still feels good in the hand.

The button configuration is essentially what you'd find on an iPhone 15. There's the power / sleep / Siri button on the right, and on the left are the two volume buttons and the Action button. Unlike the rest of the iPhone 16 lineup, the 16e doesn't get the Camera Control, but at least the Action button is configurable, so you can set it to activate the camera or toggle the Flashlight, Silent Mode, Voice Memo, and more. I set mine to launch Visual Intelligence, an Apple Intelligence feature: you press and hold the Action button to open it, press again to grab a photo, and then you can select on-screen whether you want ChatGPT or Google Search to handle the query. Apple Intelligence can also analyze the image directly and identify the subject.

The phone is IP68 rated to handle water and dust, including a dunk in six meters of water for 30 minutes. The screen is covered with Ceramic Shield to better protect it from drops, though I'm not sure it does much to prevent scratches.

I put a case on the phone, never dropped it, and handled it gingerly, and yet within a day I noticed a long scratch on the screen, although I have no recollection of brushing the display against anything. I had a similar situation with the Samsung Galaxy S25 Ultra; I await the phone that can handle life in my pocket (empty other than the phone) without sustaining a scratch.

Overall, if you like the looks of the iPhone 16 lineup (or even the iPhone 14 and 15 lineups) the iPhone 16e will not disappoint.

  • Design score: 4 / 5

Apple iPhone 16e: Display

  • Almost Apple's best smartphone display
  • The notch is back
  • The bezels are a little bigger

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

If you're coming from the iPhone SE to the iPhone 16e, you're in for quite a shock. This 6.1-inch Super Retina XDR OLED screen is nothing like the 4.7-inch LCD display on that now-retired design.

The iPhone 16e features a lovely edge-to-edge design – with slightly larger bezels than you'll find on other iPhone 16 phones – that leaves no room for the dearly departed Touch ID Home button. Instead, this phone adopts Face ID biometric security, which is, as far as I'm concerned, probably the best smartphone face recognition in the business. Face ID lives in the TrueDepth camera system notch, which also accommodates, among other things, the 12MP front-facing camera, microphone, and proximity sensor.

While I never had a big problem with the notch, I can't say I'm thrilled to see it return here. The rest of the iPhone 16 lineup features the versatile Dynamic Island, which I think most would agree is preferable to this cutout.

(Image gallery: 3 images. The iPhone 16e (left) next to the iPhone SE 3 (middle), and the iPhone 16. Image credit: Future / Lance Ulanoff)

The iPhone 16e shares the iPhone 16's 460ppi pixel density, but it does lose a few pixels (2532 x 1170 versus 2556 x 1179 for the iPhone 16). It still supports True Tone, wide color (P3), and a 2,000,000:1 contrast ratio. The only area where it loses a bit of oomph is brightness: peak brightness for HDR content is 1,200 nits, and 800 nits for everything else, while the iPhone 16's peak outdoor brightness is 2,000 nits. As with other non-Pro models, the refresh rate on the iPhone 16e sits at a fixed 60Hz.
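Pixel density is just diagonal resolution over the panel's diagonal size, which is why the 16e can drop a few pixels and keep the same 460ppi figure. A quick worked example (the 6.06-inch and 6.12-inch diagonals are my assumptions behind Apple's rounded 6.1-inch marketing figure):

# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2532, 1170, 6.06)))  # iPhone 16e -> ~460
print(round(ppi(2556, 1179, 6.12)))  # iPhone 16  -> ~460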

Even so, I had no trouble viewing the iPhone 16e screen in a wide variety of lighting situations, and any shortcomings are only evident in the brightest, direct sunlight.

In day-to-day use, everything from photos and video to AAA games, apps, and websites looks great on this display. Colors are bright and punchy, and the blacks are inky. I'm not distracted by the notch on games, where it can cut a bit into the gameplay view, and most video streaming defaults to a letterbox format that steers clear of it, with black bars on the left and right sides of the screen.

  • Display score: 4 / 5

Apple iPhone 16e: Software and Apple Intelligence

  • iOS 18 is a rich and well-thought-out platform
  • Apple Intelligence has some impressive features, but we await the Siri of our dreams
  • Mail and photo redesigns leave something to be desired

iOS 18 is now smarter, more proactive, and more customizable than ever before. I can transform every app icon from 'Light' to 'Tinted' (monochromatic), fill my home screen with widgets, and expand them until they almost fill the screen. This customizability carries through to the Control Center, which is now a multi-page affair that I can leave alone, or completely reorganize so the tools I care about are available with a quick swipe down from the upper-right corner.

(Image gallery: 2 images. Image credit: Future)

Apple Intelligence, which Apple unveiled last June, is growing in prominence and utility. It lives across apps like Messages and Mail in Writing Tools, which is a bit buried, so I often forget it exists. It's in notification summaries that can be useful for at-a-glance action but which are sometimes a bit confusing, and in image-generation tools like Image Playground and Genmoji.

It's also in Visual Intelligence, which, as I have it set up, gives me one-button access to ChatGPT and Google Search.

(Image gallery: 2 images. Apple Intelligence Clean Up does an excellent job of removing those big lights. Image credit: Future / Lance Ulanoff)

I think I prefer the more utilitarian features of Apple Intelligence like Clean Up. It lets you quickly remove people and objects from photos as if they were never there in the first place.

I'm also a fan of Audio Mix, which is not a part of Apple Intelligence, but uses machine learning to clean up the messiest audio to make it usable in social media, podcasts, or just for sharing with friends.

iOS 18 also features updated Photos and Mail apps with Apple Intelligence. I've struggled a bit with how Photos reorganized my images, and I've had similar issues with how Mail is now reorganizing my emails. I hope Apple takes another run at these apps in iOS 19.

Siri is smarter and more aware of iPhone features than before. It can handle my vocal missteps and still knows what I want, but it remains mostly unaware of my on-device information, and as a chatbot it feels far less conversational and powerful than Google Gemini or ChatGPT.

  • Software score: 4.5 / 5

Apple iPhone 16e: Camera

  • 48MP Fusion is a good camera
  • The front-facing camera shines as well
  • A single rear camera at this price is disappointing

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

With a more powerful CPU, a bigger screen, and the new C1 chip, I can almost understand why Apple set the iPhone 16e price as high as it did. Almost… until I consider the single, rear 48MP Fusion camera. Most smartphones in this price range feature at least two lenses, and usually the second one is an ultra-wide – without that lens you miss out on not only dramatic ultra-wide shots but also macro photography capabilities. Had Apple priced this phone at $499, I might understand.

Image 1 of 4

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)
Image 2 of 4

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)
Image 3 of 4

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)
Image 4 of 4

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

Still, I like this camera. It defaults to shooting at 24MP, binned down from the 48MP sensor (roughly two sensor pixels contribute to each image pixel, boosting the per-pixel image information). There's a 2x zoom option, which is useful, but it shoots at 12MP because it uses only the central 12 megapixels of the 48MP frame. Those images are still good, just not at the same resolution as the default setting or a full-sensor 48MP shot.
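
To make that sensor-crop arithmetic concrete, here's a rough sketch; the 8064 x 6048 sensor dimensions are my own assumption for a nominal 48MP sensor, not a published Apple spec.

```python
# Hypothetical 48MP sensor grid (8064 x 6048, ~48.8 million pixels); illustration only.
full_w, full_h = 8064, 6048

# A 2x 'zoom' crops the central half of the frame in each dimension,
# halving the field of view and keeping a quarter of the pixels.
crop_w, crop_h = full_w // 2, full_h // 2

print(round(full_w * full_h / 1e6, 1))  # ~48.8 MP full frame
print(round(crop_w * crop_h / 1e6, 1))  # ~12.2 MP central crop used for 2x shots
```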

Overall, the camera shoots lovely photos with exquisite detail and the kind of color fidelity I appreciate (in people and skies especially) in a wide variety of scenarios. I captured excellent still lifes, portraits, and night-mode shots. I was also impressed with the front camera, which is especially good for portrait-mode selfies. Much of this image quality is thanks to the work Apple has done on its Photonic Engine; Apple's computational image pipeline pulls out extraordinary detail and nuance in most photographic situations, even with just these two cameras to work with.

iPhone 16e camera samples

Image 1 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 2x (Image credit: Future / Lance Ulanoff)
Image 2 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 1x (Image credit: Future / Lance Ulanoff)
Image 3 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera (Image credit: Future / Lance Ulanoff)
Image 4 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera portrait mode (Image credit: Future / Lance Ulanoff)
Image 5 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 1x (Image credit: Future / Lance Ulanoff)
Image 6 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 2x (Image credit: Future / Lance Ulanoff)
Image 7 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 2x (Image credit: Future / Lance Ulanoff)
Image 8 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 2x, night mode (Image credit: Future / Lance Ulanoff)
Image 9 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 2x, night mode (Image credit: Future / Lance Ulanoff)
Image 10 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 1x (Image credit: Future / Lance Ulanoff)
Image 11 of 15

Apple iPhone 16e REVIEW camera samples

Rear camera, 2x (Image credit: Future / Lance Ulanoff)
Image 12 of 15

Apple iPhone 16e REVIEW

Rear camera, 1x (Image credit: Future / Lance Ulanoff)
Image 13 of 15

Apple iPhone 16e REVIEW

Rear camera, 2x (Image credit: Future / Lance Ulanoff)
Image 14 of 15

Apple iPhone 16e REVIEW

Rear camera, 1x (Image credit: Future / Lance Ulanoff)
Image 15 of 15

Apple iPhone 16e REVIEW

Rear camera, 2x (Image credit: Future / Lance Ulanoff)
  • Camera score: 4 / 5

Apple iPhone 16e: Performance

  • The A18 is an excellent and powerful CPU
  • It's ready for Apple Intelligence
  • C1, Apple's first cellular modem, is effective for 5G and satellite connectivity

If you're wondering why the successor to the iPhone SE is not a $429 smartphone, you might look at the processing combo of the powerful A18 and the new C1.

The A18 is the same chip you'll find in the iPhone 16, except that it has one fewer GPU core. I promise you'll never notice the difference.

Performance scores are excellent, and in line with the numbers we got for other A18 chips (and slightly lower than what you get from the A18 Pro in the iPhone 16 Pro and 16 Pro Max).

The A18 has more than enough power not just for day-to-day tasks like email and web browsing, but for 4K video editing (which I did in CapCut) and AAA gaming (game mode turns on automatically to divert more resources toward gaming). I played Asphalt 9 United, Resident Evil 4, and Call of Duty Mobile, and made things easier for myself by connecting my Xbox controller. My only criticism would be that a 6.1-inch screen is a little tight for these games. The audio from the stereo speakers, by the way, is excellent – I get an impressive spatial audio experience with Resident Evil 4.

Image 1 of 3

Apple iPhone 16e review

(Image credit: Future / Lance Ulanoff)
Image 2 of 3

Apple iPhone 16e review

(Image credit: Future / Lance Ulanoff)
Image 3 of 3

Apple iPhone 16e review

(Image credit: Future / Lance Ulanoff)

There's also the new C1 chip, which is notable because it's Apple's first custom cellular modem. Previously, Apple relied on, among other partners, Qualcomm for this silicon. I didn't notice any difference in connectivity with the new chip, which is a good thing – and I was impressed that I could use text via satellite.

Apple iPhone 16e REVIEW

(Image credit: Future)

I didn't think I'd get to test this feature, but AT&T connectivity is so bad in my New York neighborhood that the SOS icon appeared at the top of my iPhone 16e screen, and next to it I noticed the satellite icon. I opened Messages, and the phone asked if I wanted to use the satellite texting feature. I held the phone near my screen door to get a clear view of the sky, and followed the on-screen guide that told me which way to point the phone. I got a 'Connected' notification, and then sent a few SMS texts over satellite. It's a nifty feature, and it was a nice little test of the C1's capabilities.

  • Performance score: 5 / 5

Apple iPhone 16e: Battery

  • Long lasting
  • Wireless charging
  • No MagSafe

Apple iPhone 16e REVIEW

(Image credit: Future / Lance Ulanoff)

It's clear that Apple has prioritized battery life on the iPhone 16e over some other features. That would likely explain, for instance, why we have wireless charging but not MagSafe support – adding that magnetic ring might have eaten into battery space. The C1 chip is apparently smaller than the modem chip in other iPhone 16 models, and even the decision to include one camera instead of two probably helped make room for what is a larger battery than even the one in the iPhone 16.

Apple rates the iPhone 16e for 26 hours of video-rundown battery life – that's about four hours more than the iPhone 16. In my real-world testing the battery life has been very good, but varied use can run the battery down in far fewer than 26 hours.

On one day when I did everything from email and web browsing to social media consumption and then a lot of gaming, battery life was about 12 hours – gaming in particular really chewed through the battery and made the phone pretty warm.

My own video rundown test (I played through episodes of Better Call Saul on Netflix) returned about 24 hours of battery life.

I used a 65W USB-C charger to charge the phone to 57% in 30 minutes, with a full charge taking about one hour and 50 minutes. I also tried a 20W charger, which charged the phone to 50% in 30 minutes.

  • Battery score: 5 / 5

Should you buy the Apple iPhone 16e?

iPhone 16e score card

Buy it if...

You want an affordable, smaller iPhone

This is now your only brand-new 'budget' iPhone choice.

You want sub-$600 access to Apple Intelligence

Apple squeezed an A18 chip inside this affordable iPhone to give you access to Apple's own brand of AI.

Don’t buy it if...

You're a photographer

A single, albeit excellent, rear lens won't be enough for people who like to shoot wide-angle and macros.

You never liked the notch

Apple bringing back a none-too-loved display feature doesn't make a lot of sense. If you want the Dynamic Island at a more affordable price than the iPhone 16, take a look at the iPhone 15.

You want a real zoom lens

The 2x zoom on the iPhone 16e is not a true optical zoom; instead, it's a crop from the center of the 48MP sensor's full frame. If a big optical zoom is your thing, look elsewhere.

Apple iPhone 16e: Also consider

iPhone 15

For $100 more you get two rear cameras and the Dynamic Island.

Read TechRadar's iPhone 15 review.

Google Pixel 8a

As soon as you step outside the Apple ecosystem you'll find more affordable phones with more features. The Pixel 8a is not as powerful as the iPhone 16e, but it has a nice build, two cameras, excellent Google services integration, and affordable access to Gemini AI features.

Read TechRadar's Google Pixel 8a review.

Apple iPhone 16e: How I tested

I've reviewed countless smartphones ranging from the most affordable models to flagships and foldables. I put every phone through as many rigorous tests and everyday tasks as possible.

I had the iPhone 16e for just under a week, and after receiving it I immediately started taking photos, running benchmarks, and using it as an everyday device for photos, videos, email, social media, messaging, streaming video, and gaming.

Correction 2-27-2025: A previous version of this review listed Audio Mix as part of Apple Intelligence.

First reviewed February 26, 2025

Nvidia GeForce RTX 5090: the supercar of graphics cards
5:00 pm | January 23, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Nvidia GeForce RTX 5090: Two-minute review

The Nvidia GeForce RTX 5090 is a difficult GPU to approach as a professional reviewer because it is the rare consumer product that is so powerful, and so good at what it does, you have to really examine if it is actually a useful product for people to buy.

Right out the gate, let me just lay it out for you: depending on the workload, this GPU can get you up to 50% better performance versus the GeForce RTX 4090, and that's not even factoring in multi-frame generation when it comes to gaming, though on average the performance is still a respectable improvement of roughly 21% overall.

Simply put, whatever it is you're looking to use it for, whether gaming, creative work, or AI research and development, this is the best graphics card for the job if all you care about is pure performance.

Things get a bit more complicated if you want to bring energy efficiency into the equation. But if we're being honest, if you're considering buying the Nvidia RTX 5090, you don't care about energy efficiency. This simply isn't that kind of card, and so as much as I want to make energy efficiency an issue in this review, I really can't. It's not intended to be efficient, and those who want this card do not care about how much energy this thing is pulling down—in fact, for many, the enormous TDP on this card is part of its appeal.

Likewise, I can't really argue too much with the card's price, which comes in at $1,999 / £1,939 / AU$4,039 for the Founders Edition, and which will likely be much higher for AIB partner cards (and that's before the inevitable scalping begins). I could rage, rage against the inflation of the price of premium GPUs all I want, but honestly, Nvidia wouldn't charge this much for this card if there wasn't a line out the door and around the block full of enthusiasts who are more than willing to pay that kind of money for this thing on day one.

Do they get their money's worth? For the most part, yes, especially if they're not a gamer but a creative professional or AI researcher. If you're in the latter camp, you're going to be very excited about this card.

If you're a gamer, you'll still get impressive gen-on-gen performance improvements over the celebrated RTX 4090, and the Nvidia RTX 5090 is really the first consumer graphics card I've tested that can get you consistent, high-framerate 8K gameplay even before factoring in Multi-Frame Generation. That marks the RTX 5090 as something of an inflection point of things to come, much like the Nvidia RTX 2080 did back in 2018 with its first-of-its-kind hardware ray tracing.

Is it worth it though?

That, ultimately, is up to the enthusiast buyer who is looking to invest in this card. At this point, you probably already know whether or not you want it, and many will likely be reading this review to validate those decisions that have already been made.

In that, rest easy. Even without the bells and whistles of DLSS 4, this card is a hearty upgrade to the RTX 4090, and considering that the actual price of the RTX 4090 has hovered around $2,000 for the better part of two years despite its $1,599 MSRP, if the RTX 5090 sticks close to its launch price, it's well worth the investment. If it gets scalped to hell and sells for much more above that, you'll need to consider your purchase much more carefully to make sure you're getting the most for your money. Make sure to check out our where to buy an RTX 5090 guide to help you find stock when it goes on sale.

Nvidia GeForce RTX 5090: Price & availability

  • How much is it? MSRP is $1,999 / £1,939 / AU$4,039
  • When can you get it? The RTX 5090 goes on sale January 30, 2025
  • Where is it available? The RTX 5090 will be available in the US, UK, and Australia at launch
Where to buy the RTX 5090

Looking to pick up the RTX 5090? Check out our Where to buy RTX 5090 live blog for updates to find stock in the US and UK

The Nvidia GeForce RTX 5090 goes on sale on January 30, 2025, starting at $1,999 / £1,939 / AU$4,039 for the Nvidia Founders Edition and select AIB partner cards. Overclocked (OC) and other similarly tweaked cards and designs will obviously run higher.

It's worth noting that the RTX 5090 is 25% more expensive than the $1,599 launch price of the RTX 4090, but in reality, we can expect the RTX 5090 to sell for much higher than its MSRP in the months ahead, so we're really looking at an asking price closer to the $2,499.99 MSRP of the Turing-era Nvidia Titan RTX (if you're lucky).

Of course, if you're in the market for the Nvidia RTX 5090, you're probably not squabbling too much about the price of the card. You're already expecting to pay the premium, especially the first adopter premium, that comes with this release.

That said, this is still a ridiculously expensive graphics card for anyone other than an AI startup with VC backing, so it's worth asking yourself before you confirm that purchase if this card is truly the right card for your system and setup.

  • Value: 3 / 5

Nvidia GeForce RTX 5090: Specs & features

The Nvidia GeForce RTX 5090's power connection port

(Image credit: Future / John Loeffler)
  • First GPU with GDDR7 VRAM and PCIe 5.0
  • Slightly slower clocks
  • Obscene 575W TDP

There are a lot of new architectural changes in the Nvidia RTX 50 series GPUs that are worth diving into, especially the move to a transformer AI model for its upscaling, but let's start with the new specs for the RTX 5090.

First and foremost, the flagship Blackwell GPU is the first consumer graphics card to feature next-gen GDDR7 video memory, and it is substantially faster than GDDR6 and GDDR6X (a roughly 33% increase in Gbps over the RTX 4090). Add in the much wider 512-bit memory interface and you have a total memory bandwidth of 1,790GB/s.
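
That bandwidth number is simply the bus width multiplied by the per-pin data rate. The sketch below assumes 28Gbps GDDR7 modules, the rate that lines up with the quoted 1,790GB/s, and uses the RTX 4090's 384-bit, 21Gbps GDDR6X for comparison; treat the inputs as illustrative rather than confirmed specs.

```python
def memory_bandwidth_gb_per_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8 bits per byte."""
    return bus_width_bits * pin_rate_gbps / 8

# Assumed module speeds: 28Gbps GDDR7 (RTX 5090), 21Gbps GDDR6X (RTX 4090).
print(memory_bandwidth_gb_per_s(512, 28))  # 1792.0 GB/s, in line with the quoted 1,790GB/s
print(memory_bandwidth_gb_per_s(384, 21))  # 1008.0 GB/s for the RTX 4090
print(round((28 - 21) / 21 * 100))         # ~33% faster per pin, as noted above
```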

This, even more than the increased VRAM pool of 32GB versus the RTX 4090's 24GB, makes this GPU the first really capable 8K graphics card on the market. 8K textures have an enormous footprint in memory, so moving them through the rendering pipeline to generate playable framerates isn't really possible with anything less than what this card offers.

Yes, you can, maybe, get playable 8K gaming with some RTX 40 or AMD Radeon RX 7000 series cards if you use aggressive upscaling, but you won't really be getting 8K visuals that'll be worth the effort. In reality, the RTX 5090 is what you want if you want to play 8K, but good luck finding an 8K monitor at this point. Those are still years away from really going mainstream (though there are a growing number of 8K TVs).

If you're settling in at 4K though, you're in for a treat, since all that bandwidth means faster 4K texture processing, so you can get very fast native 4K gaming with this card without having to fall back on upscaling tech to get you to 60fps or higher.

The GeForce RTX logo on the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

The clock speeds on the RTX 5090 are slightly lower than its predecessor's, which is just as well, because the other major top-line specs are its gargantuan 575W TDP and its PCIe 5.0 x16 interface. According to Nvidia, that thermal challenge required a major reengineering of the PCB inside the card, which I'll get to in a bit.

The PCIe 5.0 x16 interface, meanwhile, is the first of its kind in a consumer GPU, though you can expect AMD and Intel to quickly follow suit. This matters because a number of newer motherboards have PCIe 5.0 lanes ready to go, but most people have been using them for PCIe 5.0 M.2 SSDs.

If your motherboard has 20 PCIe 5.0 lanes, the RTX 5090 will take up 16 of them, leaving just four for your SSDs. If you have one PCIe 5.0 x4 SSD, you should be fine, but I've seen motherboard configurations with two or three PCIe 5.0 x4 M.2 slots, so if you've loaded those up with PCIe 5.0 SSDs, you're likely to see them drop down to the slower PCIe 4.0 speeds. I don't think it'll be that big of a deal, but it's worth considering if you've invested a lot in your SSD storage.
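
As a quick illustration of that lane budgeting, here's a minimal sketch; the 20-lane total is just the hypothetical configuration described above and varies by CPU and motherboard.

```python
# Hypothetical PCIe 5.0 lane budget; actual counts vary by platform.
TOTAL_GEN5_LANES = 20
GPU_LANES = 16        # the RTX 5090 occupies a full x16 slot
LANES_PER_SSD = 4     # each PCIe 5.0 M.2 drive wants an x4 link

remaining = TOTAL_GEN5_LANES - GPU_LANES
for ssd_count in (1, 2, 3):
    fits = ssd_count * LANES_PER_SSD <= remaining
    print(f"{ssd_count} Gen5 SSD(s): {'full speed' if fits else 'some fall back to PCIe 4.0'}")
```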

As for the other specs, they're more or less similar to what you'd find in the RTX 4090, just more of it. The new Blackwell GB202 GPU in the RTX 5090 is built on a TSMC 4nm process, compared to the RTX 4090's TSMC 5nm AD102 GPU. The SM design is the same, so 128 CUDA cores, one ray tracing core, and four tensor cores per SM. At 170 SMs, you've got 21,760 CUDA cores, 170 RT cores, and 680 Tensor cores for the RTX 5090, compared to the RTX 4090's 128 SMs (so 16,384 CUDA, 128 RT, and 512 Tensor cores).
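
Those core counts are just the per-SM layout multiplied out; here's a minimal sketch using the figures quoted above.

```python
def cores_from_sms(sm_count: int) -> dict:
    """Per the SM layout described above: 128 CUDA, 1 RT, and 4 Tensor cores per SM."""
    return {"cuda": sm_count * 128, "rt": sm_count * 1, "tensor": sm_count * 4}

print(cores_from_sms(170))  # RTX 5090: {'cuda': 21760, 'rt': 170, 'tensor': 680}
print(cores_from_sms(128))  # RTX 4090: {'cuda': 16384, 'rt': 128, 'tensor': 512}
```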

  • Specs & features: 4.5 / 5

Nvidia GeForce RTX 5090: Design

The Nvidia GeForce RTX 5090 sitting on its packaging

(Image credit: Future / John Loeffler)
  • Slim, dual-slot form factor
  • Better cooling

So there's a significant change to this generation of Nvidia Founders Edition RTX flagship cards in terms of design, and it's not insubstantial.

Holding the RTX 5090 Founders Edition in your hand, you'll immediately notice two things: first, you can comfortably hold it in one hand thanks to it being a dual-slot card rather than a triple-slot, and second, it's significantly lighter than the RTX 4090.

A big part of this is how Nvidia designed the PCB inside the card. Traditionally, graphics cards have been built with a single PCB that extends from the inner edge of the PC case, down through the PCIe slot, and far enough back to accommodate all of the modules needed for the card. On top of this PCB, you'll have a heatsink with piping from the GPU die itself through a couple of dozen aluminum fins to dissipate heat, with some kind of fan or blower system to push or pull cooler air through the heated fins to carry away the heat from the GPU.

The problem with this setup is that if you have a monolithic PCB, you can only really extend the heatsinks and fans off of the PCB to help cool it since a fan blowing air directly into a plastic wall doesn't do much to help move hot air out of the graphics card.

A split view of the Nvidia GeForce RTX 5090's dual fan passthrough design

(Image credit: Future / John Loeffler)

Nvidia has a genuinely novel innovation on this account, and that's ditching the monolithic PCB that's been a mainstay of graphics cards for 30 years. Instead, the RTX 5090 (and presumably subsequent RTX 50-series GPUs to come), splits the PCB into three parts: the video output interface at the 'front' of the card facing out from the case, the PCIe interface segment of the card, and the main body of the PCB that houses the GPU itself as well as the VRAM modules and other necessary electronics.

This segmented design allows a gap in the front of the card below the fan, so rather than a fan blowing air into an obstruction, it can fully pass over the fins of the GPU's heatsink, substantially improving the thermals.

As a result, Nvidia has been able to shrink the width of the card considerably, moving from 2.4 inches to 1.9 inches, a roughly 20% reduction on paper. In practice, it feels substantially smaller than its predecessor, and it's definitely a card that won't completely overwhelm your PC case the way the RTX 4090 does.

The 4 8-pin to 16-pin 12VHPWR adapter included with the Nvidia GeForce RTX 5090

(Image credit: Future / John Loeffler)

That said, the obscene power consumption required by this card means that the 8-pin adapter included in the RTX 5090 package is a comical 4-to-1 dongle that pretty much no PSU in anyone's PC case can really accommodate.

Most modular PSUs give you three PCIe 8-pin power connectors at most, so let's be honest about this setup: you're going to need a new ATX 3.0 PSU of at least 1000W to run this card (its officially recommended PSU is 950W, but just round up; you're going to need it), so make sure you factor that into your budget if you pick this card up.
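
To see why rounding up to 1000W makes sense, here's a rough, hypothetical power budget; the CPU and rest-of-system figures are assumptions for illustration, not measurements from my test bench.

```python
# Rough, hypothetical steady-state draw under combined load; illustration only.
gpu_tdp_w = 575          # RTX 5090 board power
cpu_w = 250              # assumed high-end desktop CPU under load
rest_of_system_w = 100   # assumed fans, drives, RAM, motherboard

peak_draw_w = gpu_tdp_w + cpu_w + rest_of_system_w
print(peak_draw_w)               # 925W before any safety margin
print(round(peak_draw_w * 1.1))  # ~1018W with ~10% headroom -> a 1000W-class ATX 3.0 PSU
```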

Otherwise, the look and feel of the card isn't that different from previous generations, except that the front plate where the RTX 5090 branding would have gone is now missing, replaced by a finned shroud that lets air pass through. The RTX 5090 stamp is instead printed on the center panel, similar to how it was done on the Nvidia GeForce RTX 3070 Founders Edition.

As a final touch, the white back-lit GeForce RTX logo and the X strips on the front of the card, when powered, add a nice RGB-lite touch that doesn't look too gaudy, though dedicated RGB fans might find it rather plain.

  • Design: 4.5 / 5

Nvidia GeForce RTX 5090: Performance

An Nvidia GeForce RTX 5090 slotted into a test bench

(Image credit: Future)
  • Most powerful GPU on the consumer market
  • Substantially faster than RTX 4090
  • Playable 8K gaming
A note on my data

The charts shown below are the most recent test data I have for the cards tested for this review and may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

So how does the Nvidia GeForce RTX 5090 stack up against its predecessor, as well as the best 4K graphics cards on the market more broadly?

Very damn well, it turns out, managing to improve performance over the RTX 4090 in some workloads by 50% or more, while leaving everything else pretty much in the dust.

Looked at from 30,000 feet, though, the overall performance gains are respectable gen-on-gen, but they aren't the kind of earth-shattering leap the RTX 4090 made over the Nvidia GeForce RTX 3090.

Starting with synthetic workloads, the RTX 5090 scores anywhere from 48.6% faster to about 6.7% slower than the RTX 4090 in various 3DMark tests, depending on the workload. The only poor performance for the RTX 5090 was in 3DMark Night Raid, a test that both cards so completely overwhelm that the difference here could be down to CPU bottlenecking or other issues that aren't easily identifiable. On every other 3DMark test, though, the RTX 5090 scores 5.6% better or higher, more often than not by 20-35%. In the most recently released test, Steel Nomad, the RTX 5090 is nearly 50% faster than the RTX 4090.

On the compute side of things, the RTX 5090 is up to 34.3% faster in the Geekbench 6 OpenCL compute test and 53.9% faster in Vulkan, making it an absolute monster for AI researchers to leverage.

On the creative side, the RTX 5090 is substantially faster in 3D rendering, scoring between 35% and 49.3% faster in my Blender Benchmark 4.30 tests. There's very little difference between the two cards when it comes to video editing though, as they essentially tie in PugetBench for Creators' Adobe Premiere test and in Handbrake 1.7 4K to 1080p encoding.

The latter two results might be down to CPU bottlenecking, as even the RTX 4090 pushes right up against the performance ceiling set by the CPU in a lot of cases.

When it comes to gaming, the RTX 5090 is substantially faster than the RTX 4090, especially at 4K. In non-upscaled 1440p gaming, you're looking at a roughly 18% better average frame rate and a 22.6% better minimum/1% framerate for the RTX 5090. With DLSS 3 upscaling (but no frame generation), you're looking at 23.3% better average and 23% better minimum/1% framerates overall with the RTX 5090 vs the RTX 4090.

With ray tracing turned on and no upscaling, you're getting 26.3% better average framerates and about 23% better minimum/1% framerates, and with upscaling set to balanced (again, no frame generation), you're looking at about 14% better average fps and about 13% better minimum/1% fps for the RTX 5090 against the RTX 4090.

At 4K, however, the faster memory and wider memory bus really make a difference. With no upscaling and ray tracing turned off, you're getting upwards of 200 fps at 4K for the RTX 5090 on average, compared to the RTX 4090's 154 average fps, a nearly 30% increase. The average minimum/1% fps for the RTX 5090 is about 28% faster than the RTX 4090's, as well. With DLSS 3 set to balanced, you're looking at a roughly 22% better average framerate overall compared to the RTX 4090, with an 18% better minimum/1% framerate on average as well.
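
For reference, the gen-on-gen percentages in this section are simple relative changes; here's a minimal sketch using the 4K average fps figures quoted above.

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Relative improvement of the new result over the old, as a percentage."""
    return (new_fps - old_fps) / old_fps * 100

# 4K, no upscaling, ray tracing off: ~200 fps (RTX 5090) vs ~154 fps (RTX 4090).
print(round(uplift_pct(200, 154), 1))  # 29.9 -> the 'nearly 30%' increase noted above
```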

With ray tracing and no upscaling, the difference is even more pronounced with the RTX 5090 getting just over 34% faster average framerates compared to the RTX 4090 (with a more modest 7% faster average minimum/1% fps). Turn on balanced DLSS 3 with full ray tracing and you're looking at about 22% faster average fps overall for the RTX 5090, but an incredible 66.2% jump in average minimum/1% fps compared to the RTX 4090 at 4K.

Again, none of this even factors in single frame generation, which can already substantially increase framerates in some games (though with the introduction of some input latency). Once Multi-Frame Generation rolls out at launch, you can expect to see these framerates for the RTX 5090 run substantially higher. Pair that with Nvidia Reflex 2 to help mitigate the input latency issues frame generation can introduce, and the playable performance of the RTX 5090 will only get better with time, and it's starting from a substantial lead right out of the gate.

In the end, the overall baseline performance of the RTX 5090 comes in about 21% better than the RTX 4090, which is what you're really looking for when it comes to a gen-on-gen improvement.

That said, you have to ask whether the performance improvement you do get is worth the enormous increase in power consumption. That 575W TDP isn't a joke. I maxed out at 556W of power at 100% utilization, and I hit 100% fairly often in my testing and while gaming.

The dual flow-through fan design also does a great job of cooling the GPU, but at the expense of turning the card into a space heater. That 575W of heat needs to go somewhere, and that somewhere is inside your PC case. Make sure you have adequate airflow to vent all that hot air, otherwise everything in your case is going to slowly cook.

As far as performance per dollar goes, this card fares slightly better than the RTX 4090, but value for money has never been a buying factor for this kind of card anyway. You want this card for its performance, plain and simple, and in that regard, it's the best there is.

  • Performance: 5 / 5

Should you buy the Nvidia GeForce RTX 5090?

A masculine hand holding an RTX 5090

(Image credit: Future)

Buy the Nvidia GeForce RTX 5090 if...

You want the best performance possible
From gaming to 3D modeling to AI compute, the RTX 5090 serves up best-in-class performance.

You want to game at 8K
Of all the graphics cards I've tested, the RTX 5090 is so far the only GPU that can realistically game at 8K without compromising on graphics settings.

You really want to flex
This card comes with a lot of bragging rights if you're into the PC gaming scene.

Don't buy it if...

You care about efficiency
At 575W, this card might as well come with a smokestack and a warning from your utility provider about the additional cost of running it.

You're in any way budget-conscious
This card starts off more expensive than most gaming PCs and will only become more so once scalpers get their hands on them. And that's not even factoring in AIB partner cards with extra features that add to the cost.

You have a small form-factor PC
There's been some talk about the new Nvidia GPUs being SFF-friendly, but even though this card is thinner than the RTX 4090, it's just as long, so it'll be hard to fit into a lot of smaller cases.

Also consider

Nvidia GeForce RTX 4090
I mean, honestly, this is the only other card you can compare the RTX 5090 to in terms of performance, so if you're looking for an alternative to the RTX 5090, the RTX 4090 is pretty much it.

Read the full Nvidia GeForce RTX 4090 review

How I tested the Nvidia GeForce RTX 5090

  • I spent about a week and a half with the RTX 5090
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week and a half testing the Nvidia GeForce RTX 5090, both running synthetic tests as well as using it in my day-to-day PC for both work and gaming.

I used my updated testing suite, which uses industry-standard benchmark tools like 3DMark, Geekbench, PugetBench for Creators, and various built-in gaming benchmarks. I used the same testbench setup listed above for this card, as well as for all of the other cards I tested for comparison purposes.

I've tested and retested dozens of graphics cards for the 20+ graphics card reviews I've written for TechRadar over the last few years, and so I know the ins and outs of these PC components. That's why you can trust my review process to help you make the right buying decision for your next GPU, whether it's the RTX 5090 or any of the other graphics cards I review.

  • Originally reviewed January 2025

There’s a Snapdragon 8 Elite version with just seven CPU cores now
6:03 pm | January 17, 2025

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

The Snapdragon 8 Elite chipset went official in October, and as you may already know it has an octa-core CPU. Right? Well, yes, but also no. The Snapdragon 8 Elite we've known so far definitely does, but it turns out there's another version. This just got listed by Qualcomm on its website (see the Source linked below). And it has just seven CPU cores. Two Prime cores clocked at up to 4.32 GHz, and five Performance cores clocked at up to 3.53 GHz. Basically, one Performance core is missing compared to the regular Snapdragon 8 Elite. This hepta-core CPU is packed inside the Snapdragon...

Lenovo ThinkPad P16v Gen 2 mobile workstation review
10:12 pm | January 15, 2025

Author: admin | Category: Computers Gadgets Pro | Tags: , , | Comments: Off

Lenovo's ThinkPad lineup has always been a significant grouping of offerings for business professionals. The Lenovo ThinkPad P16v Gen 2 is no different. It targets professionals who need workstation-grade performance on the go.

The ThinkPad P16 is one of the best Lenovo ThinkPad laptops around - ideal for heavy computational and graphical work. Compared to the P16, I view the P16v Gen 2 as a ThinkPad P16 lite. But that's not any official branding; it's just my viewpoint. It's a slightly less powerful P16, but still very much enterprise-focused and workstation-esque.

Lenovo ThinkPad P16v Gen 2: Price and Availability

The Lenovo ThinkPad P16v Gen 2 starts at $1,791.92 (pre-tax) and quickly scales up to well over $3,500 before any pre-installed software options if you want to max out the hardware offerings.

These and custom builds are available on Lenovo's website, and pre-built models are available in places like Amazon or other computer retailers.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: Unboxing and First Impressions

Inside the Lenovo packaging you get the ThinkPad P16v Gen 2 laptop, a beefy yellow-tipped Lenovo charger (though you can also charge via USB-C, albeit more slowly), and the essential documentation. I was immediately reminded of the P16, though the P16v is a bit slimmer and lighter (4.89lb vs. 6.5lb).

Another thing I noticed right away was the port offering and placement. I'll discuss this more later, but right off the bat I was surprised to see a full Ethernet port, and ports on the back; then again, though thin, this is a workstation. Lastly, I genuinely like the matte black finish on this laptop. It feels professional, and though I love some sweet backpack colors and the splashes of color Apple offers these days, I'll always choose black for a machine like this: it's clean, goes with everything, and looks professional.

Lenovo ThinkPad P16v Gen 2: Design and Build Quality

Specs

CPU: Intel Core Ultra 7 165H to Ultra 9 185H options
GPU: NVIDIA RTX 2000 Ada Gen or RTX 3000 Ada Gen
Display: 16" WUXGA (1920 x 1200) IPS, 100% sRGB, up to 16" WQUXGA (3840 x 2400) IPS, 100% DCI-P3, 60Hz
Storage: 2x 2TB M.2 SSDs
RAM: 8GB DDR5, upgradable to 96GB

Unsurprisingly, the Lenovo ThinkPad P16v Gen 2 is very similar to the ThinkPad P16 in design, much like the name suggests. The P16v Gen 2 is slimmer and more portable than the P16, yet it still feels as robust as any of the best mobile workstations I've tried, with genuine portability in mind. Thanks to the real estate afforded by the 16-inch screen, Lenovo was able to add a full numpad to the right of the keyboard, and better yet, it's comfortable to type on.

The port offering on this computer is excellent for the modern employee who needs workstation-grade power. There is an SD card reader, an optional smart card reader, a full-size HDMI port, a USB-A port, two Thunderbolt 4 ports, and a full RJ45 Ethernet port. What's fascinating and pretty brilliant is that one of the Thunderbolt ports and the Ethernet port are on the back of the ThinkPad P16v Gen 2. That makes it super easy to plug into a Thunderbolt docking station and/or Ethernet, both of which you'd want routed away from your desk or workspace, which is exactly what happens when they're plugged into the back of the laptop.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: In use

I've had this laptop in my rotation for the last couple of weeks, and it has been a pretty good computer. It easily handles my productivity suite of tasks, content creation, video editing, and photo editing. It can handle the 3D modeling software for my 3D printer, and all of it at once. I really appreciate the Ethernet and Thunderbolt 4 ports on the back, as I could route the not-so-flexible Ethernet cable away from my computer when I needed to hardline into the internet at one of my job sites. Whenever I'm at my desk, I can easily plug into the docking station I have set up, which runs to my monitors and peripherals.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Another thing worth mentioning is the reliability and usability of ThinkPad keyboards. While I rarely reach for the famous TrackPoint embedded within the keyboard, it's handy when I do. On top of that, the typing experience is quite comfortable, even for the all-day typing I do.

Lenovo has also chosen to use the space granted by the 16-inch screen to fit in a numpad. Some laptops, even with 16-inch screens, simply center a standard-size keyboard in the allotted space; Lenovo instead fills that space with a full number pad. For those who work with spreadsheets, phone numbers, or numbers in general, a dedicated numpad makes data entry exponentially faster, and that's easy to do with the ThinkPad P16v Gen 2, adding to its allure for the business professional.

Lenovo ThinkPad P16v Gen 2

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P16v Gen 2: Final verdict

The ThinkPad P16v Gen 2 delivers an exceptional balance of power, portability, and professional features. While it doesn’t quite match the raw performance of the P16, its lighter build and price point make it an excellent choice for professionals on the move who need a reliable machine.


For more workplace computing, we've tested the best business laptops.

Lenovo ThinkPad P1 Gen 7 mobile workstation review
10:41 am | December 25, 2024

Author: admin | Category: Computers Gadgets Pro | Tags: , , | Comments: Off

The Lenovo ThinkPad P1 Gen 7 is Lenovo's take on an all-around perfect portable workstation machine. The Gen 7, of course, replaces the Gen 6 and now boasts up to an Intel Core Ultra 9 185H and an NVIDIA RTX 4070. However, it can also be built with integrated graphics and an Intel Core Ultra 5 with a light 16GB of RAM.

Much like Dell's Precision line-up, the ThinkPad P series is designed for professionals needing a computer that can handle computationally demanding tasks like 3D rendering, video editing, coding, data analysis, and things of that nature. Like many of the best Lenovo ThinkPad laptops I've reviewed, while casual users can use it, this price point focuses on professional users who rely on their machines to be workhorses and get work done.

Lenovo ThinkPad P1 Gen 7: Price and Availability

The Lenovo ThinkPad P1 Gen 7 starts at under $2,000 for the base configuration, with an Intel Core Ultra 5, 16GB of RAM, and integrated graphics. It can be upgraded to a machine that costs over $5,000 when equipped with an Intel Core Ultra 9, NVIDIA RTX 4070 graphics, 64GB of RAM, and a 4TB SSD. What's great here is that, while this is not an entry-level computer, the customization options for processor, memory, storage, and graphics mean it can be kitted out to fit just about any professional need. That said, check out our Lenovo coupon codes to see if you can save on the ThinkPad P1 Gen 7.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Unboxing and First Impressions

Out of the box, it's clear this is not designed to be a super-lightweight, ultra-portable, thinnest-device-ever kind of machine. It's beefy, but not in a way that resembles the laptops of a decade ago. As we've seen from many of the best mobile workstations, it's sleek where it can be but houses a lot under the hood -- or keyboard. Depending on the GPU configuration, the P1 Gen 7 ships with a 135W or 170W charger, the appropriate manuals, and any accessories purchased from Lenovo. The minimalist matte-black design exudes sleek professionalism, though it is prone to smudges.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Design and Build Quality

Specs

CPU: Up to an Intel Core Ultra 9 185H
GPU: Up to an NVIDIA RTX 4070
Display: Up to 4K OLED
RAM: Up to 64GB LPDDR5X
Storage: Up to 8TB SSD with built-in RAID options

Overall, the laptop is 17mm thick and weighs 4.3lb. That's not huge in the world of laptops, though it is larger than some of the machines I work with. The P1 Gen 7 is made of a combination of magnesium and aluminum and carries a MIL-STD 810H durability rating, so it can withstand daily wear and tear and the burdens of being an everyday workhorse.

Completing the all-too-famous ThinkPad design, the TrackPoint is prominently in the center of the keyboard, and the overall design language matches what is frequently found with ThinkPad.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: In use

I have used this computer extensively in my workflow for the past few months, and overall it is an impressive machine. It is remarkably powerful, easily handles multitasking and demanding programs, and has a sleek and attractive design. What more could you ask for in a computer? It even has a better port offering than the ever-popular Dell powerhouses, and a better one than MacBooks, too. I have only heard the fans kick on during heavily intensive tasks or when many heavy tasks are stacked together; for my day-to-day professional work, they stay quiet.

Other features that make this computer great include the Wi-Fi 7 antennae, the generous port selection, a solid trackpad, a comfortable keyboard, and a decent battery.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

I've enjoyed using this computer for everything in my day-to-day. The keyboard is comfortable enough for long email sessions or writing articles (like this one). The trackpad is responsive enough that I don't need to bring a mouse in my backpack when I'm away from my desk for the day. The ports are fantastic: I can leave my dongles at home since this laptop has everything I could need built in. It's also surprisingly easy to carry from place to place, whether that's my studio, my office, a coffee shop, or around the house. It's simple, it doesn't get in the way, and it's great for my professional workflow.

Lenovo ThinkPad P1 Gen 7

(Image credit: Collin Probst // Future)

Lenovo ThinkPad P1 Gen 7: Final verdict

The Lenovo ThinkPad P1 Gen 7 is an impressive example of what mobile workstations can be. Though premium priced, its versatility, build quality, and performance justify its cost for professionals seeking the best tools to do their work reliably.


For more workplace hardware, we've reviewed the best business laptops.

Next Page »