Gadget news
WhatsApp infuses more AI into group chats
10:01 pm | March 6, 2025

Author: admin | Category: Mobile phones news | Comments: Off

WhatsApp is adding more AI. Baked into the latest WhatsApp beta release for Android is a way to have generative AI make your group chat icons. And strangely enough, this is only for group chat icons - you can't generate images to use for your profile picture on one-on-one chats. Perhaps that's coming later. For now, if you are in the WhatsApp for Android beta release channel, the latest update should enable the group chat icon generation function as seen in the shots below. This is of course powered by Meta AI, which is also available inside group chats to "help settle debates",...

Digg is coming back from the dead, will use AI to ease the work of moderators
6:10 pm | March 5, 2025

Author: admin | Category: Mobile phones news | Comments: Off

The older among you may remember Digg – a social link sharing site. For the younger among you, it was “the front page of the Internet” before Reddit took over the crown. After a disastrous redesign in 2010, Digg fell off hard and never really recovered – but now it is coming back. Kevin Rose, Digg’s original founder, and Alexis Ohanian, co-founder of Reddit, have teamed up to “revive the social platform with a fresh vision to restore the spirit of discovery and genuine community that made the early web a fun and exciting place to be.” After the 2010 redesign, Digg took a downward...

The AMD RX 9070 XT delivers exactly what the market needs with stunning performance at an unbeatable price
5:00 pm

Author: admin | Category: Computers, Computing, Computing Components, Gadgets | Comments: Off

AMD Radeon RX 9070 XT: Two-minute review

AMD had one job to do with the launch of its RDNA 4 graphics cards, spearheaded by the AMD Radeon RX 9070 XT, and that was to not get run over by Blackwell too badly this generation.

With the RX 9070 XT, not only did AMD manage to hold its own against the GeForce RTX monolith, it perfectly positions Team Red to take advantage of the growing discontent among gamers upset over Nvidia's latest GPUs with one of the best graphics cards I've ever tested.

The RX 9070 XT is without question the most powerful consumer graphics card AMD's put out, beating the AMD Radeon RX 7900 XTX overall and coming within inches of the Nvidia GeForce RTX 4080 in 4K and 1440p gaming performance.

It does so with an MSRP of just $599 (about £510 / AU$870), which is substantially lower than those two cards' MSRPs, much less their asking prices online right now. This matters because AMD traditionally hasn't faced the kind of scalping and price inflation that Nvidia's GPUs experience (it does happen, obviously, but not nearly to the same extent as with Nvidia's RTX cards).

That means, ultimately, that gamers who look at the GPU market and find empty shelves, extremely distorted prices, and uninspiring performance for the price they're being asked to pay have an alternative that will likely stay within reach, even if price inflation keeps it above AMD's MSRP.

The RX 9070 XT's performance comes at a bit of a cost, though, namely the 309W maximum power draw I saw during my testing, but at this tier of performance, that really isn't bad.

This card also isn't great when it comes to non-raster creative performance and AI compute, but nobody is buying this card for creative or AI work; Nvidia already has those categories on lock. No, this is a card for gamers, and for that, you just won't find a better one at this price. Even if the price does get hit with inflation, it'll still likely be way lower than what you'd have to pay for an RX 7900 XTX or RTX 4080 (assuming you can find them at this point), making the AMD Radeon RX 9070 XT a gaming GPU that everyone can appreciate and maybe even buy.

AMD Radeon RX 9070 XT: Price & availability

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $599 (about £510 / AU$870)
  • When can you get it? The RX 9070 XT goes on sale March 6, 2025
  • Where is it available? The RX 9070 XT will be available in the US, UK, and Australia at launch

The AMD Radeon RX 9070 XT is available as of March 6, 2025, starting at $599 (about £510 / AU$870) for reference-spec third-party cards from manufacturers like Asus, Sapphire, Gigabyte, and others, with OC versions and those with added accoutrements like fancy cooling and RGB lighting likely selling for higher than MSRP.

At this price, the RX 9070 XT comes in about $150 cheaper than the RTX 5070 Ti, and about $50 more expensive than the RTX 5070 and the AMD Radeon RX 9070, which launches alongside the RX 9070 XT. This price also puts the RX 9070 XT on par with the MSRP of the RTX 4070 Super, though that card is getting harder to find nowadays.

While I'll dig into performance in a bit, given the MSRP (and the reasonable hope that this card will be findable at MSRP in some capacity), the RX 9070 XT's value proposition is second only to the RTX 5070 Ti's on paper. And since price inflation on the RTX 5070 Ti will persist for some time yet, in many cases the RX 9070 XT will likely offer better performance per dollar than any enthusiast card on the market right now.
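
To make that value claim concrete, here's a back-of-the-envelope sketch in Python; the relative-performance figure and the $749 RTX 5070 Ti MSRP are illustrative assumptions derived from the comparisons in this review, not official benchmark data:

```python
def value_index(relative_perf: float, price_usd: float) -> float:
    """Relative performance earned per dollar, scaled up for readability."""
    return relative_perf / price_usd * 100

# Illustrative inputs: the RX 9070 XT lands roughly 7-8% behind the
# RTX 5070 Ti in games, at $599 versus the 5070 Ti's ~$749 MSRP.
cards = {
    "RX 9070 XT": value_index(92.5, 599),    # ~15.4 performance points per dollar
    "RTX 5070 Ti": value_index(100.0, 749),  # ~13.4 performance points per dollar
}
```

Under these assumed numbers, the RX 9070 XT comes out ahead on performance per dollar even while trailing on absolute performance, which is the whole value argument in miniature.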

  • Value: 5 / 5

AMD Radeon RX 9070 XT: Specs

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • PCIe 5.0, but still just GDDR6
  • Hefty power draw

The AMD Radeon RX 9070 XT is the first RDNA 4 card to hit the market, and so it's worth digging into its architecture for a bit.

The new architecture is built on TSMC's N4P node, the same as Nvidia Blackwell, and in a move away from AMD's MCM push with the last generation, the RDNA 4 GPU is a monolithic die.

As there's no direct predecessor for this card (or for the RX 9070, for that matter), there's no true apples-to-apples comparison to make, but if the RX 9070 XT had a last-gen equivalent, it would sit roughly between the RX 7800 XT and the RX 7900 GRE.

The Navi 48 GPU in the RX 9070 XT sports 64 compute units, breaking down into 64 ray accelerators, 128 AI accelerators, and 64MB of L3 cache. Its cores are clocked at 1,600MHz to start, but can run as fast as 2,970MHz, just shy of the 3GHz mark.

It uses the same GDDR6 memory as the last-gen AMD cards, with a 256-bit bus and a 644.6GB/s memory bandwidth, which is definitely helpful in pushing out 4K frames quickly.
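
That bandwidth figure follows straight from the bus width and the per-pin data rate; here's a quick sketch (the ~20.1Gbps per-pin rate is inferred from the quoted bandwidth, not pulled from a spec sheet):

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: total bus pins x per-pin data rate, over 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

# 256-bit GDDR6 at an inferred ~20.1 Gbps per pin gives roughly 643 GB/s,
# in line with the quoted 644.6 GB/s figure.
peak = memory_bandwidth_gbs(256, 20.1)
```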

The TGP of the RX 9070 XT is 304W, which is a good bit higher than the RX 7900 GRE, though for that extra power, you do get a commensurate bump up in performance.

  • Specs: 4 / 5

AMD Radeon RX 9070 XT: Design

An AMD Radeon RX 9070 XT made by Sapphire on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • No AMD reference card
  • High TGP means bigger coolers and more cables

There's no AMD reference card for the Radeon RX 9070 XT, but the unit I got to test was the Sapphire Pulse Radeon RX 9070 XT, which I imagine is pretty indicative of what we can expect from the designs of the various third-party cards.

The 304W TGP all but ensures that any version of this card you find will pair a triple-fan cooler with a pretty hefty heatsink, so it's not going to be a great option for small-form-factor cases.

Likewise, that TGP puts it just over the line where it needs a third 8-pin PCIe power connector, something you may or may not have available in your rig, so keep that in mind. And even if you do have three spare power connectors, cable management will almost certainly be a hassle.

After that, it's really just about aesthetics, as the RX 9070 XT (so far) doesn't have anything like the dual pass-through cooling solution of the RTX 5090 and RTX 5080, so the choice comes down to personal taste.

As for the card I reviewed, the Sapphire Pulse shroud and cooling setup on the RX 9070 XT was pretty plain, as far as desktop GPUs go, but if you're looking for a non-flashy look for your PC, it's a great-looking card.

  • Design: 4 / 5

AMD Radeon RX 9070 XT: Performance

An AMD Radeon RX 9070 XT in a test bench

(Image credit: Future / John Loeffler)
  • Near-RTX 4080 levels of gaming performance, even with ray tracing
  • Non-raster creative and AI performance lags behind Nvidia, as expected
  • Likely the best value you're going to find anywhere near this price point
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

Simply put, the AMD Radeon RX 9070 XT is the gaming graphics card we've been clamoring for this entire generation. While it shows some strong performance in synthetics and raster-heavy creative tasks, gaming is where this card really shines, coming within 7% of the RTX 4080 overall and within 4% of its gaming performance specifically. For a card launching at half the RTX 4080's launch price, this is a fantastic showing.

The RX 9070 XT is really squaring up against the RTX 5070 Ti, however, and while the RTX 5070 Ti does manage to pull ahead, the gap is much narrower than I expected going in.

On the synthetics side, the RX 9070 XT excels at rasterization workloads like 3DMark Steel Nomad, while the RTX 5070 Ti wins out in ray-traced workloads like 3DMark Speed Way, as expected, but AMD's 3rd generation ray accelerators have definitely come a long way in catching up with Nvidia's more sophisticated hardware.

Also, as expected, when it comes to creative workloads, the RX 9070 XT performs very well in raster-based tasks like photo editing, and worse at 3D modeling in Blender, which is heavily reliant on Nvidia's CUDA instruction set, giving Nvidia an all but permanent advantage there.

In video editing, the RX 9070 XT likewise lags behind, though it's still close enough to Nvidia's RTX 5070 Ti that video editors won't notice much difference, even if the difference is there on paper.

Gaming performance is what we're on about though, and here the sub-$600 GPU holds its own against heavy hitters like the RTX 4080, RTX 5070 Ti, and Radeon RX 7900 XTX.

In 1440p gaming, the RX 9070 XT is about 8.4% faster than the RTX 4070 Ti and RX 7900 XTX, just under 4% slower than the RTX 4080, and about 7% slower than the RTX 5070 Ti.

This strong performance carries over into 4K gaming as well, thanks to the RX 9070 XT's 16GB VRAM. Here, it's about 15.5% faster than the RTX 4070 Ti and about 2.5% faster than the RX 7900 XTX. Against the RTX 4080, the RX 9070 XT is just 3.5% slower, while it comes within 8% of the RTX 5070 Ti's 4K gaming performance.

When all is said and done, the RX 9070 XT doesn't quite overpower one of the best Nvidia graphics cards of the last gen (and definitely doesn't topple the RTX 5070 Ti), but given its performance class, its power draw, its heat output (which wasn't nearly as bad as the power draw might indicate), and most of all its price, the RX 9070 XT is easily the best value of any graphics card playing at 4K.

And given Nvidia's position with gamers right now, AMD has a real chance to win over some converts with this graphics card, and anyone looking for an outstanding 4K GPU absolutely needs to consider it before making their next upgrade.

  • Performance: 5 / 5

Should you buy the AMD Radeon RX 9070 XT?

Buy the AMD Radeon RX 9070 XT if...

You want the best value proposition for a high-end graphics card
The performance of the RX 9070 XT punches way above its price point.

You don't want to pay inflated prices for an Nvidia GPU
Price inflation is wreaking havoc on the GPU market right now, but this card might fare better than Nvidia's RTX offerings.

Don't buy it if...

You're on a tight budget
If you don't have a lot of money to spend, this card is likely more than you need.

You need strong creative or AI performance
While AMD is getting better at creative and AI workloads, it still lags far behind Nvidia's competing offerings.

How I tested the AMD Radeon RX 9070 XT

  • I spent about a week with the AMD Radeon RX 9070 XT
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week with the AMD Radeon RX 9070 XT, which was spent benchmarking, using, and digging into the card's hardware to come to my assessment.

I used industry-standard benchmark tools like 3DMark, Cyberpunk 2077, and PugetBench for Creators to get results comparable with other competing graphics cards, all of which have been tested using the same test bench setup listed above.

I've reviewed more than 30 graphics cards in the last three years, and so I've got the experience and insight to help you find the best graphics card for your needs and budget.

  • Originally reviewed March 2025

Honor confirms commitment to open collaboration in the AI space
2:35 pm

Author: admin | Category: Mobile phones news | Comments: Off

Honor Alpha Plan was introduced on Sunday, and now we learn more details from the company about its plans in the world of AI. James Li, CEO of Honor, attended two panels at MWC 2025 at which he revealed the company will push for open collaboration in artificial intelligence. He also talked about a recently published white paper on protecting user data and the five principles his company intends to follow. The Honor executive pointed out that the world is moving into a physical AI era. The industry needs to cooperate and co-create a platform for a wider range of devices that should...

I really wanted to like the Nvidia GeForce RTX 5070, but it broke my heart and it shouldn’t have to break yours, too
5:00 pm | March 4, 2025

Author: admin | Category: Computers, Computing, Computing Components, Gadgets | Comments: Off

Nvidia GeForce RTX 5070: Two-minute review

A lot of promises were made about the Nvidia GeForce RTX 5070, and in some narrow sense, those promises are fulfilled by Nvidia's mainstream GPU. But the gulf between what was expected and what the RTX 5070 actually delivers is simply too wide to bridge for me and for the legion of gamers and enthusiasts who won't be able to afford (or even find, frankly) Nvidia's best graphics cards from this generation.

Launching on March 5, 2025, at an MSRP of $549 / £549 / AU$1,109 in the US, UK, and Australia, respectively, this might be one of the few Nvidia Blackwell GPUs you'll find at MSRP (along with available stock), but only for lack of substantial demand. As the middle-tier GPU in Nvidia's lineup, the RTX 5070 is meant to have broader appeal and more accessible pricing and specs than the enthusiast-grade Nvidia GeForce RTX 5090, Nvidia GeForce RTX 5080, and Nvidia GeForce RTX 5070 Ti, but of all the cards this generation, this is the one that seems to have the least to offer prospective buyers over what's already on the market at this price point.

That's not to say there is nothing to commend this card. The RTX 5070 does get up to native Nvidia GeForce RTX 4090 performance in some games thanks to Nvidia Blackwell's exclusive Multi-Frame Generation (MFG) technology. And, to be fair, the RTX 5070 is a substantial improvement over the Nvidia GeForce RTX 4070, so at least in direct gen-on-gen uplift, there is a roughly 20-25% performance gain.

But this card is a far, far cry from the promise of RTX 4090 performance that Nvidia CEO Jensen Huang presented on stage at CES 2025, even with the qualifier that such an achievement would be "impossible without artificial intelligence," which implies a heavy reliance on DLSS 4 and MFG to get this card over the line.

If we're just talking framerates, then in some very narrow cases this card can do that, but at 4K with ray tracing and cranked-up settings, the input latency for the RTX 5070 with MFG can be noticeable depending on your settings, and it can become distracting. Nvidia Reflex helps, but if you take RTX 4090 performance to mean the same experience as the RTX 4090, you simply won't get that with MFG, even in the 80 or so games that support it currently.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

Add to all this the fact that the RTX 5070 barely outpaces the Nvidia GeForce RTX 4070 Super when you take MFG off the table (which will be the case for the vast majority of games played on this card) and you really don't have anything to show for the extra 30W of power this card pulls down over the RTX 4070 Super.

The RTX 5070 comes in at less than 4% faster in gaming without MFG than the non-OC RTX 4070 Super, and roughly 5% faster overall, which makes it essentially a stock-overclocked RTX 4070 Super, performance-wise, with the added feature of MFG. An overclocked RTX 4070 Super might even match or exceed the RTX 5070's overall performance in all but a handful of games, and that doesn't even touch on AMD's various offerings in this price range, like the AMD Radeon RX 7900 GRE or AMD's upcoming RX 9070 XT and RX 9070 cards.

Given that the RTX 4070 Super is still generally available on the market (at least for the time being) at a price where you're likely to find it for less than available RTX 5070 cards, and competing AMD cards are often available for less, easier to find, and offer roughly the same level of performance, I really struggle to find any reason to recommend this card, even without the questionable-at-best marketing for this card to sour my feelings about it.

I caught a lot of flack from enthusiasts for praising the RTX 5080 despite its 8-10% performance uplift over the Nvidia GeForce RTX 4080 Super, but at the level of the RTX 5080, there is no real competition and you're still getting the third-best graphics card on the market with a noticeable performance boost over the RTX 4080 Super for the same MSRP. Was it what enthusiasts wanted? No, but it's still a fantastic card with few peers, and the base performance of the RTX 5080 was so good that the latency problem of MFG just wasn't an issue, making it a strong value-add for the card.

You just can't claim that for the RTX 5070. There are simply too many other options for gamers to consider at this price point, and MFG just isn't a strong enough selling point at this performance level to move the needle. If the RTX 5070 is the only card you have available to you for purchase and you need a great 1440p graphics card and can't wait for something better (and you're only paying MSRP), then you'll ultimately be happy with this card. But the Nvidia GeForce RTX 5070 could have and should have been so much better than it ultimately is.

Nvidia GeForce RTX 5070: Price & availability

An Nvidia GeForce RTX 5070 sitting on top of its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP/RRP starting at $549 / £549 / AU$1,109
  • When can you get it? The RTX 5070 goes on sale on March 5, 2025
  • Where is it available? The RTX 5070 will be available in the US, UK, and Australia at launch

The Nvidia GeForce RTX 5070 is available starting March 5, 2025, with an MSRP of $549 / £549 / AU$1,109 in the US, UK, and Australia, respectively.

This puts it at the same price as the current RTX 4070 MSRP, and slightly less than that of the RTX 4070 Super. It's also the same MSRP as AMD's RX 7900 GRE and upcoming RX 9070, and slightly cheaper than the AMD RX 9070 XT's MSRP.

The relatively low MSRP for the RTX 5070 is one of the bright spots for this card, as well as the existence of the RTX 5070 Founders Edition card, which Nvidia will sell directly at MSRP. This will at least put something of an anchor on the card's price in the face of scalping and general price inflation.

  • Value: 4 / 5

Nvidia GeForce RTX 5070: Specs

  • GDDR7 VRAM and PCIe 5.0
  • Higher power consumption
  • Still just 12GB VRAM, and fewer compute units

The Nvidia GeForce RTX 5070 is a mixed bag when it comes to specs. On the one hand, you have advanced technology like the new PCIe 5.0 interface and new GDDR7 VRAM, both of which appear great on paper.

On the other hand, it feels like every other spec was configured and tweaked to make sure that it compensated for any performance benefit these technologies would impart to keep the overall package more or less the same as the previous generation GPUs.

For instance, while the RTX 5070 sports faster GDDR7 memory, it doesn't expand the VRAM pool beyond 12GB, unlike its competitors. And if Nvidia was hoping the faster memory would make up for keeping the VRAM the same, the RTX 5070 also makes only a modest increase in compute units over the RTX 4070 (48 versus 46), and is a noticeable step down from the RTX 4070 Super's 56.

Whatever performance gains the RTX 5070 makes with its faster memory, then, are largely neutralized by the RTX 4070 Super's larger number of compute units (along with the requisite CUDA cores, RT cores, and Tensor cores).

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

The base clock on the RTX 5070 is notably higher than its predecessors', but its boost clock is only slightly increased, and the boost clock is ultimately what counts while playing games or running intensive workloads.

Likewise, whatever gains the more advanced TSMC N4P node offers the RTX 5070's GPU over its predecessors' TSMC N4 node seem to be eaten up by the cutting down of the die. Whether there was a power or cost reason for this, I have no idea, but I think this decision is what ultimately sinks the RTX 5070.

It seems like every decision was made to keep things right where they are rather than move things forward. That would be acceptable, honestly, if there was some other major benefit like a greatly reduced power draw or much lower price (I've argued for both rather than pushing for more performance every gen), but somehow the RTX 5070 manages to pull down an extra 30W of power over the RTX 4070 Super and a full 50W over the RTX 4070, and the price is only slightly lower than the RTX 4070 was at launch.

Finally, this is a PCIe 5.0 x16 GPU, which means that if your motherboard has 16 or fewer PCIe 5.0 lanes and you're also using a PCIe 5.0 SSD, one of those two components is going to get knocked down to PCIe 4.0, and most motherboards default to prioritizing the GPU.

You might be able to set your PCIe 5.0 priority to your SSD in your motherboard's BIOS settings and put the RTX 5070 into PCIe 4.0, but I haven't tested how this would affect the performance of the RTX 5070, so be mindful that this might be an issue with this card.

  • Specs: 2.5 / 5

Nvidia GeForce RTX 5070: Design

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)
  • No dual-pass-through cooling
  • FE card is the same size as the RTX 4070 and RTX 4070 Super FE cards

The Nvidia GeForce RTX 5070 Founders Edition looks identical to the RTX 5090 and RTX 5080 that preceded it, but with some very key differences, both inside and out.

One of the best things about the RTX 5090 and RTX 5080 FE cards was the innovative dual pass-through cooling solution on those cards, which improved thermals so much that Nvidia was able to shrink the size of those cards from the gargantuan bricks of the last generation to something far more manageable and practical.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

It would have been nice to see what such a solution could have done for the RTX 5070, but maybe it just wasn't possible to engineer it so it made any sense. Regardless, it's unfortunate that it wasn't an option here, even though the RTX 5070 is hardly unwieldy (at least for the Founders Edition card).

Otherwise, it sports the same 16-pin power connector placement as the RTX 5090 and RTX 5080, so 90-degree power connectors won't fit the Founders Edition, though you will have better luck with most, if not all, AIB partner cards which will likely stick to the same power connector placement of the RTX 40 series.

The RTX 5070 FE will fit inside even an SFF case with ease, and its lighter power draw means that even if you have to rely on the included two-to-one adapter to feed it from two free 8-pin cables from your power supply, it will still be a fairly manageable affair.

Lastly, like all the Founders Edition cards before it, the RTX 5070 has no RGB, with only the white backlight GeForce RTX logo on the top edge of the card to provide any 'flair' of that sort.

  • Design: 3.5 / 5

Nvidia GeForce RTX 5070: Performance

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)
  • Almost no difference in performance over the RTX 4070 Super without MFG
  • Using MFG can get you native RTX 4090 framerates in some games
  • Significantly faster performance over the RTX 4070
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

Boy howdy, here we go.

The best thing I can say about the performance of this card is that it is just barely the best 1440p graphics card on the market as of this review, and that DLSS 4's Multi Frame Generation can deliver the kind of framerates Nvidia promises in those games where the technology is available, either natively or through the Nvidia App's DLSS override feature.

Both of those statements come with a lot of caveats, though, and the RTX 5070 doesn't make enough progress from the last gen to make a compelling case for itself performance-wise, especially since its signature feature is only available in a smattering of games at the moment.

On the synthetic side of things, the RTX 5070 looks strong against the card it's replacing, the RTX 4070, generally offering about 25% better performance in synthetic benchmarks like 3DMark Steel Nomad and Speed Way. It also posts higher compute performance in Geekbench 6 than its direct predecessor, though not by as drastic a margin (about 10% better).

Compared to the RTX 4070 Super, however, the RTX 5070's performance is only about 6% better overall, and only about 12% better than the AMD RX 7900 GRE's overall synthetic performance.

Again, a win is a win, but it's much closer than it should be gen-on-gen.

The RTX 5070 runs into similar issues on the creative side, where it only outperforms the RTX 4070 Super by about 3% overall, with its best performance coming in PugetBench for Creators' Adobe Premiere benchmark (~13% better than the RTX 4070 Super), but faltering somewhat with Blender Benchmark 4.3.0.

This isn't too surprising: at the time of testing, the RTX 5070 hadn't been released yet, and GPUs tend to perform better in Blender several weeks or months after release, once the devs can optimize for the new hardware.

All in all, for this class of card, the RTX 5070 is a solid choice for those who want to dabble in creative work without much of a financial commitment, but real pros looking to upgrade without spending a fortune are better off with the Nvidia GeForce RTX 5070 Ti.

It's with gaming, though, where the real heartbreak comes with this card.

Technically, with just 12GB VRAM, this isn't a 4K graphics card, but both the RTX 4070 Super and RTX 5070 are strong enough that you can get playable native 4K in pretty much every game so long as you never, ever touch ray tracing, global illumination, or the like. Unfortunately, both cards perform roughly the same under these conditions at 4K, with the RTX 5070 pulling into a slight lead of around 5 fps in a few games like Returnal and Dying Light 2.

However, in some titles like F1 2024, the RTX 4070 Super actually outperforms the RTX 5070 when ray tracing is turned on, or when DLSS is set to balanced and without any Frame Generation. Overall and across different setting configurations, the RTX 5070 only musters a roughly 4.5% better average FPS at 4K than the RTX 4070 Super.

It's pretty much the same story at 1440p, as well, with the RTX 5070 outperforming the RTX 4070 Super by about 2.7% across configurations at 1440p. We're really in the realm of what a good overclock can get you on an RTX 4070 Super rather than a generational leap, despite all the next-gen specs that the RTX 5070 brings to bear.

OK, but what about the RTX 4090? Can the RTX 5070 with DLSS 4 Multi Frame Generation match the native 4K performance of the RTX 4090?

Yes, it can, at least if you're only concerned with average FPS. The only game with an in-game benchmark that I can use to measure the RTX 5070's MFG performance is Cyberpunk 2077, and I've included those results here, but in Indiana Jones and the Great Circle and Dragon Age: Veilguard (using the Nvidia App's override function) I pretty much found MFG to perform consistently as promised, delivering substantially faster FPS than DLSS 4 alone and landing in the ballpark of where the RTX 4090's native 4K performance ends up.

And so long as you stay far away from ray tracing, the base framerate at 4K will be high enough on the RTX 5070 that you won't notice much, if any, latency in many games. But when you turn ray tracing on, even the RTX 5090's native frame rate tanks, and it's the baseline rendered frames that respond to your input; the three AI-generated frames produced from each rendered frame don't factor in your inputs at all.

As such, even though you can get up to 129 FPS at 4K with Psycho RT and the Ultra preset in Cyberpunk 2077 on the RTX 5070 (blowing way past the RTX 5090's native 51 FPS average on the same settings), only 44 of the RTX 5070's 129 frames per second reflect active input. This leads to a situation where your game looks like it's flying by at 129 FPS but feels like a sluggish 44 FPS.
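
A quick frame-time calculation makes that disconnect concrete. This is a simplified model that only compares frame intervals, ignoring the rest of the input pipeline: smoothness tracks the displayed framerate, while responsiveness tracks only the rendered frames.

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given framerate."""
    return 1000.0 / fps

visual = frame_time_ms(129)  # ~7.8 ms between displayed frames: looks smooth
felt = frame_time_ms(44)     # ~22.7 ms between input-sampled frames: feels sluggish
```

Roughly 8ms between displayed frames but nearly 23ms between frames that actually sample your input is exactly the "looks fast, feels slow" effect described above.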

For most games, this isn't going to be a deal breaker. While I haven't tried the RTX 5070 with 4x MFG on Satisfactory, I'm absolutely positive I will not feel the difference, as it's not the kind of game where you need fast reflexes (other than dealing with the effing Stingers), but Marvel Rivals? You're going to feel it.

Nvidia Reflex definitely helps take the edge off MFG's latency, but it doesn't completely eliminate it, and for some games (and gamers) that is going to matter, leaving the RTX 5070's MFG experience too much of a mixed bag to be a categorical selling point. I think the hate directed at 'fake frames' is wildly overblown, but in the case of the RTX 5070, it's not entirely without merit.

So where does that leave the RTX 5070? Overall, it's the best 1440p card on the market right now, and its relatively low MSRP makes it the best value proposition in its class. It's also much more likely that you'll actually be able to find this card at MSRP, making the question of value more than just academic.

For most gamers out there, Multi Frame Generation is going to be great, and so long as you go easy on the ray tracing, you'll probably never run into any practical latency in your games, so in those instances, the RTX 5070 might feel like black magic in a circuit board.

But my problem with the RTX 5070 is that it is absolutely not the RTX 4090, and for the vast majority of the games you're going to be playing, it never will be, and that's essentially what was promised when the RTX 5070 was announced. Instead, the RTX 5070 is an RTX 4070 Super with a few games running MFG slapped to its side that look like they're playing on an RTX 4090, but may or may not feel like they are, and that's just not good enough.

It's not what we were promised, not by a long shot.

  • Performance: 3 / 5

Should you buy the Nvidia GeForce RTX 5070?

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

Buy the Nvidia GeForce RTX 5070 if...

You don't have the money for (or cannot find) an RTX 5070 Ti or RTX 4070 Super
This isn't a bad graphics card; it's just that several cards in its price range offer better value or better performance, so it makes the most sense as a fallback when those are out of reach.

You want to dabble in creative or AI work without investing a lot of money
The creative and AI performance of this card is great for the price.

Don't buy it if...

You can afford to wait for better
Whether it's this generation or the next, this card offers very little that you won't be able to find elsewhere within the next two years.

Also consider

Nvidia GeForce RTX 5070 Ti
The RTX 5070 Ti is a good bit more expensive, especially with price inflation, but if you can get it at a reasonable price, it is a much better card than the RTX 5070.

Read the full Nvidia GeForce RTX 5070 Ti review

Nvidia GeForce RTX 4070 Super
With Nvidia RTX 50 series cards getting scalped to heck, if you can find an RTX 4070 Super for a good price, it offers pretty much identical performance to the RTX 5070, minus the Multi Frame Generation.

Read the full Nvidia GeForce RTX 4070 Super review

How I tested the Nvidia GeForce RTX 5070

  • I spent about a week with the RTX 5070
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week testing the Nvidia GeForce RTX 5070, using it as my main workstation GPU for creative content work, gaming, and other testing.

I used my updated testing suite including industry standard tools like 3DMark and PugetBench for Creators, as well as built-in game benchmarks like Cyberpunk 2077, Civilization VII, and others.

I've reviewed more than 30 graphics cards for TechRadar in the last two and a half years, as well as extensively testing and retesting graphics cards throughout the year for features, analysis, and other content. You can trust that my reviews are based on experience and data, as well as my desire to make sure you get the best GPU for your hard-earned money.

  • Originally reviewed March 2025
Honor Alpha plan will see $10B invested in AI products with smartphones at the front
12:42 am | March 3, 2025

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

Honor Alpha Plan was announced today at an event in Barcelona, ahead of the official start of the MWC tomorrow. The company's new corporate strategy will see it develop a larger AI device ecosystem. Honor has set aside $10 billion, which it will invest over the next five years, as it develops an intelligent phone that will "revolutionize human-to-device interaction" and bridge the AI ecosystem with consumers. The strategy also sees the release of AI-enhanced PCs, tablets, and wearables. The new Honor CEO, James Li, urged industry partners to expand their AI capabilities and...

I tested the Acer Swift 14 AI for two weeks – if you need a new work laptop, this one is close to perfection
8:30 pm | February 26, 2025

Author: admin | Category: Computers Computing Gadgets Laptops | Tags: , | Comments: Off

Acer Swift 14 AI: Two-minute review

Acer Aspire 14 AI laptop closed to show its black exterior

(Image credit: Future / Jasmine Mannan)

With so many laptops hitting the scene at the moment, Acer has thrown its hat in the ring with the Acer Swift 14 AI. This mighty laptop swept me off my feet almost immediately and could very well be a contender for one of the best laptops on the market right now.

The laptop boasts a gorgeous OLED screen, and I was stunned by the visuals I was getting when watching videos or editing pictures. You also get great battery life, which will last you through a full workday with some juice left in the tank afterwards. While it didn't live up to the 17 hours claimed by Acer, it came in pretty close at 14 hours. I took some issue with the design because it just felt a bit clunky, but this is something that you likely wouldn't notice unless you were specifically looking for issues.

Coming in at $1,199 / £899 / AU$1,899 (currently on sale in the UK), this laptop is on the pricier side, with similar alternatives also coming in around this price point. However, when factoring in the sale price, I am genuinely gobsmacked that you’re able to get a laptop of this quality in terms of both hardware and performance, for less than £900 ($1,200).

Acer Swift 14 AI: Price and availability

  • How much does it cost? $1,199 / £899 / AU$1,899
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The Acer Swift 14 AI comes in at $1,199 / £1,199 / AU$1,899, which is definitely not a budget-friendly option when it comes to laptops; however, you get some very good specifications for this price point. You can currently pick it up in most regions. It's on sale at Currys in the UK at the moment, with the Intel edition sitting at a much more affordable £899. In the US, you can pick it up on sale at Amazon, Best Buy, Newegg, and other similar retailers.

Alternatives on the market include the Asus Zenbook A14, which has similar specs and also features an OLED screen. It sits at a higher price point of $1,399.99, though. Another alternative is the Apple MacBook Air 2024 edition, which features the M3 chip and comes in at $1,199. This makes the Acer Swift 14 AI a great option to pick up while it's on sale, but it’s still decent value when it's not.

Value: 5 / 5

Acer Swift 14 AI: Specs

You can get an Intel or a Qualcomm version of the Acer Swift 14 AI. In the UK these are the same price; however, the sale over at Currys only applies to the Intel version, which does make the Qualcomm one more expensive. Elsewhere in the world, you'll also spend more on the Qualcomm edition, as it can deliver more power. All of the rest of the specifications are the same.

As of right now, you can't configure these specifications; however, considering everything other than the processor is identical, it doesn't seem like you'd need to. The Acer website does state that this laptop comes with up to 32GB of RAM, suggesting that other configurations may be in the works.

Acer Swift 14 AI: Design

The Acer Aspire 14 AI's keyboard and left-side ports

(Image credit: Future / Jasmine Mannan)

When first unboxing the Acer Swift 14 AI, I was somewhat underwhelmed. While it is housed in a high-quality and sturdy chassis, it doesn't have any particular flair in terms of design. It weighs in at 1.28kg (2.82 lbs), which is definitely lightweight, but not the lightest option on the market, with the Asus Zenbook 14 coming in at under a kilogram (2.2 lbs).

Something that particularly irked me when using this laptop was the fact that the edges weren't rounded off properly, which made it feel a bit clunky. You'll also find a slightly thicker bezel around the webcam, which again subtracts from the sleekness of the design. However, the Acer Swift 14 AI is 1.49cm (0.59 inches) thick, which keeps it slimline. It's easy to stick in a backpack or sleeve and take on the go with you.

You get two USB-C ports, either of which can be used for charging, plus two USB-A ports, which is greatly appreciated. As someone who uses a range of peripherals, whether it be a USB headset, keyboard, mouse, or extra monitors, I found there were just about enough ports for me to use this laptop as part of my workstation. You also get an HDMI port.

The chassis of this laptop is made of metal rather than plastic like some other lightweight alternatives, making for a more premium feel. The keyboard is very low profile, making for satisfying key presses. You'll find that the keys have the same texture as the rest of the laptop, rather than being made from plastic, which again contributes to the higher-quality feeling.

Design: 4 / 5

Acer Swift 14 AI: Performance

Acer Aspire 14 AI laptop display showing the Windows 11 login screen

(Image credit: Future / Jasmine Mannan)

Using the Acer Swift 14 AI did feel like a bit of a step down from the Asus Zenbook A14 I just tested; however, at £200 less, this is to be expected. Using it on the go wasn't as satisfying as with my usual MacBook Air, despite it being a similar size and weight, as it felt more clunky when I was putting it in my bag.

When booting up the laptop, I was stunned by the screen. The 2K OLED panel is gorgeous and makes for a fantastic experience when watching videos or movies. For creatives who design graphics or video assets, seeing the screen bring your creations to life through color is amazing. With most laptops being able to deliver great performance, they now have to stand out with other features and the OLED screen does a great job of this.

Acer Swift 14 AI: Benchmarks

Here's how the Acer Swift 14 AI performed in our suite of benchmark tests:

3DMark: Night Raid: 36616; Fire Strike: 8898; Time Spy: 4438

GeekBench 6: 1884 (single-core); 7657 (multi-core)

CrossMark: Overall: 978; Productivity: 914; Creativity: 1082; Responsiveness: 883

PCMark 10 Battery Life: 14 hours 7 minutes

TechRadar movie test: 13 hours 43 minutes

While this laptop isn’t made for intensive tasks, I still put it to the test. Things like basic photo and video editing on Abode Photoshop and Adobe Premiere Pro worked absolutely fine, and while Premiere Pro did feel a bit sluggish, it was by no means unusable. Taxing 3D animation is certainly off the table here but using this laptop for standard everyday productivity is perfectly fine.

I was sure to try multitasking here too, with a range of programs open at once, and to my surprise, it functioned quite well even with just 16GB of RAM. Even when I was in a video call while also creating a PowerPoint and watching a video, the laptop didn't stutter at all. This felt like a decent upgrade from my old MacBook Air, which immediately starts whining when I open a second Chrome tab.

Even when using this laptop on the go, it stayed very quiet, which was great. One of my biggest fears is booting up my MacBook on a train and hearing it begin to sound like a rocket taking off, and I never had to worry about that with the Acer Swift 14 AI.

Of course, you get the benefits of Copilot+ with this laptop too, but for me this is no longer a standout feature and instead just expected.

Performance: 5 / 5

Acer Swift 14 AI: Battery life

When using the Acer Swift 14 AI in my day-to-day life, I found that the battery life was pretty impressive. It wasn't a device you could go days without charging by any means, but you could certainly get through a full workday and still have some battery left over at the end of the day.

Acer claims that this laptop boasts 17 hours of battery life, and our testing came in pretty close, but no cigar, at just over 14 hours of continuous video playback - which is obviously not how many people will be using their device on a day-to-day basis.
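For reference, the claim-versus-measurement gap works out like this; a quick sketch assuming the 14-hour figure refers to the PCMark 10 result of 14 hours 7 minutes:

```python
# Compare Acer's claimed battery life against the measured PCMark 10 result.
claimed_hours = 17.0
measured_hours = 14 + 7 / 60  # 14 hours 7 minutes of continuous playback

shortfall_pct = (1 - measured_hours / claimed_hours) * 100
print(f"Measured {measured_hours:.1f} hours, "
      f"about {shortfall_pct:.0f}% short of the claim")
```

That's roughly a 17% shortfall, which is about par for the course for manufacturer battery claims.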

Battery life: 4 / 5

Should I buy the Acer Swift 14 AI?

Buy it if...

You want a slightly more affordable laptop

Currently on sale for £899 ($1,199), the Acer Swift 14 AI delivers the same performance as higher-priced alternatives.

You want good battery life

You can easily work all day and have battery left over.

You want to be able to multitask

The laptop will happily run multiple programs at once without stuttering or slowing down.

Don't buy it if...

You want something easy and satisfying to travel with

Despite being thin and lightweight, the design leads to a clunky-feeling device.

You want something with a lot of processing power

While this device is powerful, it won't be able to undertake super-intensive tasks.

You want a device you can game on

This laptop is just not made for gaming, and while you likely could play some titles on here, performance wouldn't be the best.

Also Consider

If our Acer Swift 14 AI review has you considering other options, here are two laptops to consider...

Asus Zenbook A14

If you’re looking for a premium option that's lightweight and easy to take with you on the go then the Asus Zenbook A14 is the laptop for you. You still get the stunning OLED display and a powerful processor with even better portability. It does come with a slight price tag increase, though.

Read our full Asus Zenbook A14 review

Apple MacBook Pro 16-inch (M4 Pro, 2024)

If you want a laptop that's capable of performing more intensive tasks like 3D animation or heavy video rendering, then the M4 Pro processor in the Apple MacBook Pro is going to help you out more.

Read our full Apple MacBook Pro 16-inch (M4 Pro, 2024) review

How I tested the Acer Swift 14 AI

I spent two weeks using the Acer Swift 14 AI as my everyday laptop for work and leisure. I was sure to use the device all day for my typical workday, and then also watched videos and movies and browsed social media on it in the evenings. I took it on the go with me to different locations where I needed to work, to see how it would feel outside of my workstation. As well as using it every day, I also benchmarked the laptop using a range of different benchmarking software.

First reviewed February 2025

Honor brings old Manchester United photos back to life with AI upscaling
5:47 pm | February 21, 2025

Author: admin | Category: Mobile phones news | Tags: , | Comments: Off

The Honor Magic7 Pro comes with a suite of AI features, including Photo Upscale, which brings older photos back to life. In partnership with Qualcomm, the company demoed the feat in a promo celebrating the 115th anniversary of Old Trafford, the home of English football club Manchester United. The Before and After difference was revealed by Jason Leach, a Man United club historian, who explained the significance of some pictures, including the Mark Hughes goal against Norwich City from the season when United won its first Premier League trophy. Honor Magic7 Pro is powered by...


Microsoft introduces an AI model for game development called Muse
11:01 pm | February 20, 2025

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

Microsoft partnered with Teachable AI Experiences and the Xbox Games Studios' Ninja Theory to introduce a new AI model aimed at helping game developers. It's called Muse. What can Muse do? It can generate game visuals and controller actions based on a 3D understanding of a game. The model was trained on 1.6 billion parameters and with 1 billion images and controller actions. That's about 7 years of continuous human gameplay. They used Ninja Theory's Bleeding Edge game. The data used for the model was collected using gameplay data from users who gave their consent. The model works...
