Nvidia announces RTX 5060 and 5060 Ti graphics cards
7:00 am | April 16, 2025


Nvidia today announced the RTX 5060 and the RTX 5060 Ti, the entry-level offerings in the company's 50-series graphics lineup. Starting with the more powerful RTX 5060 Ti, this model is built on the GB206 die with 4608 CUDA cores, with a 2.57GHz boost clock and a 2.41GHz base clock. Like the 4060 Ti, you get a choice of 16GB or an abysmal 8GB of video memory, this time GDDR7 clocked at 28Gbps on a 128-bit bus. The card has a 180W power limit for both memory options. Then there's the RTX 5060, which uses the same GB206 die but with 3840 CUDA cores. It has...

I’ve reviewed dozens of gaming laptops, and the new Razer Blade 16 with Nvidia’s RTX 5090 is one of the best I’ve ever seen
4:00 pm | March 28, 2025


Razer Blade 16 (2025): Two minute review

Weeks of stock shortages and scalping later, I can finally breathe: RTX 5000 laptops are here, offering a fresh path into Nvidia's glorious ray-traced future that doesn't involve selling your soul on eBay - though you might need to sell a kidney to afford the new Razer Blade 16, especially if you're eyeing the top-of-the-line RTX 5090 configuration graciously provided to me for this review.

Yes, Razer is not beating the 'pricey hardware' allegations any time soon; the new Blade 16 starts at a wallet-battering $2,999.99 / £2,699.99 / AU$4,899.95, and the higher-spec configurations rocket beyond the four-thousand mark in the US, UK, and Europe.

Don't get me wrong, though: the sky-high pricing is just about the only criticism I have here. Razer's iconic 16-inch laptop has undergone a subtle redesign - and I mean subtle, bordering on indistinguishable - that provides a range of small but worthy improvements, and at the heart of it all, Nvidia's Blackwell GPU lineup delivers boosted performance and a new wealth of features ready to do battle with the best gaming laptops.

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)

I'll get into the real meat of these graphics upgrades later on, but here's the short version: this thing goes hard. Between DLSS 4, Multi Frame Generation, Reflex 2, and the general generational improvements from RTX 4000, even the most demanding titles deliver crisp, speedy frame rates on the QHD+ OLED screen - and the 240Hz refresh rate means you won't find your game performance capped by the display.

The Razer Blade 16 isn't all steak and no sizzle, either. This is one classy-looking gaming laptop, as I've come to expect from Razer, with an anodized aluminum chassis, per-key RGB lighting, and the same overall top-notch build quality any previous Razer owner will be well accustomed to. Again, scroll on down to that Design section for all the juicy deets, but I will take a quick moment here to remark on the new-and-improved keyboard, which now features greater key travel and smoother actuation than previous models to provide a more pleasant typing experience as well as responsive inputs when gaming.

With how difficult it's proving to get your hands on a desktop RTX 5000 GPU, many will be looking towards the incoming slate of Blackwell-equipped laptops to scratch that hardware upgrade itch. If that's you - and you can stomach the price tag - then the Razer Blade 16 is the laptop to pick.

Razer Blade 16 (2025): Price and availability

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)
  • How much is it? Starting at $2,999.99 / £2,699.99 / AU$4,899.95
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

There's no getting around it: this is one expensive laptop, even pricier than the previous RTX 4000 Razer Blade 16 we reviewed back in late 2023. Granted, you're getting a hell of a lot of gaming power for your money, but it's still pricey.

The Razer Blade 16 (2025) will cost you $2,999.99 / £2,699.99 / AU$4,899.95 for the base configuration: that gets you an RTX 5070 Ti, a 10-core AMD Ryzen AI 9 365 processor, 32GB of ultra-speedy LPDDR5X 8,000MHz memory, and a 1TB SSD. That's right - the long-standing industry baseline of 16GB of RAM is out; 32GB is the new standard for Razer.

There are a few different configurations, with more RAM or storage and an upgrade to either an RTX 5080 or 5090. The top-spec model costs a piggy-bank-busting $4,899.99 / £4,299.99 / AU$7,999.95, packing the RTX 5090 along with a 12-core Ryzen AI 9 HX 370 CPU, 64GB of RAM, and a huge 4TB of storage (in the form of two 2TB SSDs). Every version has the same 16-inch 240Hz QHD+ OLED display; sadly, there's no 4K model on offer, which does feel like a somewhat odd omission considering that the RTX 5090 is absolutely capable of 4K gaming.

I really can't stress enough that this is a comically enormous amount of money for the average PC gamer. As a fun little exercise, I went looking online for used cars that cost less than the Blade 16 review unit I'm typing this review on. 47,764 results. Oof.

Still, when scalpers are selling RTX 5090 cards on eBay for upwards of four thousand bucks, it's not an entirely unattractive proposition. You're getting the whole system here, after all, and a full desktop PC build capable of supporting Nvidia's latest flagship GPU without bottlenecking isn't exactly cheap either. I have no doubt there will be laptops from other manufacturers that offer the same (or at least similar) specs at a lower price, but there will be sacrifices made - whether that's chassis materials, display quality, or memory speed.

  • Value: 2.5 / 5

Razer Blade 16 (2025): Specs

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)

Razer Blade 16 (2025): Design

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)
  • Same classic Blade finish with premium-feel anodized metal casing
  • New keyboard design is a real improvement
  • Chassis is lighter and more compact than previous models

Razer has long held a reputation as a company known for using premium materials for its hardware, and the new Blade 16 certainly doesn't buck this trend: every Blade begins life as a singular slab of tempered aluminum, which is then precision-milled into shape and electrochemically anodized to create a wear-resistant color finish designed to last.

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)

Although the overall aesthetic of Razer's modern Blade lineup has changed very little over the years - compare this laptop side-by-side with the Razer Blade we reviewed back in 2018, and you'll see what I mean - there have been some small adjustments this time around, and all of them are good changes.

For starters, the laptop has been retooled to reduce the overall weight and size; there's a limit to this, especially with a 16-inch display, but Razer has managed to cut down the total package volume by almost 30%. It's almost half a centimetre thinner than the previous-gen Blade 16 (4.59mm, to be precise), and it adds that half-centimetre to the length of the laptop instead - in practice, this doesn't make the Blade 16's footprint appreciably larger, but it does make it look and feel a lot thinner. It's also 310 grams lighter than the previous model.

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)

Razer has also seen fit to improve the keyboard this time around, and it makes for a more satisfying typing experience than previous Blades. The key travel has been increased by 50%, and the actuation force sits at a finely-tuned 63G, providing a good amount of physical feedback that helped me avoid misinputs while both gaming and typing. The keys are quite widely spaced, which makes it comfortable to use even for long periods.

Naturally, this Blade now comes with a dedicated button for Copilot, Microsoft's AI assistant for Windows, though I doubt most buyers will have much cause to use that. More interesting is the new row of five programmable macro keys, giving you instant access to the functions you use most often.

Additionally, this ain't your daddy's backlit RGB keyboard: not only does the full layout have per-key RGB lighting, but multiple keys actually feature two LEDs beneath the keycap, enabling a nifty feature where holding down Fn or Shift switches the illumination to instantly highlight the relevant keys. You can see what I mean in the GIF below - it's a small bonus, but I rather like it.

A GIF showing the swap-lighting effect on the Razer Blade 16's RGB keyboard when the Shift and Function keys are pressed.

(Image credit: Future)

The touchpad is nothing to write home about (let's be honest, you'll be using a mouse anyway), but it is nice and large with a firm click to it. There's also a pretty straightforward 1080p webcam and microphone array, both of which work fine.

More impressive are the speakers: a six-speaker array with THX Spatial Audio support makes for one of the best audio experiences I've heard on a laptop. It's no secret that laptop makers often skimp on speaker quality because so many people will simply connect a headset anyway, but that's certainly not the case here. The bass is rich and punchy, and the midrange is crystal clear; high pitches are a little thin, but it's still a strong showing overall.

I'm not going to dedicate a huge amount of time here to the pre-loaded Razer Synapse software package, but I will say that there are some welcome improvements over the old (and rather wonky) version. Tweaking your system performance and lighting effects is nice and straightforward, as is syncing and configuring any Razer peripherals you want to use.

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)

I can't not talk about the screen, of course. Razer's hardware lineup has always offered high-end display configurations, often employing OLED panels, which remains the case here - though as I mentioned further up in this review, there's strangely no 4K configuration available this time around. Instead, every 2025 Blade 16 model comes with the exact same 16-inch 240Hz OLED display, with a resolution of 2560x1600. That's a 16:10 aspect ratio, which I'm personally a big fan of on laptops.

Still, it's an undeniably gorgeous screen: colors are bright and vibrant, contrast is sharp, and blacks are deep. It looks fantastic in motion while playing games like Cyberpunk 2077; the rainy, neon-splattered streets of Night City after dark are vividly colorful and realistic on this display.

One feature that I'm always happy to see is upgradability. Unfortunately, the RAM in the Razer Blade 16 is soldered, but the SSD is user-upgradable. In fact, anything less than the 4TB configuration (which uses two 2TB SSDs) comes with an empty NVMe slot for you to easily plug in a second drive if you want to expand the storage yourself.

Lastly, we've got a nice broad port selection here, with two USB-C ports, three USB-As, an HDMI port for connecting a second display, the omnipresent 3.5mm headphone jack, and finally a full-size SD card reader for creative users - a wise inclusion considering that this laptop can comfortably pull double duty as a work system (more on that later).

  • Design: 5 / 5

Razer Blade 16 (2025): Performance

The Razer Blade 16 photographed for TechRadar on a white surface with plants in the background.

(Image credit: Future)
  • Best-in-class performance
  • RTX 5090 and Ryzen AI 9 HX 370 are a deadly combo
  • New Nvidia features offer a huge performance boost

Enough about aesthetics: this is a gaming laptop, so how well does it run games? The answer is: extremely well, especially once you factor in Nvidia's DLSS and frame-gen tech.

Our standard testing process involves running games without using any form of upscaling or frame-gen tech, and you can see the results below. It's worth noting that the 2024 Razer Blade 16 with RTX 4090 I've used for comparative purposes has an Intel Core i9-14900HX processor, which quite literally has double the core count of the Ryzen AI 9 HX 370 chip in this laptop, so without implementing Nvidia's latest goodies, the differences are relatively minor in many games.

Performance is still strong, make no mistake - but if you're aiming to play the latest games at maximum graphical settings on this laptop's native 1600p resolution, you're going to want to use DLSS.

Kick DLSS 4 and Multi Frame Generation (MFG) into gear, and it's a totally different story. I tested a few different supported titles at their respective maximum presets with ray tracing enabled, and all of them enjoyed a serious performance bump with Nvidia's fancy AI-powered software enabled.

There's been quite some debate about tools such as resolution upscaling and frame generation, not least due to the use of AI for both, and I admit I've been skeptical in the past. Here, it's a revelation. Earlier iterations of DLSS - and the frame-gen model seen in the previous RTX 4000 generation - were imperfect, prone to input lag and visual glitching, especially on hardware that would struggle to hit 60fps without any AI-assisted add-ons. But DLSS 4 and MFG work phenomenally well on a laptop packing an RTX 5090; in Alan Wake 2, a thoroughly beautiful (and therefore demanding) game, I was lucky to reach above the 60fps mark without any upscaling or frame-gen enabled. With those settings turned on? 200fps, easy.

Meanwhile, Cyberpunk 2077 and Returnal saw similarly massive framerate bumps. In Cyberpunk, the maxed-out ray-tracing preset struggled a bit at native resolution, scoring a meager average of 43 fps. With DLSS 4 and MFG, it averaged 217fps while still looking absolutely stunning. In Returnal, 113fps went to 240fps (which was the active cap) at 1440p - it doesn't have the option to run at the Blade 16's 1600p 16:10 resolution, but still, you get the idea.

It looks so good now, too; gone are the tearing and blurring I noted in my early experiments with DLSS, without any appreciable amount of input latency either. I imagine it's still there, perhaps noticeable to a pro esports gamer playing a twitchy shooter like Counter-Strike 2 or Valorant, but I certainly wasn't able to detect it.

Of course, DLSS 4 and MFG aren't available universally. Developers have to add support for the functionality, although there's also a new DLSS Override option for 'force-enabling' it in unsupported games, which I deployed for Returnal - as far as I could tell, it worked without issues, though of course that's just for one modern game.

Performance in synthetic tests was also strong, with good - though not world-beating - performance across both gaming and creative workloads. If you're hoping to use this laptop for professional creative work, it won't let you down. In fact, the performance it offers compared to the weight of the laptop is among the best I've ever seen, making it ideal for working on the go.

It's worth noting here that the 2024 Blade 16 actually outperforms the new model across several of our tests, but again, we can put that down to the significantly more powerful CPU found in the 2024 model. The power efficiency of the Ryzen chip is not to be understated, though - take a look at the battery life section, and you'll see what I mean.

  • Performance: 5 / 5

Razer Blade 16 (2025): Battery life

  • Surprisingly good battery life
  • Almost a full day's regular use, about two and a half hours of gaming
  • Charges fast but uses a proprietary charger

Battery life is rarely a selling point of gaming laptops, but I was pleasantly surprised with the battery life on the Razer Blade 16. In the PCMark 10 Gaming battery test, it lasted for almost two and a half hours; in real-world tests, I found this figure highly accurate, assuming you're playing with the battery efficiency preset on in Windows and brightness at 50% or lower.

Outside of gaming and running similarly demanding software, the Blade 16 offers some impressive longevity for a gaming laptop. The 90Whr battery lasted for almost seven and a half hours in our Battery Informant Web Surfing test, and it also holds charge remarkably well when not in use. This is likely due to the improved Nvidia Optimus tech, which offloads graphical processing to the Ryzen CPU's integrated graphics when you're not playing games or running GPU-intensive apps. Razer claims that the new Blade 16 offers 'up to 11 hours' of use, which is probably true if you really try to squeeze the battery with minimum brightness and power-saving mode turned on.

Although it needs a fairly chunky power adapter with a proprietary Razer connector, the Blade 16 also charges very quickly, charging up to 50% in about 30 minutes and 100% in just over an hour.

  • Battery Life: 4.5 / 5

Should you buy the Razer Blade 16 (2025)?

Buy the Razer Blade 16 (2025) if...

You want the best gaming performance there is
The RTX 5090 laptop GPU inside this laptop is a monster, delivering top-notch frame rates in games and offering the full suite of performance-boosting Nvidia software.

You want a gaming laptop you can use for work
Thanks to its surprisingly strong battery life and great capabilities when it comes to handling creative and AI workloads, the Razer Blade 16 can comfortably pull double duty as a work laptop when you're not using it for gaming.

Don't buy it if...

You're on a budget
Starting at nearly three grand, this is not a cheap gaming laptop by any means.

You want something compact
Although Razer has worked miracles reducing the weight and thickness of the new Blade 16, no 16-inch laptop can reasonably be called 'small'.

Also consider

If my Razer Blade 16 (2025) review has you considering other options, here is another laptop to consider:

Razer Blade 14 (2024)
If you're in the market for something a bit more svelte, consider the Blade 16's little sibling, the Blade 14. It isn't available with RTX 5000 GPUs, however - at least, not yet. But you still get the same excellent design and build quality, and a lower price tag too.

Read our full Razer Blade 14 (2024) review

MSI Titan 18 HX
Another absolute beast of a gaming laptop, the Titan 18 HX from MSI is a strong pick if you're looking for a gaming laptop that can also function as a premium workstation PC. With an Intel Core i9-14900HX CPU and up to 128GB(!!!) of RAM, this is one of the finest desktop-replacement systems on the market.

Read our full MSI Titan 18 HX review

How I tested the Razer Blade 16 (2025)

I spent just over a week with the Razer Blade 16 (2025), using it almost every day for both work and gaming. I don't always love working, but damn, if this didn't make it more pleasant.

Naturally, we ran plenty of performance tests on the Blade 16, taking additional time to test out the new DLSS 4 and Multi Frame Generation features on a handful of supported titles. What I played the most was Warframe, which isn't in our testing suite but look, I'm an addict.

In terms of work, I used the Blade 16 for everything from word processing to web browsing to image editing, and even took it out into my garden to work in the sun and put the battery life and display to the test in a real-world setting - both held up great.

  • First reviewed March 2025
The Thrustmaster Sol-R breaks free from Earth’s atmosphere in style, with a fantastic stick for space fans
9:00 pm | March 19, 2025


One-minute review

Thrustmaster is arguably at the top of its game when it comes to flight sticks, so it’s perhaps not all that surprising that the company is now making a play for space sim gear.

The Sol-R range has a cute name, but don’t let that fool you – this is a serious stick (or pair thereof) for anyone who spends plenty of time in the hour-devouring black mass of titles like Elite Dangerous.

If you’re not playing a whole lot of space games, it might not appeal, and the taller and more integrated nature of the Thrustmaster F/A-18 Super Hornet would still be our pick for the best flight stick. Still, if what you’re playing has a whole host of fiddly toggles, mapping those to the Sol-R’s array of switches, buttons, and dials feels like magic.

Ahead of launch, things aren't quite dialed in, so for the time being you can expect to spend plenty of time tweaking buttons and mappings. But if that's what you're looking for in your next voyage, this is a great place to start at around the $220 / £180 mark.

I was sent the Duo pack, which includes both the right-hand and left-hand Sol-R sticks and will cost you considerably more ($399.99 / £299.99). Still, with each packing plenty of inputs, even a single stick could be ideal for your setup.

Thrustmaster Sol-R

(Image credit: Future)

Price and availability

  • List price: $219.99 / £179.99, or $399.99 / £299.99 for the Duo
  • Available worldwide
  • Pre-orders open March 19, available April 16

While each stick in the Sol-R range will run you $219.99 / £179.99, they're considerably cheaper than rivals like the Saitek Pro Flight, but more expensive than Thrustmaster's own T.Flight Hotas One.

You can preorder from March 19, and they’ll start shipping on April 16, 2025. While no throttle is included, you can use the Thrust sliders on the front of the base.

That pricing makes it pricier than the T.Flight Hotas One, which remains Thrustmaster's entry-level model but doesn't include as many buttons, switches, or LED lighting. In fact, it's around a similar price to Turtle Beach's VelocityOne, which is relatively similar in terms of feature set.

Specs

Thrustmaster Sol-R

(Image credit: Future)

Design and features

  • Plenty of customizable inputs
  • Flexibility of two sticks
  • Nice lighting

While I was sent the Sol-R stick’s ‘Duo’ configuration which includes two of the sticks and bases for use at the same time, anyone buying a single one is unlikely to feel short-changed.

Each stick screws on easily but securely to the base, but even before doing that, it's worth taking in the base itself. Each one has eight buttons, two dials, a thrust slider (with accompanying lighting), and a quartet of switches.

Moving to the stick, there’s a subtle button near the base, and a main trigger at thumb rest height, with an additional one beyond that. Then there are two hat switches, two more buttons, and a thumbstick, all of which combine to make menu navigation much, much simpler.

Speaking of which, you can actually use the F/A-18C Hornet grip or the Viper Grip on the base, giving you plenty of flexibility with your setup.

The blue lighting around the base and top of the stick is a nice touch, too, adding to the futuristic, space travel feel.

There’s a pleasing resistance to the dials and thrust slider, but I do wish the stick didn’t quite wobble so much. The included stabilizers click on with ease and do a great job of minimizing the way the base tends to roll to each side, but it’s still not quite perfect in those intense dogfights.

Those dials aren’t just inputs, either. They’re actually used as modifiers, so you can trigger different effects for any button, trigger, or switch depending on which position they’re in. That makes 44 customizable inputs per stick.

If you’re a left-handed pilot and you’re not looking to invest in both sticks, there’s good news — you can swap the included ergonomic supports over to make the Sol-R ambidextrous. And, if you’re looking for the Z axes, you can find that by twisting the grip.

It’s also worth noting that I have relatively large hands, but the Sol-R fits in them nicely in both hands, with the thumb rest perfectly located.

Thrustmaster Sol-R

(Image credit: Future)

Performance

  • Maps as ‘Generic stick’
  • Rewards tinkerers
  • T.A.R.G.E.T. download is still very much just for drivers

I wanted to put the Sol-R to the test with one of my favorite games, Elite Dangerous. While I did test with more traditional flight sims, Elite is a game that's about exploring space in your own way, meaning it's a great way to test out everything from dogfighting to space cargo hauling.

It’s also the kind of game that the Sol-R was built for, with a whole host of controls to tweak and map — and therein lies the rub.

Because the Sol-R, at least in its pre-launch stage, is identified in compatible games as a ‘Generic Joystick’, you can expect to do a lot of customization to get it just right in your game of choice. Elite recognized both sticks, for example, but I had to manually map each button.

That might be something that’ll be ironed out at launch, but on the plus side, it does give you scope to tweak as you see fit. In my testing, I got to a really great spot where the thrust slider adjusted my speed, while the variety of buttons switched power to various systems on my virtual ship.
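
If you're curious what a game actually 'sees' when a device registers as a generic joystick, a few lines of Python using the pygame library (just one convenient way to poll input devices on PC - this isn't part of Thrustmaster's software or anything the Sol-R requires) will list the anonymous axes, buttons, and hats the OS reports, which is exactly why every title needs its own mapping pass:

import pygame

# Enumerate every joystick the OS exposes and print what it reports.
pygame.init()
pygame.joystick.init()

for index in range(pygame.joystick.get_count()):
    stick = pygame.joystick.Joystick(index)
    print(f"Device {index}: {stick.get_name()}")
    print(f"  axes: {stick.get_numaxes()}, "
          f"buttons: {stick.get_numbuttons()}, "
          f"hats: {stick.get_numhats()}")

Run against a stick like this, you'll typically just get a device name and raw input counts - no labels like 'landing gear toggle' - so the game has no idea what each switch is for until you tell it.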

Thrustmaster Sol-R

(Image credit: Future)

The hat switch and scroll wheel allowed me to check in-game messages and I could use the toggle switches for landing gear. Is that the kind of thing I’d have painstakingly done if the setup defaulted to a “good enough” button mapping layout? It’s hard to say.

For flight sims, it performs admirably, too. The sheer number of buttons is like a blank canvas for the likes of Microsoft Flight Simulator so you can set a button for your altimeter, anti-ice and more, and the fact the Sol-R appears as a generic stick means you can tailor it just like any other.

I also played Star Wars Squadrons, but the game doesn’t really need many inputs so it was actually more enjoyable with a single stick. Pulling off tight turns and loops did serve to highlight that wobbliness of the base, though.

Thrustmaster’s T.A.R.G.E.T. software is still pretty rudimentary, and while you can download presets, that wasn’t available for the Sol-R ahead of launch — but I’m curious to see how players adjust to the tools on offer. T.A.R.G.E.T. really just acts to get your drivers installed otherwise, and to its credit, it’s a simple, centralized hub for everything Thrustmaster.

Unlike the T.Flight HOTAS One, the Sol-R 1 is sadly PC only — so you won’t be able to use it on your PS5 or Xbox console.

Should I buy the Thrustmaster Sol-R?

Buy it if...

You’re big on space sims
For this price, even the single stick will give you a fantastic space sim experience, juggling comfort with a whole host of input options.

You’re patient
With the button remapping, you’ll get out of the Sol-R what you put in, meaning you can make it feel like a stick (or pair of sticks) entirely bespoke to your use case.

Don't buy it if...

You’re looking for a more stable flight stick
The Sol-R comes with stabilizers for the corners, but if you’re dogfighting, you might find your stick moving all over the place even with those fitted.

Also consider

Still not sold on the Thrustmaster Sol-R 1? Here’s how it compares to two similar products.

Turtle Beach VelocityOne
Turtle Beach’s VelocityOne is arguably the closest rival to the Sol-R, mainly because it offers a great stick experience flanked by additional inputs. It has a similar build quality, too, but you don’t get the flexibility of adding a second stick.

For more information, check out our full Turtle Beach VelocityOne review

Thrustmaster T Flight Hotas One
T Flight HOTAS is still close to the gold standard for new flyers, offering a comfortable stick with a detachable throttle — all at a great price. It’s also ideal for console gamers that play on Xbox.

For more information, check out our full T Flight Hotas One review

How I tested the Thrustmaster Sol-R

  • Tested over a period of weeks
  • Used on a gaming PC with an RTX 4070 Ti
  • Tested using Elite Dangerous, Star Wars Squadrons, and Microsoft Flight Simulator 2024

I spent the majority of my time (around 15 hours testing) with Elite Dangerous, simply because it has such a vast array of things that can be triggered by the Sol-R inputs.

I also had a blast blowing away TIE Fighters in Star Wars Squadrons and taking on some transatlantic flights in Microsoft Flight Simulator 2024 — but it feels best for an open-ended experience like Elite Dangerous, where you can engage with an array of systems thanks to the huge number of input options.

I tested on my gaming PC rig where I recently tested the F/A-18 stick and Viper TQS mission pack, as well as the T.Flight HOTAS One.

Read more about how we test

First reviewed March 2025

I really wanted to like the Nvidia GeForce RTX 5070, but it broke my heart and it shouldn’t have to break yours, too
5:00 pm | March 4, 2025


Nvidia GeForce RTX 5070: Two-minute review

A lot of promises were made about the Nvidia GeForce RTX 5070, and in some narrow sense, those promises are fulfilled with Nvidia's mainstream GPU. But the gulf between what was expected and what the RTX 5070 actually delivers is simply too wide to bridge for me and the legion of gamers and enthusiasts out there who won't be able to afford—or even find, frankly—Nvidia's best graphics cards from this generation.

Launching on March 5, 2025, at an MSRP of $549 / £549 / AU$1,109 in the US, UK, and Australia, respectively, this might be one of the few Nvidia Blackwell GPUs you'll actually find in stock at MSRP, but only for lack of substantial demand. As the middle-tier GPU in Nvidia's lineup, the RTX 5070 is meant to have broader appeal and more accessible pricing and specs than the enthusiast-grade Nvidia GeForce RTX 5090, Nvidia GeForce RTX 5080, and Nvidia GeForce RTX 5070 Ti, but of all the cards this generation, this is the one that seems to have the least to offer prospective buyers over what's already on the market at this price point.

That's not to say there is nothing to commend this card. The RTX 5070 does get up to native Nvidia GeForce RTX 4090 performance in some games thanks to Nvidia Blackwell's exclusive Multi-Frame Generation (MFG) technology. And, to be fair, the RTX 5070 is a substantial improvement over the Nvidia GeForce RTX 4070, so at least in direct gen-on-gen uplift, there is a roughly 20-25% performance gain.

But this card is a far, far cry from the promise of RTX 4090 performance that Nvidia CEO Jensen Huang presented on stage at CES 2025, even with the qualifier that such an achievement would be "impossible without artificial intelligence," which implies a heavy reliance on DLSS 4 and MFG to get this card over the line.

If we're just talking framerates, then in some very narrow cases this card can do that, but at 4K with ray tracing and cranked-up settings, the input latency for the RTX 5070 with MFG can be noticeable depending on your settings, and it can become distracting. Nvidia Reflex helps, but if you take RTX 4090 performance to mean the same experience as the RTX 4090, you simply won't get that with MFG, even in the 80 or so games that support it currently.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

Add to all this the fact that the RTX 5070 barely outpaces the Nvidia GeForce RTX 4070 Super when you take MFG off the table (which will be the case for the vast majority of games played on this card) and you really don't have anything to show for the extra 30W of power this card pulls down over the RTX 4070 Super.

With the RTX 5070 coming in at less than four percent faster in gaming (without MFG) than the non-OC RTX 4070 Super, and roughly 5% faster overall, the RTX 5070 is essentially a stock-overclocked RTX 4070 Super, performance-wise, with the added feature of MFG. An overclocked RTX 4070 Super might even match or exceed the RTX 5070's overall performance in all but a handful of games, and that doesn't even touch upon AMD's various offerings in this price range, like the AMD Radeon RX 7900 GRE or AMD's upcoming RX 9070 XT and RX 9070 cards.

Given that the RTX 4070 Super is still generally available on the market (at least for the time being), likely for less than you'll pay for an RTX 5070, and that competing AMD cards are often cheaper, easier to find, and roughly as fast, I really struggle to find any reason to recommend this card, even before the questionable-at-best marketing around it sours my feelings further.

I caught a lot of flak from enthusiasts for praising the RTX 5080 despite its 8-10% performance uplift over the Nvidia GeForce RTX 4080 Super, but at the level of the RTX 5080, there is no real competition and you're still getting the third-best graphics card on the market with a noticeable performance boost over the RTX 4080 Super for the same MSRP. Was it what enthusiasts wanted? No, but it's still a fantastic card with few peers, and the base performance of the RTX 5080 was so good that the latency problem of MFG just wasn't an issue, making it a strong value-add for the card.

You just can't claim that for the RTX 5070. There are simply too many other options for gamers to consider at this price point, and MFG just isn't a strong enough selling point at this performance level to move the needle. If the RTX 5070 is the only card you have available to you for purchase and you need a great 1440p graphics card and can't wait for something better (and you're only paying MSRP), then you'll ultimately be happy with this card. But the Nvidia GeForce RTX 5070 could have and should have been so much better than it ultimately is.

Nvidia GeForce RTX 5070: Price & availability

An Nvidia GeForce RTX 5070 sitting on top of its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP/RRP starting at $549 / £549 / AU$1,109
  • When can you get it? The RTX 5070 goes on sale on March 5, 2025
  • Where is it available? The RTX 5070 will be available in the US, UK, and Australia at launch

The Nvidia GeForce RTX 5070 is available starting March 5, 2025, with an MSRP of $549 / £549 / AU$1,109 in the US, UK, and Australia, respectively.

This puts it at the same price as the current RTX 4070 MSRP, and slightly less than that of the RTX 4070 Super. It's also the same MSRP as AMD's RX 7900 GRE and upcoming RX 9070, and slightly cheaper than the AMD RX 9070 XT's MSRP.

The relatively low MSRP for the RTX 5070 is one of the bright spots for this card, as well as the existence of the RTX 5070 Founders Edition card, which Nvidia will sell directly at MSRP. This will at least put something of an anchor on the card's price in the face of scalping and general price inflation.

  • Value: 4 / 5

Nvidia GeForce RTX 5070: Specs

  • GDDR7 VRAM and PCIe 5.0
  • Higher power consumption
  • Still just 12GB VRAM, and fewer compute units

The Nvidia GeForce RTX 5070 is a mixed bag when it comes to specs. On the one hand, you have advanced technology like the new PCIe 5.0 interface and new GDDR7 VRAM, both of which appear great on paper.

On the other hand, it feels like every other spec was configured and tweaked to make sure that it compensated for any performance benefit these technologies would impart to keep the overall package more or less the same as the previous generation GPUs.

For instance, while the RTX 5070 sports faster GDDR7 memory, it doesn't expand the VRAM pool beyond 12GB, unlike its competitors. And if Nvidia was hoping the faster memory would make up for keeping the amount of VRAM the same, the compute side doesn't help either: the GPU gets only a modest increase in compute units over the RTX 4070 (48 versus 46), and a noticeable decrease from the RTX 4070 Super's 56.

Whatever performance gains the RTX 5070 makes with its faster memory, then, are completely neutralized by the larger number of compute units (along with the requisite number of CUDA cores, RT cores, and Tensor cores) in the RTX 4070 Super.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

The base clock on the RTX 5070 is notably higher, but its boost clock is only slightly increased, and the boost clock is ultimately what counts while playing games or running intensive workloads.

Likewise, whatever gains the more advanced TSMC N4P node offers the RTX 5070's GPU over the TSMC N4 node of its predecessors seem to be eaten up by the cutting down of the die. Whether there was a power or cost reason for this, I have no idea, but I think this decision is what ultimately sinks the RTX 5070.

It seems like every decision was made to keep things right where they are rather than move things forward. That would be acceptable, honestly, if there was some other major benefit like a greatly reduced power draw or much lower price (I've argued for both rather than pushing for more performance every gen), but somehow the RTX 5070 manages to pull down an extra 30W of power over the RTX 4070 Super and a full 50W over the RTX 4070, and the price is only slightly lower than the RTX 4070 was at launch.

Finally, this is a PCIe 5.0 x16 GPU, which means that if you have a motherboard with 16 PCIe lanes or fewer and you're using a PCIe 5.0 SSD, one of these two components is going to get nerfed down to PCIe 4.0, and most motherboards default to prioritizing the GPU.

You might be able to set your PCIe 5.0 priority to your SSD in your motherboard's BIOS settings and put the RTX 5070 into PCIe 4.0, but I haven't tested how this would affect the performance of the RTX 5070, so be mindful that this might be an issue with this card.

  • Specs: 2.5 / 5

Nvidia GeForce RTX 5070: Design

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)
  • No dual-pass-through cooling
  • FE card is the same size as the RTX 4070 and RTX 4070 Super FE cards

The Nvidia GeForce RTX 5070 Founders Edition looks identical to the RTX 5090 and RTX 5080 that preceded it, but with some very key differences, both inside and out.

One of the best things about the RTX 5090 and RTX 5080 FE cards was the innovative dual pass-through cooling solution on those cards, which improved thermals so much that Nvidia was able to shrink the size of those cards from the gargantuan bricks of the last generation to something far more manageable and practical.

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

It would have been nice to see what such a solution could have done for the RTX 5070, but maybe it just wasn't possible to engineer it in a way that made sense. Regardless, it's unfortunate that it wasn't an option here, even though the RTX 5070 is hardly unwieldy (at least for the Founders Edition card).

Otherwise, it sports the same 16-pin power connector placement as the RTX 5090 and RTX 5080, so 90-degree power connectors won't fit the Founders Edition, though you will have better luck with most, if not all, AIB partner cards, which will likely stick to the same power connector placement as the RTX 40 series.

The RTX 5070 FE will fit inside even an SFF case with ease, and its lighter power draw means that even if you have to rely on the included two-to-one cable adapter to plug in two free 8-pin cables from your power supply, it will still be a fairly manageable affair.

Lastly, like all the Founders Edition cards before it, the RTX 5070 has no RGB, with only the white backlight GeForce RTX logo on the top edge of the card to provide any 'flair' of that sort.

  • Design: 3.5 / 5

Nvidia GeForce RTX 5070: Performance

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)
  • Almost no difference in performance over the RTX 4070 Super without MFG
  • Using MFG can get you native RTX 4090 framerates in some games
  • Significantly faster performance over the RTX 4070
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

Boy howdy, here we go.

The best thing I can say about the performance of this card is that it is just barely the best 1440p graphics card on the market as of this review, and that DLSS 4's Multi Frame Generation can deliver the kind of framerates Nvidia promises in those games where the technology is available, either natively or through the Nvidia App's DLSS override feature.

Both of those statements come with a lot of caveats, though, and the RTX 5070 doesn't make enough progress from the last gen to make a compelling case for itself performance-wise, especially since its signature feature is only available in a smattering of games at the moment.

On the synthetic side of things, the RTX 5070 looks strong against the card it's replacing, the RTX 4070, and generally offers about 25% better performance on synthetic benchmarks like 3DMark Steel Nomad or Speed Way. It also has higher compute performance in Geekbench 6 than its direct predecessor, though not by as drastic a margin (about 10% better).

Compared to the RTX 4070 Super, however, the RTX 5070's performance is only about 6% better overall, and only about 12% better than the AMD RX 7900 GRE's overall synthetic performance.

Again, a win is a win, but it's much closer than it should be gen-on-gen.

The RTX 5070 runs into similar issues on the creative side, where it only outperforms the RTX 4070 Super by about 3% overall, with its best performance coming in PugetBench for Creators' Adobe Premiere benchmark (~13% better than the RTX 4070 Super), but faltering somewhat with Blender Benchmark 4.3.0.

This isn't too surprising, as the RTX 5070 hasn't been released yet and GPUs tend to perform better in Blender several weeks or months after the card's release when the devs can better optimize things for new releases.

All in all, for this class of cards, the RTX 5070 is a solid choice for those who might want to dabble in creative work without much of a financial commitment, but real pros looking to upgrade without spending a fortune are better off with the Nvidia GeForce RTX 5070 Ti.

It's with gaming, though, where the real heartbreak comes with this card.

Technically, with just 12GB VRAM, this isn't a 4K graphics card, but both the RTX 4070 Super and RTX 5070 are strong enough cards that you can get playable native 4K in pretty much every game so long as you never, ever touch ray tracing, global illumination, or the like. Unfortunately, both cards perform roughly the same under these conditions at 4K, with the RTX 5070 pulling into a slight lead of around 5 fps in a few games like Returnal and Dying Light 2.

However, in some titles like F1 2024, the RTX 4070 Super actually outperforms the RTX 5070 when ray tracing is turned on, or when DLSS is set to balanced and without any Frame Generation. Overall and across different setting configurations, the RTX 5070 only musters a roughly 4.5% better average FPS at 4K than the RTX 4070 Super.

It's pretty much the same story at 1440p, as well, with the RTX 5070 outperforming the RTX 4070 Super by about 2.7% across configurations at 1440p. We're really in the realm of what a good overclock can get you on an RTX 4070 Super rather than a generational leap, despite all the next-gen specs that the RTX 5070 brings to bear.
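
For context on how those cross-configuration percentages are arrived at, the sketch below shows the basic arithmetic: average the per-game, per-setting uplift of one card over the other. The FPS pairs here are hypothetical placeholders rather than my benchmark data; only the method is the point.

# Hypothetical FPS pairs: (RTX 5070, RTX 4070 Super) for one game/setting combination each.
pairs = [
    (96.0, 93.5),
    (71.0, 70.2),
    (142.0, 137.0),
]

# Percent uplift of the first card over the second for each configuration.
uplifts = [(new / old - 1.0) * 100 for new, old in pairs]

print(f"Average uplift: {sum(uplifts) / len(uplifts):.1f}%")  # ~2.5% with these placeholders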

OK, but what about the RTX 4090? Can the RTX 5070 with DLSS 4 Multi Frame Generation match the native 4K performance of the RTX 4090?

Yes, it can, at least if you're only concerned with average FPS. The only game with an in-game benchmark that I can use to measure the RTX 5070's MFG performance is Cyberpunk 2077, and I've included those results here. But in Indiana Jones and the Great Circle and Dragon Age: The Veilguard (using the Nvidia App's override function), I pretty much found MFG to perform consistently as promised, delivering substantially faster FPS than DLSS 4 alone and landing in the ballpark of the RTX 4090's native 4K performance.

And so long as you stay far away from ray tracing, the base framerate at 4K will be high enough on the RTX 5070 that you won't notice too much, if any, latency in many games. But when you turn ray tracing on, even the RTX 5090's native frame rate tanks, and it's those baseline rendered frames that respond to your input; the three AI-generated frames built from each rendered frame don't factor in whatever input changes you've made at all.

As such, even though you can get up to 129 FPS at 4K with Psycho RT and the Ultra preset in Cyberpunk 2077 on the RTX 5070 (blowing way past the RTX 5090's native 51 average FPS on the Ultra preset with Psycho RT), only 44 of the RTX 5070's 129 frames per second reflect active input. This leads to a situation where your game looks like it's flying by at 129 FPS, but feels like it's still a sluggish 44 FPS.
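
To put rough numbers on that 'looks fast, feels slow' gap, the frame-time arithmetic below uses only the Cyberpunk 2077 figures quoted above; the latency values are simple back-of-the-envelope conversions, not additional measurements.

displayed_fps = 129        # what the frame counter shows with DLSS 4 + MFG
input_sampled_fps = 44     # frames that actually reflect new player input

# Convert each rate to the time between frames, in milliseconds.
frame_gap_shown_ms = 1000 / displayed_fps       # ~7.8 ms between displayed frames
frame_gap_input_ms = 1000 / input_sampled_fps   # ~22.7 ms between input-driven frames

print(f"New frame on screen every {frame_gap_shown_ms:.1f} ms")
print(f"Game reacts to input only every {frame_gap_input_ms:.1f} ms")

In other words, the picture updates every 8ms or so, but your inputs only land roughly every 23ms, which is why it still feels like a 44fps game.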

For most games, this isn't going to be a deal breaker. While I haven't tried the RTX 5070 with 4x MFG on Satisfactory, I'm absolutely positive I will not feel the difference, as it's not the kind of game where you need fast reflexes (other than dealing with the effing Stingers), but Marvel Rivals? You're going to feel it.

Nvidia Reflex definitely helps take the edge off MFG's latency, but it doesn't completely eliminate it, and for some games (and gamers) that is going to matter, leaving the RTX 5070's MFG experience too much of a mixed bag to be a categorical selling point. I think the hate directed at 'fake frames' is wildly overblown, but in the case of the RTX 5070, it's not entirely without merit.

So where does that leave the RTX 5070? Overall, it's the best 1440p card on the market right now, and its relatively low MSRP makes it the best value proposition in its class. It's also much more likely that you'll actually be able to find this card at MSRP, making the question of value more than just academic.

For most gamers out there, Multi Frame Generation is going to be great, and so long as you go easy on the ray tracing, you'll probably never run into any practical latency in your games, so in those instances, the RTX 5070 might feel like black magic in a circuit board.

But my problem with the RTX 5070 is that it is absolutely not the RTX 4090, and for the vast majority of the games you're going to be playing, it never will be, and that's essentially what was promised when the RTX 5070 was announced. Instead, the RTX 5070 is an RTX 4070 Super with MFG bolted on: a handful of games look like they're playing on an RTX 4090, but they may or may not feel like they are, and that's just not good enough.

It's not what we were promised, not by a long shot.

  • Performance: 3 / 5

Should you buy the Nvidia GeForce RTX 5070?

An Nvidia GeForce RTX 5070

(Image credit: Future / John Loeffler)

Buy the Nvidia GeForce RTX 5070 if...

You don't have the money for (or cannot find) an RTX 5070 Ti or RTX 4070 Super
This isn't a bad graphics card, but if you can find them, there are other cards within its price range that offer better value or better performance.

You want to dabble in creative or AI work without investing a lot of money
The creative and AI performance of this card is great for the price.

Don't buy it if...

You can afford to wait for better
Whether it's this generation or the next, this card offers very little that you won't be able to find elsewhere within the next two years.

Also consider

Nvidia GeForce RTX 5070 Ti
The RTX 5070 Ti is a good bit more expensive, especially with price inflation, but if you can get it at a reasonable price, it is a much better card than the RTX 5070.

Read the full Nvidia GeForce RTX 5070 Ti review

Nvidia GeForce RTX 4070 Super
With Nvidia RTX 50 series cards getting scalped to heck, if you can find an RTX 4070 Super for a good price, it offers pretty much identical performance to the RTX 5070, minus the Multi Frame Generation.

Read the full Nvidia GeForce RTX 4070 Super review

How I tested the Nvidia GeForce RTX 5070

  • I spent about a week with the RTX 5070
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week testing the Nvidia GeForce RTX 5070, using it as my main workstation GPU for creative content work, gaming, and other testing.

I used my updated testing suite including industry-standard tools like 3DMark and PugetBench for Creators, as well as the built-in benchmarks in games like Cyberpunk 2077, Civilization VII, and others.

I've reviewed more than 30 graphics cards for TechRadar in the last two and a half years, as well as extensively testing and retesting graphics cards throughout the year for features, analysis, and other content, so you can trust that my reviews are based on experience and data, as well as my desire to make sure you get the best GPU for your hard earned money.

  • Originally reviewed March 2025
The Thrustmaster AVA F/A-18 Super Hornet is a fantastic flight stick that took my setup to new heights
5:00 pm | March 2, 2025


Thrustmaster F/A-18 Super Hornet: One-minute review

If you’ve read our Thrustmaster Viper TQS Mission Pack review, you’ll know that flight sim tech is getting closer than ever to turning your setup into something plucked right from a fighter jet.

The Thrustmaster F/A-18 Super Hornet follows on that same line of thinking, dovetailing beautifully with its sister product while offering a fantastic flight stick in its own right, packed with input options and plenty of settings to tweak the game feel to your liking.

It’s ludicrously expensive, though. The review unit we’re testing is formed of multiple modular sections: the company’s AVA base, a flight stick, and a base plate. These all combine in a nifty bundle for $579 / £450 (cheaper than buying them piecemeal), but it’s definitely an investment for a flight enthusiast.

It’s still hard not to be impressed. It’s a hefty stick, one that would feel right at home doing loop-de-loops in the sky, and it’s absolutely packed with inputs, with switches, buttons, and triggers all over it.

If you’re looking to take your flight sim experience to new heights, it’s hard to look past, but don’t forget that the HOTAS X is a much more affordable, entry-level model that comes with a throttle – something the F/A-18 Super Hornet doesn’t have in the box.

Thrustmaster F/A-18 Super Hornet

(Image credit: Future)

Thrustmaster F/A-18 Super Hornet: Price and availability

  • List price: $579 / £450
  • Available worldwide
  • Offered in parts but this bundle is much cheaper

If you do want to pick up each part of the setup we’re testing here for the Thrustmaster F/A-18 Super Hornet, you can expect to spend a fair amount more, with just the baseplate alone coming in at $25 / £25.

That makes the bundle the way to go, and thankfully it’s easy to put together. It took me around five minutes to get everything hooked up, and the weight is a dead giveaway that it’s built to last, weighing in at 7.6lbs / 3.5kg once it’s put together.

The bundle includes the offset adapter, too, letting you tweak the angle at which the stick rests, either for realism or just for comfort.

Thrustmaster F/A-18 Super Hornet: Specs

Thrustmaster F/A-18 Super Hornet: Design and features

  • The base plate could do with better feet
  • Feels great to use
  • Satisfying inputs

As with the Viper TQS Mission Pack, it’s hard not to be in awe of the Thrustmaster F/A-18 Super Hornet once you unbox it.

It’s easy to put together, with a few screws attaching the AVA base to the base plate, and the stick basically screws on without any strenuous effort. It’s really easy to get started, too, since you just need to plug in your USB-C cable (included in the box) and you’re away.

Compared to the T Flight HOTAS I’ve been using for years, there’s a real weight to any movement on the Thrustmaster F/A-18 Super Hornet, and it makes flying feel more authentic as a result.

Button-wise, there’s a trio of hat switches. Two are at the top, while one rests under your thumb, and if the game you’re playing supports all of them you’ll have more buttons than you know what to do with. Two are four-way, while the third is eight-way, and combined with the physical buttons you have a whopping 19 inputs on a stick that really doesn’t waste any space at all.

Everything feels great to press, and there’s enough effort needed to hit things like the rear buttons so that you won’t find yourself accidentally hitting them too often.

Special props should be given to the trigger, too, which registers full pulls and half pulls if your game supports that, and always feels great to pull whether you’re playing something more grounded or more fantastical.

Thrustmaster F/A-18 Super Hornet

(Image credit: Future)

Thrustmaster F/A-18 Super Hornet: Performance

  • Plug and play
  • Customizable innards
  • T.A.R.G.E.T. software is basically just for drivers

As with the Viper, Thrustmaster recommends using its T.A.R.G.E.T. software, but you can really just use it for the drivers and little else. I found everything was detected nicely in Windows’ own control panel for input devices, and that was a better spot for quick testing of deadzones and the like.
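
For anyone doing the same kind of quick check, a deadzone is simply a small region around the stick's center that gets ignored so a resting stick doesn't drift. The snippet below is a purely illustrative sketch of that logic (not Thrustmaster's code or anything T.A.R.G.E.T. does), zeroing values inside the deadzone and rescaling the rest so full deflection still reaches ±1.0.

def apply_deadzone(axis: float, deadzone: float = 0.05) -> float:
    # Ignore tiny movements around center, then rescale the remaining range.
    if abs(axis) < deadzone:
        return 0.0
    sign = 1.0 if axis > 0 else -1.0
    return sign * (abs(axis) - deadzone) / (1.0 - deadzone)

print(apply_deadzone(0.03))   # 0.0   (inside the deadzone, treated as centered)
print(apply_deadzone(0.50))   # ~0.47 (rescaled)
print(apply_deadzone(-1.0))   # -1.0  (full deflection preserved)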

Flight sim fans may lean on T.A.R.G.E.T. for preloading layouts built to approximate real flight controls, but as I’ve mentioned before, I fancy myself more of a Han Solo than an airline pilot, and the Thrustmaster F/A-18 Super Hornet helps fulfill those dogfighting dreams wonderfully - once you map controls.

If you’re using this and the Viper, games should switch to make the latter the secondary input, but it’s worth noting that I ran into some bother when certain games mapped functions to the Thrustmaster F/A-18 Super Hornet that it doesn’t actually have, leading to a bizarre endless spin in the likes of Elite Dangerous and Star Wars: Squadrons. Remapping buttons did the trick, so if you’re worried you’ve got a dodgy unit, rest assured it’s a minor issue.

That aside, both games feel great when using the Thrustmaster F/A-18 Super Hornet. (Intentional) barrel rolls and locking onto TIE Fighters became second nature, even without using the Viper, and I found myself tinkering with the inner chassis just to get things dialed in.

The AVA base can be opened up with ease, letting discerning pilots adjust resistance, travel, and more. It’s easily done and adds a level beyond simple button remapping that experts will no doubt have an awful lot of fun with.

In fact, the only real complaint is that in the heat of the moment, as I pulled back on the stick, I found the feet on the baseplate didn’t give quite as much resistance as I had hoped. That could be down to me and having a relatively smooth-feeling desk, but it’s something to consider.

Thrustmaster F/A-18 Super Hornet

(Image credit: Future)

Should I buy the Thrustmaster F/A-18 Super Hornet?

Buy it if...

You’re a flying enthusiast
The price of admission is high, but this weighty stick has everything you could need for just about any flying title.

You’re a tinkerer
Digital aviation experts will no doubt delight in customizing the inner workings of the Thrustmaster F/A-18 Super Hornet.

Don't buy it if...

You’re on a budget
Sadly, it’s not the cheapest stick around, making it likely to be out of reach for more casual flying fans.

Also consider...

Still not sold on the Thrustmaster F/A-18 Super Hornet? Here’s how it compares to two similar products.

Turtle Beach VelocityOne
As we mentioned in our Viper review, Turtle Beach’s VelocityOne is a slick stick (say that ten times, quickly), with an OLED display, a comfortable grip, and plenty of inputs. It’s not as weighty, though, which makes it feel less premium.

For more information, check out our full Turtle Beach VelocityOne review

Thrustmaster T Flight Hotas One
Our trusty fallback, the T Flight HOTAS remains a very comfortable stick with a throttle included, all for less than half the price of the F/A-18 Super Hornet. It works on Xbox One and Series X|S consoles as well.

For more information, check out our full T Flight Hotas One review

Thrustmaster F/A-18 Super Hornet

(Image credit: Future)

How I tested the Thrustmaster F/A-18 Super Hornet

  • Tested over a period of months
  • Used on a gaming PC with an RTX 4070 Ti
  • Tested with Elite Dangerous, Star Wars Squadrons, and Microsoft Flight Simulator

I’ve been testing the Thrustmaster F/A-18 Super Hornet for a couple of months, but the aforementioned remapping issues meant things took a little longer to settle than I’d care to admit.

Once that was sorted, though, it was off to the races (or airport?). I tested it with Microsoft Flight Simulator for some trans-Atlantic trips, but as I mentioned above, I mostly used it for dogfighting in Star Wars Squadrons or hauling space cargo in Elite Dangerous.

Read more about how we test

First reviewed February 2025

Nvidia GeForce RTX 5070 Ti review: nearly perfect, but with one major flaw
7:10 pm | February 20, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Nvidia GeForce RTX 5070 Ti: Two-minute review

The Nvidia GeForce RTX 5070 Ti definitely had a high bar to clear after the mixed reception of the Nvidia GeForce RTX 5080 last month, especially among enthusiasts.

And while there are things I fault the RTX 5070 Ti for, there's no doubt that it has taken the lead as the best graphics card most people can buy right now—assuming that scalpers don't get there first.

The RTX 5070 Ti starts at $749 / £729 (about AU$1,050), making its MSRP a good bit cheaper than the launch price of its predecessor, the Nvidia GeForce RTX 4070 Ti, as well as that of the buffed-up Nvidia GeForce RTX 4070 Ti Super.

The fact that the RTX 5070 Ti beats both of those cards handily in terms of performance would normally be enough to get it high marks, but this card even ekes out a win over the Nvidia GeForce RTX 4080 Super, shooting it nearly to the top of the best Nvidia graphics card lists.

As one of the best 4K graphics cards I've ever tested, it isn't without faults, but we're really only talking about the fact that Nvidia isn't releasing a Founders Edition card for this one, and that's unfortunate for a couple of reasons.

For one, and probably most importantly, without a Founders Edition card guaranteed to sell for MSRP directly from Nvidia's website, the MSRP for this card is just a suggestion. And without an MSRP card from Nvidia keeping AIB partners onside, it'll be hard to find a card at Nvidia's $749 price tag, which undercuts its value proposition.

Also, because there's no Founders Edition, Nvidia's dual pass-through design to keep the card cool will pass the 5070 Ti by. If you were hoping the RTX 5070 Ti might be SFF-friendly, I simply don't see how it fits into that category unless you stretch the meaning of small form factor until it hurts.

Those aren't small quibbles, but given everything else the RTX 5070 Ti brings to the table, it does feel like I'm stretching a bit to find something bad to say about this card for balance's sake.

For the vast majority of buyers out there looking for outstanding 4K performance at a relatively approachable MSRP, the Nvidia GeForce RTX 5070 Ti is the card you're going to want to buy.

Nvidia GeForce RTX 5070 Ti: Price & availability

The Nvidia GeForce RTX 5070 Ti sitting on its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $749/£729 (about AU$1,050), but with no Founders Edition, third-party cards will likely be higher
  • When can you get it? The RTX 5070 Ti goes on sale February 20, 2025
  • Where is it available? The RTX 5070 Ti will be available in the US, UK, and Australia at launch

The Nvidia GeForce RTX 5070 Ti goes on sale on February 20, 2025, starting at $749 / £729 (about AU$1,050) in the US, UK, and Australia.

Unlike the RTX 5090 and RTX 5080, there is no Founders Edition card for the RTX 5070 Ti, so there are no versions of this card guaranteed to sell at MSRP, which does complicate things given the scalping frenzy we've seen for the previous RTX 50 series cards.

While stock of the Founders Edition RTX 5090 and RTX 5080 might be hard to find even from Nvidia, there is a place, at least, where you could theoretically buy those cards at MSRP. No such luck with the RTX 5070 Ti, which is a shame.

The 5070 Ti's MSRP does at least come in under the launch MSRPs of both the RTX 4070 Ti and RTX 4070 Ti Super, neither of which had Founders Edition cards, so stock and pricing will hopefully stay within the range those cards have been selling in.

The 5070 Ti's MSRP puts it at the lower end of the enthusiast class, and while we haven't seen pricing for the AMD Radeon RX 9070 XT yet, it's unlikely that AMD's competing RDNA 4 GPU will sell for much less than the RTX 5070 Ti. Still, if you're not in a hurry, it might be worth waiting a month or two to see what AMD has to offer in this range before deciding which is the better buy.

  • Value: 4 / 5

Nvidia GeForce RTX 5070 Ti: Specs

A closeup of the power connector on the Nvidia GeForce RTX 5070 Ti

(Image credit: Future / John Loeffler)
  • GDDR7 VRAM and PCIe 5.0
  • Slight bump in power consumption
  • More memory than its direct predecessor

Like the rest of the Nvidia Blackwell GPU lineup, there are some notable advances with the RTX 5070 Ti over its predecessors.

First, the RTX 5070 Ti features faster GDDR7 memory and 4GB more VRAM than the RTX 4070 Ti's 12GB, and that larger, faster memory pool can chew through high-resolution textures more quickly, making it far more capable at 4K.

Also of note is its 256-bit memory interface, which is 33.3% wider than the RTX 4070 Ti's 192-bit bus and equal to that of the RTX 4070 Ti Super. An extra 64 bits might not seem like a lot, but just like trying to fit a couch through a door, an extra inch or two of clearance can be the difference between moving the whole thing through at once and having to do it in parts, which means additional work on both ends.
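
For anyone who wants the couch analogy in plain numbers, here's a minimal sketch of the bus-width comparison. The RTX 4070 Ti's 192-bit figure isn't quoted above, but it's the bus the 33.3% refers to.

```python
# Quick check on the bus-width comparison above.
rtx_4070_ti_bus_bits = 192
rtx_5070_ti_bus_bits = 256

width_gain = rtx_5070_ti_bus_bits / rtx_4070_ti_bus_bits - 1
print(f"bus width increase: {width_gain:.1%}")  # ~33.3%

# At the same per-pin data rate, memory bandwidth scales linearly with bus width,
# so the wider bus alone is worth roughly a third more throughput before GDDR7's
# higher clocks are even taken into account.
```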

The output ports on the Nvidia GeForce RTX 5070 Ti

(Image credit: Future / John Loeffler)

There's also the new PCIe 5.0 x16 interface, which speeds up communication between the graphics card, your processor, and your SSD. If you have a PCIe 5.0 capable motherboard, processor, and SSD, just make note of how many PCIe 5.0 lanes you have available.

The RTX 5070 Ti will take up 16 of them, so if you only have 16 lanes available and you have a PCIe 5.0 SSD, the RTX 5070 Ti is going to get those lanes by default, throttling your SSD to PCIe 4.0 speeds. Some motherboards will let you set PCIe 5.0 priority, if you have to make a choice.
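
For context on what that lane trade-off actually costs, here's a back-of-the-envelope sketch of approximate one-way PCIe throughput using the standard per-lane rates (32 GT/s for PCIe 5.0, 16 GT/s for PCIe 4.0). How the lanes really get split depends on your motherboard, so treat the x16/x4 splits here as illustrative.

```python
# Rough PCIe throughput arithmetic (one direction), for context on the lane-sharing point above.
# Per-lane raw rates and 128b/130b encoding overhead are standard PCIe figures.
def pcie_gbps(gen_gt_s: float, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for a link of `lanes` lanes."""
    encoding_efficiency = 128 / 130  # PCIe 3.0+ line encoding
    return gen_gt_s * encoding_efficiency * lanes / 8  # GT/s -> GB/s

print(f"PCIe 5.0 x16 (GPU):       {pcie_gbps(32.0, 16):.1f} GB/s")
print(f"PCIe 5.0 x4  (Gen5 SSD):  {pcie_gbps(32.0, 4):.1f} GB/s")
print(f"PCIe 4.0 x4  (throttled): {pcie_gbps(16.0, 4):.1f} GB/s")
```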

The RTX 5070 Ti uses slightly more power than its predecessors, but in my testing its maximum power draw came in at just under the card's 300W TDP.

As for the GPU inside the RTX 5070 Ti, it's built using TSMC's N4P process node, which is a refinement of the TSMC N4 node used by its predecessors. While not a full generational jump in process tech, the N4P process does offer better efficiency and a slight increase in transistor density.

  • Specs & features: 5 / 5

Nvidia GeForce RTX 5070 Ti: Design & features

The backplate of the Nvidia GeForce RTX 5070 Ti

(Image credit: Future / John Loeffler)
  • No Nvidia Founders Edition card
  • No dual-pass-through cooling (at least for now)

There is no Founders Edition card for the RTX 5070 Ti, so the RTX 5070 Ti you end up with may look radically different than the one I tested for this review, the Asus Prime GeForce RTX 5070 Ti.

Whatever partner card you choose though, it's likely to be a chonky card given the card's TDP, since 300W of heat needs a lot of cooling. While the RTX 5090 and RTX 5080 Founders Edition cards featured the innovative dual pass-through design (which dramatically shrank the card's width), it's unlikely you'll find any RTX 5070 Ti cards in the near future that feature this kind of cooling setup, if ever.

With that groundwork laid, you're going to have a lot of choice when it comes to cooling setups, shroud designs, and lighting, though more feature-rich cards will likely be more expensive, so make sure you factor in the added cost when weighing your options.

As for the Asus Prime GeForce RTX 5070 Ti, the card's sleek shroud lacks the RGB that a lot of gamers like for their builds, but for those of us who are kind of over RGB, the Prime's design is fantastic and will slot easily into any typical mid-tower case.

The Prime RTX 5070 Ti features a triple-fan cooling setup, with one of those fans having complete passthrough over the heatsink fins. There's a protective backplate and stainless bracket over the output ports.

The 16-pin power connector rests along the card's backplate, so even if you invested in a 90-degree angled power cable, you'll still be able to use it, assuming your power supply meets the recommended 750W listed on Asus's website. There's a 3-to-1 adapter included with the card, as well, for those who haven't upgraded to an ATX 3.0 PSU yet.
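
On the power-supply point, here's a rough budget sketch of how a 750W recommendation shakes out. Every figure other than the card's 300W TDP is an assumed placeholder for illustration, not a measurement from this review.

```python
# Rough power-budget sketch behind a 750W PSU recommendation; the CPU and platform
# figures below are illustrative assumptions, not measurements from this review.
components_w = {
    "RTX 5070 Ti (TDP)": 300,
    "high-end CPU under load (assumed)": 250,
    "motherboard, RAM, SSDs, fans (assumed)": 75,
}
steady_state = sum(components_w.values())
psu = 750

print(f"estimated steady-state draw: {steady_state} W")
print(f"headroom on a {psu} W PSU: {psu - steady_state} W (~{(psu - steady_state) / psu:.0%}) for transients")
```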

  • Design: 4 / 5

Nvidia GeForce RTX 5070 Ti: Performance

An Nvidia GeForce RTX 5070 Ti on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • RTX 4080 Super-level performance
  • Massive improvement over the RTX 4070 Ti Super
  • Added features like DLSS 4 with Multi-Frame Generation
A note on my data

The charts shown below offer the most recent data I have for the cards tested for this review. They may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

And so we come to the reason we're all here, which is this card's performance.

Given the...passionate...debate over the RTX 5080's underwhelming gen-on-gen uplift, enthusiasts will be very happy with the performance of the RTX 5070 Ti, at least as far as it relates to the last-gen RTX 4070 Ti and RTX 4070 Ti Super.

Starting with synthetic scores, at 1080p both the RTX 4070 Ti and RTX 5070 Ti are so overpowered that they come close to being CPU-limited in 3DMark's 1080p tests, Night Raid and Fire Strike, though the RTX 5070 Ti still comes out about 14% ahead. The RTX 5070 Ti begins to pull away at higher resolutions and once you introduce ray tracing into the mix, with roughly 30% better performance in heavier tests like Solar Bay, Steel Nomad, and Port Royal.

In terms of raw compute performance, the RTX 5070 Ti scores about 25% better in Geekbench 6 than the RTX 4070 Ti and about 20% better than the RTX 4070 Ti Super.

In creative workloads like Blender Benchmark 4.30, the RTX 5070 Ti pulls way ahead of its predecessors, though the 5070 Ti, 4070 Ti Super, and 4070 Ti all pretty much max out what a GPU can add to my Handbrake 1.9 4K to 1080p encoding test, with all three cards cranking out about 220 FPS encoded on average.

Starting with 1440p gaming, the gen-on-gen improvement of the RTX 5070 Ti over the RTX 4070 Ti is a respectable 20%, even without factoring in DLSS 4 with Multi-Frame Generation.

The biggest complaint that some have about MFG is that if the base frame rate isn't high enough, you'll end up with controls that can feel slightly sluggish, even though the visuals you're seeing are much more fluid.

Fortunately, outside of cranking ray tracing to its max settings and leaving Nvidia Reflex off, you're not really going to need to worry about that. In all but one of the games I tested at native 1440p with ray tracing, the RTX 5070 Ti's minimum FPS hit or exceeded 60, often by a lot.

Only F1 2024 had a minimum below 60 FPS at native 1440p with max ray tracing, and even then it stayed above 45 FPS, which is a high enough base frame rate that the added input latency isn't noticeable in practice. For 1440p gaming, then, there's no real reason not to turn on MFG whenever it's available, since it can substantially increase framerates, often doubling or even tripling them without issue.
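
To put the MFG trade-off in concrete terms, here's a simplified sketch: the displayed frame rate scales with the frame-generation factor, while input is still sampled at the rendered (base) frame rate. The 3x factor and the latency model are illustrative simplifications, not measured behavior.

```python
# Simplified illustration of the MFG trade-off: more frames shown, but responsiveness
# is still tied to the rendered (base) frame rate. Figures are illustrative only.
def mfg_summary(base_fps: float, mfg_factor: int) -> str:
    displayed_fps = base_fps * mfg_factor      # generated frames add visual fluidity
    base_frametime_ms = 1000.0 / base_fps      # input is sampled at the rendered rate
    return (f"base {base_fps:.0f} fps x MFG {mfg_factor}x -> "
            f"~{displayed_fps:.0f} fps shown, ~{base_frametime_ms:.1f} ms per rendered frame")

for base in (45, 60, 90):
    print(mfg_summary(base, mfg_factor=3))
```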

For 4K gaming, the RTX 5070 Ti native performance is spectacular, with nearly every title tested hitting 60 FPS or greater on average, with those that fell short only doing so by 4-5 frames.

Compared to the RTX 4070 Ti and RTX 4070 Ti Super, the faster memory and expanded 16GB VRAM pool definitely turn up for the RTX 5070 Ti at 4K, delivering about 31% better overall average FPS than the RTX 4070 Ti and about 23% better average FPS than the RTX 4070 Ti Super.

In fact, the average 4K performance for the RTX 5070 Ti pulls up pretty much dead even with the RTX 4080 Super's performance, and about 12% better than the AMD Radeon RX 7900 XTX at 4K, despite the latter having 8GB more VRAM.

Like every other graphics card besides the RTX 4090, RTX 5080, and RTX 5090, playing at native 4K with ray tracing maxed out is going to kill your FPS. To the 5070 Ti's credit, though, minimum FPS never dropped so low as to turn things into a slideshow, even if the 5070 Ti's 25 FPS minimum in Cyberpunk 2077 was noticeable.

Turning on DLSS in these cases is a must, even if you skip MFG, and the RTX 5070 Ti's balanced upscaling delivers a fantastic experience.

Leave ray tracing turned off (or set to a lower setting), however, and MFG definitely becomes a viable way to max out your 4K monitor's refresh rate for seriously fluid gaming.

Overall then, the RTX 5070 Ti delivers substantial high-resolution gains gen-on-gen, which should make enthusiasts happy, without having to increase its power consumption all that much.

Of all the graphics cards I've tested over the years, and especially over the past six months, the RTX 5070 Ti is pretty much the perfect balance for whatever you need it for, and if you can get it at MSRP or reasonably close to MSRP, it's without a doubt the best value for your money of any of the current crop of enthusiast graphics cards.

  • Performance: 5 / 5

Should you buy the Nvidia GeForce RTX 5070 Ti?

A masculine hand holding the Nvidia GeForce RTX 5070 Ti

(Image credit: Future / John Loeffler)

Buy the Nvidia GeForce RTX 5070 Ti if...

You want the perfect balance of 4K performance and price
Assuming you can find it at or close to MSRP, the 4K value proposition on this card is the best you'll find for an enthusiast graphics card.

You want a fantastic creative graphics card on the cheap
While the RTX 5070 Ti doesn't have the RTX 5090's creative chops, it's a fantastic pick for 3D modelers and video professionals looking for a (relatively) cheap GPU.

You want Nvidia's latest DLSS features without spending a fortune
While this isn't the first Nvidia graphics card to feature DLSS 4 with Multi Frame Generation, it is the cheapest, at least until the RTX 5070 launches in a month or so.

Don't buy it if...

You want the absolute best performance possible
The RTX 5070 Ti is a fantastic performer, but the RTX 5080, RTX 4090, and RTX 5090 all offer better raw performance if you're willing to pay more for it.

You're looking for something more affordable
While the RTX 5070 Ti has a fantastic price for an enthusiast-grade card, it's still very expensive, especially once scalpers get involved.

You only plan on playing at 1440p
If you never plan on playing at 4K this generation, you might want to see if the RTX 5070 or AMD Radeon RX 9070 XT and RX 9070 cards are a better fit.

Also consider

Nvidia GeForce RTX 5080
While more expensive, the RTX 5080 features fantastic performance and value for under a grand at MSRP.

Read the full Nvidia GeForce RTX 5080 review

Nvidia GeForce RTX 4080 Super
While this card might not be on the store shelves for much longer, the RTX 5070 Ti matches the RTX 4080 Super's performance, so if you can find the RTX 4080 Super at a solid discount, it might be the better pick.

Read the full Nvidia GeForce RTX 4080 Super review

How I tested the Nvidia GeForce RTX 5070 Ti

  • I spent about a week with the RTX 5070 Ti
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week testing the Nvidia GeForce RTX 5070 Ti, using it mostly for creative work and gaming, including titles like Indiana Jones and the Great Circle and Avowed.

I also used my updated suite of benchmarks including industry standards like 3DMark and Geekbench, as well as built-in gaming benchmarks like Cyberpunk 2077 and Dying Light 2.

I also test all of the competing cards in a given card's market class using the same test bench setup throughout so I can fully isolate GPU performance across various, repeatable tests. I then take geometric averages of the various test results (which better insulates the average from being skewed by tests with very large test results) to come to comparable scores for different aspects of the card's performance. I give more weight to gaming performance than creative or AI performance, and performance is given the most weight in how final scores are determined, followed closely by value.
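
As a quick illustration of why I lean on a geometric rather than arithmetic average, the sketch below, with entirely made-up numbers, shows how a single benchmark with a huge raw score skews the plain mean far more than the geometric one.

```python
# Minimal sketch of the scoring approach described above: a geometric mean is less
# sensitive than an arithmetic mean to one benchmark with a very large raw score.
import math

def geometric_mean(scores: list[float]) -> float:
    return math.exp(sum(math.log(s) for s in scores) / len(scores))

# Hypothetical results: one synthetic test with a very large raw number
results = [120.0, 95.0, 15000.0, 140.0]
print(f"arithmetic mean: {sum(results) / len(results):.1f}")
print(f"geometric mean:  {geometric_mean(results):.1f}")
```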

I've been testing GPUs, PCs, and laptops for TechRadar for nearly five years now, with more than two dozen graphics card reviews under my belt in the past three years alone. On top of that, I have a Master's degree in Computer Science and have been building PCs and gaming on PCs for most of my life, so I am well qualified to assess the value of a graphics card and whether it's worth your time and money.

  • Originally reviewed February 2025

Acer’s Predator Helios Neo 18 AI and 16 AI bring RTX 5070 Ti
9:07 pm | February 7, 2025

Author: admin | Category: Mobile phones news | Tags: , | Comments: Off

Acer used a Counter-Strike tournament in Katowice, Poland, as the debut stage for its two latest laptops - the Helios Neo 16 AI and Helios Neo 18 AI. The pair boasts up to the latest Intel Core Ultra 9 275H CPU, 64GB of RAM, 2TB of storage, and Nvidia's new RTX 5070 Ti laptop GPU, as well as high-speed OLED displays. Both also have the 5th gen AeroBlade 3D fans, vector heat pipes, and liquid metal thermal grease applied to the SoC. Starting with the Predator Helios Neo 18 AI, as the name suggests, it packs an 18-inch display - three 2560x1600px WQXGA models, and a 1920x1200px WUXGA one....

Nvidia GeForce RTX 5080 review: nearly RTX 4090 performance for a whole lot less
5:00 pm | January 29, 2025

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Nvidia GeForce RTX 5080: Two-minute review

At first glance, the Nvidia GeForce RTX 5080 doesn't seem like that much of an upgrade from the Nvidia GeForce RTX 4080 it is replacing, but that's only part of the story with this graphics card.

Its performance, to be clear, is unquestionably solid, positioning it as the third-best graphics card on the market right now by my testing, and its new PCIe 5.0 interface and GDDR7 VRAM further distance it from last generation's RTX 4080 and RTX 4080 Super. It also outpaces the best AMD graphics card, the AMD Radeon RX 7900 XTX, by a healthy margin, pretty much locking up the premium, enthusiast-grade GPU tier in Nvidia's corner for at least another generation.

Most impressively, it does this all for the same price as the Nvidia GeForce RTX 4080 Super and RX 7900 XTX: $999 / £939 / AU$2,019. This is also a rare instance where a graphics card launch price actually recedes from the high watermark set by its predecessor, as the RTX 5080 climbs down from the inflated price of the RTX 4080 when it launched back in 2022 for $1,199 / £1,189 / AU$2,219.

Then, of course, there's the new design of the card, which features a slimmer dual-slot profile, making it easier to fit into your case (even if the card's length remains unchanged). The dual flow-through fan cooling solution does wonders for managing the extra heat generated by the additional 40W of TDP, and while the 12VHPWR cable connector is still present, the 3-to-1 8-pin adapter is at least somewhat less ridiculous than the RTX 5090's 4-to-1 dongle.

The new card design also repositions the power connector itself to make it less cumbersome to plug a cable into the card, though it does pretty much rule out the 90-degree angled cables that gained popularity with the high-end RTX 40 series cards.

Finally, everything is built off of TSMC's 4nm N4 process node, making it one of the most cutting-edge GPUs on the market in terms of its architecture. While AMD and Intel will follow suit with their own 4nm GPUs soon (AMD RDNA 4 also uses TSMC's 4nm process node and is due to launch in March), right now, Nvidia is the only game in town for this latest hardware.

None of that would matter if the card didn't perform, but gamers and enthusiasts can rest assured that even without DLSS 4, you're getting a respectable upgrade. It might not have the wow factor of the beefier RTX 5090, but for gaming, creating, and even AI workloads, the Nvidia GeForce RTX 5080 is a spectacular balance of performance, price, and innovation that you won't find anywhere else at this level.

Nvidia GeForce RTX 5080: Price & availability

An RTX 5080 sitting on its retail packaging

(Image credit: Future)
  • How much is it? MSRP is $999 / £939 / AU$2,019
  • When can you get it? The RTX 5080 goes on sale January 30, 2025
  • Where is it available? The RTX 5080 will be available in the US, UK, and Australia at launch
Where to buy the RTX 5080

Looking to pick up the RTX 5080? Check out our Where to buy RTX 5080 live blog for updates to find stock in the US and UK.

The Nvidia GeForce RTX 5080 goes on sale on January 30, 2025, starting at $999 / £939 / AU$2,019 for the Founders Edition and select AIB partner cards, while overclocked (OC) and more feature-rich third-party cards will be priced higher.

This puts the Nvidia RTX 5080 about $200 / £200 / AU$200 cheaper than the launch price of the last-gen RTX 4080, while also matching the price of the RTX 4080 Super.

Both of those RTX 40 series GPUs should see some downward price pressure as a result of the RTX 5080's release, which might complicate the value proposition of the RTX 5080 compared to its last-gen siblings.

The RTX 5080 is also launching at the same MSRP as the AMD Radeon RX 7900 XTX, which is AMD's top GPU right now. And with AMD confirming that it does not intend to launch an enthusiast-grade RDNA 4 GPU this generation, the RTX 5080's only real competition is from other Nvidia graphics cards like the RTX 4080 Super or RTX 5090.

This makes the RTX 5080 a great value proposition for those looking to buy a premium 4K graphics card, as its price-to-performance ratio is very strong.

  • Value: 4 / 5

Nvidia GeForce RTX 5080: Specs & features

A masculine hand holding an Nvidia GeForce RTX 5080 showing off the power connector

(Image credit: Future)
  • GDDR7 VRAM and PCIe 5.0
  • Still just 16GB VRAM
  • Slightly higher 360W TDP

While the Nvidia RTX 5080 doesn't push the spec envelope quite as far as the RTX 5090 does, its spec sheet is still impressive.

For starters, like the RTX 5090, the RTX 5080 uses the faster, next-gen PCIe 5.0 interface that allows for faster data processing and coordination with the CPU, which translates directly into higher performance.

You also have new GDDR7 VRAM in the RTX 5080, only the second card to have it after the RTX 5090, and it dramatically increases the memory bandwidth and speed of the RTX 5080 compared to the RTX 4080 and RTX 4080 Super. Those latter two cards both use slower GDDR6X memory, so even though all three cards have the same amount of memory (16GB) and memory bus-width (256-bit), the RTX 5080 has a >25% faster effective memory speed of 30Gbps, compared to the 23Gbps of the RTX 4080 Super and the 22.4Gbps on the base RTX 4080.
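
Since all three cards share the same 256-bit bus, those quoted data rates translate directly into memory bandwidth; here's the quick arithmetic.

```python
# Effective memory bandwidth from the data rates quoted above (all on a 256-bit bus):
# bandwidth (GB/s) = bus width (bits) x effective data rate (Gbps) / 8
bus_bits = 256
rates_gbps = {
    "RTX 5080 (GDDR7)": 30.0,
    "RTX 4080 Super (GDDR6X)": 23.0,
    "RTX 4080 (GDDR6X)": 22.4,
}

for name, rate in rates_gbps.items():
    bandwidth = bus_bits * rate / 8
    print(f"{name}: {bandwidth:.0f} GB/s")  # ~960 vs ~736 vs ~717 GB/s
```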

This is all on top of the Blackwell GPU inside the card, which is built on TSMC's 4nm process, compared to the Lovelace GPUs in the RTX 4080 and 4080 Super, which use TSMC's 5nm process. So even though the transistor count on the RTX 5080 is slightly lower than its predecessor's, the smaller transistors are faster and more efficient.

The RTX 5080 also has a higher SM count, 84, compared to the RTX 4080's 76 and the RTX 4080 Super's 80, meaning the RTX 5080 has the commensurate increase in shader cores, ray tracing cores, and Tensor cores. It also has a slightly faster boost clock (2,617MHz) than its predecessor and the 4080 Super variant.
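
For reference, both Ada Lovelace and Blackwell pack 128 shader (CUDA) cores per SM, so those SM counts map straight onto core counts; a quick sketch of that arithmetic below.

```python
# CUDA core counts implied by the SM figures above
# (128 shader cores per SM on both Ada Lovelace and Blackwell).
cores_per_sm = 128
sm_counts = {"RTX 5080": 84, "RTX 4080 Super": 80, "RTX 4080": 76}

for name, sms in sm_counts.items():
    print(f"{name}: {sms} SMs -> {sms * cores_per_sm} CUDA cores")
```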

Finally, there is a slight increase in the card's TDP, 360W compared to the RTX 4080 and RTX 4080 Super's 320W.

  • Specs & features: 4.5 / 5

Nvidia GeForce RTX 5080: Design

An Nvidia GeForce RTX 5080 leaning against its retail packaging with the RTX 5080 logo visible

(Image credit: Future)
  • Slimmer dual-slot form factor
  • Dual flow-through cooling system

The redesign of the Nvidia RTX 5080 is identical to that of the RTX 5090, featuring the same slimmed-down dual slot profile as Nvidia's flagship card.

If I were to guess, the redesign isn't as essential for the RTX 5080 as it is for the RTX 5090, which needed a way to cool its much hotter 575W TDP; the RTX 5080 (and eventually the RTX 5070) simply slotted into the new design by default.

That said, it's still a fantastic change, especially as it makes the RTX 5080 thinner and lighter than its predecessor.

The dual flow through cooling system on the Nvidia GeForce RTX 5080

(Image credit: Future)

The core of the redesign is the new dual flow-through cooling solution, which uses an innovative three-part PCB inside to open up a gap at the front of the card, allowing a second fan to blow cooler air over the heat sink fins drawing heat away from the GPU.

A view of the comparative slot width of the Nvidia GeForce RTX 5080 and RTX 4080

(Image credit: Future)

This means that you don't need as thick of a heat sink to pull away heat, which allows the card itself to get the same thermal performance from a thinner form factor, moving from the triple-slot RTX 4080 design down to a dual-slot RTX 5080. In practice, this also allows for a slight increase in the card's TDP, giving the card a bit of a performance boost as well, just from implementing a dual flow-through design.

Given that fact, I would not be surprised if other card makers follow suit, and we start getting much slimmer graphics cards as a result.

A masculine hand holding an Nvidia GeForce RTX 5080 showing off the power connector

(Image credit: Future)

The only other design choice of note is the 90-degree turn of the 16-pin power port, which should make it easier to plug the 12VHPWR connector into the card. The RTX 4080 didn't suffer nearly the same kinds of issues with its power connectors as the RTX 4090 did, so this design choice really flows down from engineers trying to fix potential problems with the much more power hungry 5090. But, if you're going to implement it for your flagship card, you might as well put it on all of the Founders Edition cards.

Unfortunately, this redesign means that if you invested in a 90-degree-angled 12VHPWR cable, it won't work on the RTX 5080 Founders Edition, though third-party partner cards will have a lot of different designs, so you should be able to find one that fits your cable situation.

  • Design: 4.5 / 5

Nvidia GeForce RTX 5080: Performance

An Nvidia GeForce RTX 5080 slotted and running on a test bench

(Image credit: Future)
  • Excellent all-around performance
  • Moderately more powerful than the RTX 4080 and RTX 4080 Super, but nearly as fast as the RTX 4090 in gaming
  • You'll need DLSS 4 to get the best results
A note on my data

The charts shown below are the most recent test data I have for the cards tested for this review and may change over time as more card results are added and cards are retested. The 'average of all cards tested' includes cards not shown in these charts for readability purposes.

A note on the RTX 4080 Super

In my testing for this review, the RTX 4080 Super scored consistently lower than it has in the past, which I believe is an issue with my card specifically that isn't reflective of its actual performance. I'm including the data from the RTX 4080 Super for transparency's sake, but I wouldn't take these numbers as-is. I'll be retesting the RTX 4080 Super soon, and will update my data with new scores once I've troubleshot the issue.

Performance is king, though, and naturally all the redesign and spec bumps won't amount to much if the RTX 5080 doesn't deliver better results. Fortunately it does, though maybe not by as much as some enthusiasts would like.

Overall, the RTX 5080 manages to score about 13% better than the RTX 4080 and about 19% better than the AMD Radeon RX 7900 XTX, a result that will disappoint some (especially after seeing the 20-25% uplift on the RTX 5090) who were hoping for something closer to 20% or better.

If we were just to go off those numbers, some might call them disappointing, regardless of all the other improvements to the RTX 5080 in terms of design and specs. All this needs to be put in a broader context though, because my perspective changed once I compared the RTX 5080 to the RTX 4090.

Overall, the RTX 5080 is within 12% of the overall performance of the RTX 4090, and within 9% of the RTX 4090's gaming performance, which is a hell of a thing and simply can't be ignored, even by enthusiasts.

Starting with synthetic benchmarks, the card scores about 13% better than the RTX 4080 and RX 7900 XTX, with the RTX 5080 consistently beating out the RTX 4080 and substantially beating the RX 7900 XTX in ray-traced workloads (though the RX 7900 XTX does pull down a slightly better average 1080p rasterization score, to its credit).

Compared to the RTX 4090, the RTX 5080 comes in at about 15% slower on average, with its worst performance coming at lower resolutions. At 4K, though, the RTX 5080 comes in just 7% slower than the last-gen flagship.

In terms of compute performance, the RTX 5080 trounces the RX 7900 XTX, as expected, by about 38%, with a more modest 9% improvement over the RTX 4080. Against the RTX 4090, however, the RTX 5080 comes within just 5% of the RTX 4090's Geekbench compute scores. If you're looking for a cheap AI card, the RTX 5080 is definitely going to be your jam.

On the creative side, PugetBench for Creators' Adobe Photoshop benchmark still isn't working with the RTX 5080, so I can't tell you much about its creative raster performance yet (though I will update these charts once that issue is fixed), but going off the 3D modeling and video editing scores, the RTX 5080 is an impressive GPU, as expected.

The entire 3D modeling industry is effectively built on Nvidia's CUDA, so against the RTX 5080, the RX 7900 XTX doesn't stand a chance as the 5080 more than doubles the RX 7900 XTX's Blender Benchmark performance. Gen-on-gen though, the RTX 5080 comes in with about 8% better performance.

Against the RTX 4090, the RTX 5080 comes within 15% of its performance, and for good measure, if you're rocking an RTX 3090 and curious about the RTX 5080, it outperforms the RTX 3090 by about 75% in Blender Benchmark. If you're on an RTX 3090 and want to upgrade, you'll probably still be better off with an RTX 4090, but if you can't find one, the RTX 5080 is a great alternative.

In terms of video editing performance, the RTX 5080 doesn't do as well as its predecessor in PugetBench for Creators Adobe Premiere and effectively ties in my Handbrake 4K to 1080p encoding test. I expect that once the RTX 5080 launches, Puget Systems will be able to update its tools for the new RTX 50 series, so these scores will likely change, but for now, it is what it is, and you're not going to see much difference in your video editing workflows with this card over its predecessor.

An Nvidia GeForce RTX 5080 slotted into a motherboard

(Image credit: Future)

The RTX 5080 is Nvidia's premium "gaming" card, though, so its gaming performance is what's going to matter to the vast majority of buyers out there. For that, you won't be disappointed. Working just off DLSS 3 with no frame generation, the RTX 5080 will get you noticeably improved framerates gen-on-gen at 1440p and 4K, with substantially better minimum/1% framerates as well for smoother gameplay. Turn on DLSS 4 with Multi-Frame Generation and the RTX 5080 does even better, blowing well past the RTX 4090 in some titles.

DLSS 4 with Multi-Frame Generation is game developer-dependent, however, so even though this is the flagship gaming feature for this generation of Nvidia GPUs, not every game will feature it. For testing purposes, then, I stick to DLSS 3 without Frame Generation (and the AMD and Intel equivalents, where appropriate), since this allows for a more apples-to-apples comparison between cards.

At 1440p, the RTX 5080 gets about 13% better average fps and minimum/1% fps overall, with up to 18% better ray tracing performance. Turn on DLSS 3 to balanced and ray tracing to its highest settings and the RTX 5080 gets you about 9% better average fps than its predecessor, but a massive 58% higher minimum/1% fps, on average.

Compared to the RTX 4090, the RTX 5080's average 1440p fps comes within 7% of the RTX 4090's, and within 2% of its minimum/1% fps, on average. In native ray-tracing performance, the RTX 5080 slips to within 14% of the RTX 4090's average fps and within 11% of its minimum/1% performance. Turn on balanced upscaling, however, and everything changes, with the RTX 5080 coming within just 6% of the RTX 4090's ray-traced upscaled average fps and beating the RTX 4090's minimum/1% fps average by almost 40%.

Cranking things up to 4K, the RTX 5080's lead over the RTX 4080 grows a good bit. With no ray tracing or upscaling, the RTX 5080 gets about 20% faster average fps and minimum/1% fps than the RTX 4080 overall. Its native ray tracing performance is about the same, however, and its minimum/1% fps average actually falls behind the RTX 4080's, both with and without DLSS 3.

Against the RTX 4090, the RTX 5080 comes within 12% of its average fps and within 8% of its minimum/1% performance without ray tracing or upscaling. It falls behind considerably in native 4K ray tracing performance (which is to be expected, given the substantially higher RT core count for the RTX 4090), but when using DLSS 3, that ray tracing advantage is cut substantially and the RTX 5080 manages to come within 14% of the RTX 4090's average fps, and within 12% of its minimum/1% fps overall.

Taken together, the RTX 5080 makes some major strides toward RTX 4090 performance across the board, closing a little more than half of the gap between the RTX 4080 and RTX 4090.

The RTX 5080 beats its predecessor by just over 13% overall and comes within 12% of the RTX 4090's overall performance, all while costing less than both RTX 40 series cards' launch MSRPs, making it an incredible value for a premium card to boot.

  • Performance: 4 / 5

Should you buy the Nvidia GeForce RTX 5080?

A masculine hand holding up an Nvidia GeForce RTX 5080 against a green background

(Image credit: Future)

Buy the Nvidia GeForce RTX 5080 if...

You want fantastic performance for the price
You're getting close to RTX 4090 performance for under a grand (or just over two, if you're in Australia) at MSRP.

You want to game at 4K
This card's 4K gaming performance is fantastic, coming within 12-14% of the RTX 4090's in a lot of games.

You're not willing to make the jump to an RTX 5090
The RTX 5090 is an absolute beast of a GPU, but even at its MSRP, it's double the price of the RTX 5080, so you're right to wonder if it's worth making the jump to the next tier up.

Don't buy it if...

You want the absolute best performance possible
The RTX 5080 comes within striking distance of the RTX 4090 in terms of performance, but it doesn't actually get there, much less reaching the vaunted heights of the RTX 5090.

You're looking for something more affordable
At this price, it's an approachable premium graphics card, but it's still a premium GPU, and the RTX 5070 Ti and RTX 5070 are just around the corner.

You only plan on playing at 1440p
While this card is great for 1440p gaming, it's frankly overkill for that resolution. You'll be better off with the RTX 5070 Ti if all you want is 1440p.

Also consider

Nvidia GeForce RTX 4090
With the release of the RTX 5090, the RTX 4090 should see its price come down quite a bit, and if scalpers drive up the price of the RTX 5080, the RTX 4090 might be a better bet.

Read the full Nvidia GeForce RTX 4090 review

Nvidia GeForce RTX 5090
Yes, it's double the price of the RTX 5080, and that's going to be a hard leap for a lot of folks, but if you want the best performance out there, this is it.

Read the full Nvidia GeForce RTX 5090 review

How I tested the Nvidia GeForce RTX 5080

  • I spent about a week and a half with the RTX 5080
  • I used my complete GPU testing suite to analyze the card's performance
  • I tested the card in everyday, gaming, creative, and AI workload usage
Test System Specs

Here are the specs on the system I used for testing:

Motherboard: ASRock Z790i Lightning WiFi
CPU: Intel Core i9-14900K
CPU Cooler: Gigabyte Aorus Waterforce II 360 ICE
RAM: Corsair Dominator DDR5-6600 (2 x 16GB)
SSD: Crucial T705
PSU: Thermaltake Toughpower PF3 1050W Platinum
Case: Praxis Wetbench

I spent about a week testing the RTX 5080, using my updated suite of benchmarks like Black Myth Wukong, 3DMark Steel Nomad, and more.

I also used this card as my primary work GPU where I relied on it for photo editing and design work, while also testing out a number of games on it like Cyberpunk 2077, Black Myth Wukong, and others.

I've been testing graphics cards for TechRadar for a couple of years now, with more than two dozen GPU reviews under my belt. I've extensively tested and retested all of the graphics cards discussed in this review, so I'm intimately familiar with their performance. This gives me the best possible position to judge the merits of the RTX 5080, and whether it's the best graphics card for your needs and budget.

  • Originally reviewed January 2025