ZimaBlade review
9:35 am | May 15, 2024


The ZimaBlade single-board computer looks surprisingly similar to an old-school portable cassette player. 

Specifications

CPU: Entry-level Dual-Core N3350, High-Performance Quad-Core J3455

GPU: Entry-level 650MHz, High-performance 750MHz, 12 execution units

RAM: Upgradable up to 16GB of DDR3L, none supplied in the box

FLASH: 32GB eMMC

USB: 1 x Type-C, 1 x USB3.0, 2 x internal USB2.0

Display: 1 x 4K MiniDP, 1 x DP over Type-C, 1 x eDP internal

PCIe: 1 x PCIe 2.0 x4

SATA: 2 x SATA3.0

Ethernet: 1 x Gigabit LAN

Power Consumption: About 15W

Size: 107mm x 80mm x 32mm

It competes with the Raspberry Pi 4, being in the same price bracket while offering an Intel x86 architecture. The SBC has plenty of connectors, which makes this hacker-friendly platform versatile and unique. The built-in PCIe 2.0 x4 connector accepts various cards out-of-the-box, and with two SATA3 ports, the board can morph into a portable NAS storage device.

Since the ZimaBlade supports up to 16GB of DDR3L, it can run applications requiring large amounts of memory, such as databases and VMs. The main let-down is the outdated CPU, with the speediest version of the board based on a Quad-Core 2.3GHz Apollo Lake CPU. The SBC features a single USB Type-C, which supplies power and drives a DisplayPort output.

IceWhale Technology, the maker of the ZimaBlade, ran a Crowd Supply campaign to finance the board's new version. Various perks are available; the most basic, featuring a Dual-Core Intel Celeron N3350, sells for $64. The ZimaBlade 7700, built around a Quad-Core J3455 CPU, sells for $96. Except for the CPU, both have the same hardware and require a DDR3L memory module to boot.

ZimaBlade front view.

(Image credit: Future)

ZimaBlade: Design

The ZimaBlade computer comes with a male-to-male Type-C cable and one SATA cable. The passively cooled unit measures 107mm x 80mm x 32mm and weighs 175g. The small case sits perfectly flat on a desk, with no mounting holes and only four tiny rubber pads on the bottom. Because the unit is so light, connecting several cables can become problematic, as the case topples easily.

The ZimaBlade designers have worked hard to produce an enclosure that showcases the computer's internal components. A transparent plastic top displays the SODIMM memory but not the CPU. With no power button available, the hardware turns on as soon as a Type-C cable is plugged in. A single status LED, barely visible from the side of the case, indicates whether the board is powered. The PCIe socket location does not allow easy card insertion, and a card's metal bracket has to be removed before use.

Under the hood, the highest-performance variant of the ZimaBlade sports a J3455 quad-core Intel Celeron CPU clocked at 2.3GHz. Geekbench shows the ZimaBlade handily outperforms the Cortex-A72 ARM CPU found in the Pi 4 but scores well below the new Pi 5's Cortex-A76 CPU. One feature not found on similarly priced platforms is the ability to expand the memory to 16GB using a DDR3L SODIMM.

The ZimaBlade targets an audience that craves high-speed interfaces. Seven connectors provide connectivity for many use cases with throughputs above the gigabit mark. Two SATA 6Gb/s ports and one Gigabit Ethernet socket turn the ZimaBlade into a redundant storage server. One USB 3.0 port, a USB Type-C with DP, and a mini-DP connector capable of 4K at 60Hz complete the list of external ports. Three internal connectors, two USB 2.0 and one eDP, allow additional peripherals.

ZimaBlade side view.

(Image credit: Future)

ZimaBlade: In Use

The owner can use the ZimaBlade simply by plugging a USB Type-C cable into a screen that supports display input over Type-C. The computer then boots CasaOS, a lightweight cloud-accessible platform with an ever-increasing number of applications. The ZimaBlade is extremely fast at booting, taking just five seconds to display the Linux login.

After entering the default username and password, the user has root access to the Linux-based OS stored on the 32GB eMMC, with 24GB left for user applications. The lean OS means a low 20% RAM utilization with an 8GB memory module. With the Gigabit LAN connected, software updates run automatically and keep the platform secure.
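For readers who want to check those storage and memory figures on their own unit, here is a minimal sketch using Python's third-party psutil package over SSH; psutil is an assumption on our part and is not part of CasaOS, so it would need to be installed first.

```python
# Minimal sketch: report eMMC and RAM utilization on the ZimaBlade's Linux OS.
# Assumption: Python 3 and the third-party psutil package are installed
# (e.g. via "pip install psutil" over SSH); psutil is not bundled with CasaOS.
import psutil

def report_usage(mount_point: str = "/") -> None:
    disk = psutil.disk_usage(mount_point)  # root filesystem on the 32GB eMMC
    mem = psutil.virtual_memory()          # DDR3L SODIMM

    print(f"Disk: {disk.used / 1e9:.1f} GB used of {disk.total / 1e9:.1f} GB "
          f"({disk.percent:.0f}% used, {disk.free / 1e9:.1f} GB free)")
    print(f"RAM:  {mem.used / 1e9:.1f} GB used of {mem.total / 1e9:.1f} GB "
          f"({mem.percent:.0f}% utilization)")

if __name__ == "__main__":
    report_usage()
```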

In addition to being affordable, the ZimaBlade builds on a user-friendly OS where the UI is viewed entirely through a web browser. This cloud concept could have been a miss, but thanks to modern technologies like Docker containers, using the desktop is very snappy. The installed software includes a file explorer and an app store containing forty applications ranging from a retro emulator to home automation. 

Running Geekbench 6 on the ZimaBlade involves installing it through SSH. The board's power consumption reaches 15W under load, with the case becoming hot at more than 60 degrees Celsius, and drops to 6W when idle. With a score of 300 in single-core and 911 in multi-core on Geekbench 6, the J3455 CPU won't blow you away with its computing prowess, but it will be sufficient for everyday basic tasks.
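As a quick sanity check, those two scores imply the four cores deliver roughly three quarters of perfect scaling; the arithmetic, using only the numbers quoted above, is sketched below.

```python
# Quick arithmetic on the Geekbench 6 scores quoted above for the J3455.
single_core = 300
multi_core = 911
cores = 4

speedup = multi_core / single_core    # ~3.0x over a single core
efficiency = speedup / cores          # ~0.76, i.e. about 76% of a perfect 4x

print(f"Multi-core speedup: {speedup:.2f}x")
print(f"Scaling efficiency: {efficiency:.0%}")
```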

ZimaBlade top view.

(Image credit: Future)

ZimaBlade: The competition

Thanks to the ZimaBlade, finding an affordable x86 single-board computer with lots of connectivity and expandable memory has become easier. Hardkernel's Odroid H3+ is very similar to the ZimaBlade, being passively cooled and offering various high-speed connectors, but it costs more than twice as much, is bigger with an oversized heatsink, and consumes more power. The quad-lane PCIe connector on the ZimaBlade also makes it a valuable testbed for PCIe cards, something the Odroid H3+ lacks.

ZimaBlade: Final verdict

IceWhale's ZimaBlade makes a tremendous entry-level computer with many options for adding extra hardware. The PCIe slot is the product's standout feature, allowing the use of high-end gaming graphics cards, for example. The single SODIMM socket gives the user an upgrade path to more memory. The onboard eMMC storage turns the unit into a self-contained product. Finally, a price below $100 tilts the balance, making the ZimaBlade a must-have gadget this year.

We've listed the best business computers.

The Snapdragon 8 Gen 4 to have impressive GPU performance
4:00 am | May 12, 2024


According to Digital Chat Station, a well-known tipster on Weibo, Qualcomm's upcoming flagship SoC, the Snapdragon 8 Gen 4, will likely impress with exceptional graphics performance. The post is quite vague but gives an example with the GPU-intensive game Genshin Impact. The report claims the new Adreno GPU will run the title smoothly at native 1080p resolution. The game is quite taxing even for modern high-end chipsets, and users currently need to lower the resolution for it to run smoothly. Additionally, the development of the SoC is going ahead of schedule, so Qualcomm is targeting a release...

Details for Snapdragon X Plus leak: 10-core CPU, same GPU and NPU
3:45 pm | April 24, 2024


The Snapdragon X Elite is so far the only product in Qualcomm’s new Windows-focused line of chipsets, but that won’t be forever. Details of a Plus model have surfaced, but before we get to those, we have to break down the Elite specs first. The Snapdragon X Elite name is actually used for three chips. All have 12 CPU cores, 42MB of cache and an NPU that delivers 45 TOPS of performance. However, the CPU and GPU are clocked differently. Check the table below for more details on the three versions. The Snapdragon X Plus is similar to the lowest of the Elite versions, except that it has 10...


Sony PlayStation 5 Pro is coming with better GPU, more memory bandwidth
8:01 pm | April 15, 2024


Last month a rumor did the rounds claiming a PlayStation 5 Pro was coming, and today a new report confirms this, while adding some more details about the console. Sony is apparently already asking developers to ensure their games are compatible with the upcoming PS5 Pro, highlighting a focus on improving ray tracing. The PS5 Pro is allegedly codenamed Trinity, and it's said to have a more powerful GPU and a slightly faster CPU mode. The changes in the Pro model are all supposed to make the console more capable of rendering games with ray tracing enabled, or hitting higher resolutions and...

PlayStation 5 Pro rumored to beef up GPU and ray tracing, bring AI acceleration
4:17 pm | March 18, 2024


The PlayStation 5 launched in late 2020, though it feels like it arrived later due to supply issues. A Pro model will reportedly arrive four years later with a much improved GPU, AI acceleration and other enhancements. The GPU will be the biggest upgrade on the PS5 Pro. Rumors claim up to 45% higher rasterization performance and 33.5 TFLOPs of compute power. Future SDK versions will support resolutions up to 8K and higher frame rates, with 4K @ 120fps and 8K @ 60fps being possible. Ray tracing performance is set to improve 2-3 times, even 4 times on some occasions. This is thanks to a...

Asus ROG G22CH review: the Intel NUC Extreme lives on, at least in spirit
7:00 pm | March 1, 2024


Asus ROG G22CH: One-minute review

As chipsets get smaller and more efficient, the past handful of years have seen a rise in smaller-form gaming PCs like the Asus ROG G22CH. 

Not only are they non-intrusive compared to the biggest and best gaming PCs, but they have a nice amount of portability as well. Most importantly, clever cooling and component management allow them to pack a nice performance punch at the cost of real upgradability. 

In the case of the ROG G22CH, the rig looks like a horizontally wider version of the Xbox Series X. There’s a sleek all-black look that’s accented by some nice angles with customizable RGB lighting. With that said, the performance specs are also miles ahead of a similar console. 

The ROG G22CH has an Intel Core i9-13900K CPU, an Nvidia GeForce RTX 4070 GPU, 32GB of DDR5 RAM, and a 1TB SSD. That's more than enough for some solid native 1440p gaming with the ability to reach 4K through DLSS upscaling.

Starting at $1,399.99 in the US (about £1,120/AU$1,960), it can get expensive pretty quickly as you increase the specs, with UK and Australian buyers more restricted in the kinds of configurations they can buy.

This is a bit of an issue since upgradability down the line is likely going to be a problem due to the extremely tight chassis. When packing so much performance into such a small rig, efficient cooling is a must. There are two cooling options, air and liquid, but both are loud during intensive tasks.

That said, potential buyers looking for a small-form gaming desktop should definitely keep the Asus ROG G22CH in mind, since it's one of the few available on the market now that Intel has retired its NUC Extreme line. Beyond its pretty aggressive styling, its performance prowess is where it matters the most, and it excels in this regard. The gaming desktop can run all of the most popular esports games, such as Fortnite and Valorant, at high frame rates while handling more visually demanding games like Alan Wake 2 without much fuss. If cost and upgradability are a problem, it might be best to go with a gaming rig that has a bigger case.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Price & availability

  •  How much does it cost? Configurations range from $1,399 to $2,499
  •  When is it available? It is available now in the US, UK, and Australia
  •  Where can you get it? From various stores depending on territory

The Asus ROG G22CH is relatively expensive regardless of what configuration one has. For gamers looking for some solid 1080p gaming, the $1,399 option comes with an Intel Core i5-13400F, Nvidia GeForce RTX 3060, 16GB DDR5 RAM, and a 512GB SSD. 

That’s definitely a solid choice for anyone looking to play some of the bigger esports games like Fortnite, Rocket League, Call of Duty, or Valorant. Our review configuration came to about $2,299 and for $200 more users can pump up to the Intel Core i9-14900KF, though this isn't necessarily a huge jump in CPU power. 

When it comes to the UK, there's only one option available, which includes an Intel Core i7, an Nvidia GeForce RTX 4070, 16GB RAM, and a 2TB SSD for £2,099. Australian buyers have two configurations they can buy. Both have an Nvidia GeForce RTX 4070, 32GB of DDR5, and a 1TB SSD, but for AU$4,699 you can get an Intel Core i7-14700F configuration, or for AU$4,999 you can get an Intel Core i9-14900KF system.

For good measure, there’s even an included mouse and keyboard that comes packed in with all configurations. Serious gamers will probably want to check out the best gaming keyboard and best gaming mouse options though, as the stock peripherals aren't spectacular.

Small-form PC gaming rigs are usually expensive and naturally face issues when it comes to upgrading. However, the Acer Predator Orion 3000 is the most approachable price-wise, and its lowest configuration is a bit more powerful than the ROG G22CH. Meanwhile, if performance is the main concern regardless of money, the Origin Chronos V3 offers a little upgrade wiggle room, while the Corsair One i300 has the best form factor.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Specs

 The Asus ROG G22CH currently comes in a variety of customizable configurations.  

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Design

  • The case measures 4.53 x 12.72 x 11.30 inches and weighs 18.52 lbs
  • An all-black design is accented with two strips of RGB lighting    
  • There's not much room for GPU upgrading

Balancing form and functionality is the most important attribute of a small-form gaming PC, and the Asus ROG G22CH does a solid job with both. When it comes to design, there's much to appreciate in terms of the all-black chassis. Having two vertical strips of customizable RGB lighting on the front panel does lend the rig some personality.

There’s one small stripe on the upper left side and a longer one on the lower right side. Between them is an angular cut alongside the ROG logo. When it comes to ventilation, there’s some form of it on all sides of the ROG G22CH.  Just looking from the front panel, the overall design is really sleek and could give the Xbox Series X a run for its money.

Image gallery (3 images): An Asus ROG G22CH on a desk (Image credit: Future / John Loeffler)

There are plenty of ports available as well. The top IO panel features two USB-A ports alongside a singular USB-C, a 3.5mm combo jack, and a power button. Unfortunately, that USB-C port is the only one available on this PC. On the back are four USB-A split between 2.0 and 3.2, three audio jacks, and a gigabit Ethernet port. That should be more than enough for most PC gamers and creatives though.

Though upgradability will be tough, the ROG G22CH does make the process somewhat easier. Thanks to a tool-free design, a sliding latch allows both side panels and the top to be lifted off to access the interior. Having that ability without using screws helps a lot, but outside of the RAM and SSD, fitting a large GPU or swapping out the motherboard in the future is going to be difficult, if not impossible.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Performance

  • 1440p performance is spectacular  
  • DLSS can do 4K when needed  
  • Fans will run at full volume   
Benchmarks

Here's how the Asus ROG G22CH performed in our series of benchmarks:

3DMark Speed Way: 4,404; Fire Strike: 34,340; Time Spy: 17,500
GeekBench 6 (single-core): 2,866; (multi-core): 17,650
Total War: Warhammer III (1080p, Ultra): 137 fps; (1080p, Low): 343 fps
Cyberpunk 2077 (1080p, Ultra): 123 fps; (1080p, Low): 162 fps
Dirt 5 (1080p, Ultra): 173 fps; (1080p, Low): 283 fps

Outside of gaming, the Asus ROG G22CH is a phenomenal workhorse for various general and creative tasks. Browsing in Google Chrome while listening to high-fidelity music through Tidal was a perfectly smooth working experience.

Using Adobe Suite worked totally fine on the G22CH as well. Photoshop was able to handle multiple-layer projects with incredibly high-resolution photos without issue. Editing videos through Premiere Pro allowed easy editing of 4K videos with speedy export times. 

That said, this is a gaming desktop, and it's in gaming where the G22CH really shines.

When it comes to handling the top tier of high-fidelity visuals in gaming, the G22CH can handle Cyberpunk 2077, Red Dead Redemption II, Alan Wake II, and the like at native 1440p at high frame rates without breaking a sweat. Our Cyberpunk 2077 tests produced an average 123 fps on Ultra settings at 1080p. Bumping to 1440p with path tracing placed frame rates in the high 90s. Having everything turned to the max in settings allowed Alan Wake II to run in the high 60s. 

If you want to go up to 4K, you're definitely going to have to rely on Nvidia's DLSS technology, but it's possible with the right settings tweaks.

When it comes to high esports-level performance, users can enjoy a serious edge over the competition. Games like Call of Duty: Warzone, Valorant, Counter-Strike 2, and Fortnite were able to pump out frame rates well over 100 fps on high settings, which is more than enough for the best gaming monitors. For more competitive settings, it's easy enough to reach past 200 fps.

Just understand that users will know when the G22CH is being pushed to the limit. When playing rounds of Helldivers 2 and Alan Wake II, the noise from the PC's fans reached around the low 80-decibel mark. This means that headsets are going to be necessary when gaming. 

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Should you buy the Asus ROG G22CH?

Buy the Asus ROG G22CH if...

Don't buy it if...

How I tested the Asus ROG G22CH

I tested the Asus ROG G22CH over two weeks. During the day, many general computing tasks were done including Google Chrome and Tidal. Having multiple Google Chrome tabs allowed me to use Asana, Google Docs, and Hootsuite. For creating graphics alongside short-form social media video content, I used Adobe Premiere and Photoshop. 

To test high frame rate possibilities, I played Call of Duty: Warzone, Valorant, and Fortnite. To see how hard I could push visual fidelity, I tried games including Cyberpunk 2077, Alan Wake 2, and Forza Motorsport (2023).

I've spent the past several years covering monitors alongside other PC components for TechRadar. Outside of gaming, I've been proficient in the Adobe suite for over a decade as well.

Read more about how we test

First reviewed March 2024

AMD Radeon RX 7900 GRE: AMD’s China-only card goes global—and upends the midrange market
7:42 pm | February 28, 2024


AMD Radeon RX 7900 GRE: Two-minute review

The AMD Radeon RX 7900 GRE was originally launched in China back in July 2023, and from what everyone was told, that card was going to be exclusive to that region. 

Well, following the launch of the RTX Super series of GPUs last month, AMD's decided to surprise everyone and launch the RX 7900 GRE globally, starting this week, and it looks primed to upend the midrange GPU market in a pretty huge way.

That's because the RX 7900 GRE (Golden Rabbit Edition) is going on sale starting at $549, which puts it directly in competition with the Nvidia RTX 4070 on price, and undercuts the Nvidia RTX 4070 Super by offering competitive performance for just over 90% of the cost.

To be clear, the card that is being released globally is the same card that has already been available for Chinese consumers, and so it has been extensively benchmarked for months, with much of that data freely available online for everyone to see. 

This has no doubt driven much of the global interest in the RX 7900 GRE since it originally launched back in July, and I fully expect this card to fly off the shelves since it is without question one of the best graphics cards for the midrange you're going to find.

In terms of raw synthetic performance, the RX 7900 GRE follows the familiar AMD-Nvidia pattern where the Radeon card is better at pure rasterization while the GeForce card is the better ray-tracer, but the difference between the RX 7900 GRE and the RTX 4070 Super in ray-tracing performance isn't as wide as it might have been last generation.

What's more, when it comes to gaming, Nvidia's advantage in native ray tracing is overcome by the RX 7900 GRE as soon as you bring upscaling into the mix, which you invariably have to do whenever ray tracing above 1080p is involved.

The RX 7900 GRE is even a much more capable creative card than I was expecting, so long as you're not working with CUDA; for graphic designers, photographers, and video editors, this is a surprisingly powerful GPU for a lot less money than its rivals.

Overall, the AMD Radeon RX 7900 GRE isn't so powerful that it completely knocks out Nvidia's RTX 4070 Super, but it's hitting Nvidia's newest GPU a lot harder than I think Nvidia was expecting so soon after launch. Unfortunately, this does put the only-slightly-cheaper-but-not-as-good AMD Radeon RX 7800 XT in a bit of an awkward position, but for gamers looking to get the best performance for their money, more options are better in this case.

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Price & availability

  • How much does it cost? $549 (about £440/AU$770)
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The AMD Radeon RX 7900 GRE is available starting February 27, 2024, with a US MSRP of $549 (about £440/AU$770). This is the same price as the Nvidia RTX 4070, $50 less than the RTX 4070 Super, and $50 more than the RX 7800 XT.

This launch doesn't include an AMD reference card, so you will need to buy the RX 7900 GRE from third-party partners like ASRock, Gigabyte, Sapphire, and others. The sample I was sent for review is the PowerColor Hellhound RX 7900 GRE, a model line that typically sells for AMD's MSRP or below (when on sale).

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Features & specs

The AMD Radeon RX 7900 GRE is a modified Navi 31 GPU with four fewer compute units than the AMD Radeon RX 7900 XT, as well as slower clock speeds. Its power requirements are also officially lower, at a starting TGP of 260W, but this will vary by which card you go for.

The Radeon RX 7900 GRE also has 16GB of GDDR6 VRAM to the RX 7900 XT's 20GB, and while the RX 7900 XT has a 320-bit memory bus, the RX 7900 GRE has a slimmer — but still sizeable — 256-bit bus. With a memory clock of 2,250 MHz (compared to the RX 7900 XT's 2,500 MHz), the RX 7900 GRE comes in with an effective memory speed of 18 Gbps and a memory bandwidth of 576 GB/s, a notable decline from the RX 7900 XT's 20 Gbps and 800 GB/s, respectively.
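Those bandwidth figures follow directly from the bus width and effective data rate; here's a quick worked check in Python using only the numbers quoted in this section.

```python
# Memory bandwidth (GB/s) = bus width in bits / 8 bits-per-byte * effective rate in Gbps.
def bandwidth_gb_per_s(bus_width_bits: int, effective_rate_gbps: float) -> float:
    return bus_width_bits / 8 * effective_rate_gbps

print(f"RX 7900 GRE: {bandwidth_gb_per_s(256, 18):.0f} GB/s")  # 576 GB/s
print(f"RX 7900 XT:  {bandwidth_gb_per_s(320, 20):.0f} GB/s")  # 800 GB/s
```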

Also notable are the two 8-pin power connectors, which won't require you to fuss around with a 16-pin connector like Nvidia's latest graphics cards require you to do, whether that's through an adapter or an ATX 3.0 power supply.

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Design

While there is an AMD reference card for the RX 7900 GRE, AMD has said that global availability will only come through AIB partners, so the design you get with your card will vary by manufacturer.

The card I tested, the PowerColor Hellhound RX 7900 GRE, sports a triple-fan cooler with RGB lighting in the fan. It's a long card to be sure, and even though it's technically a dual-slot card, the shroud makes for a tight fit.

The backplate of the Hellhound RX 7900 GRE has some notable features, like the Hellhound logo, the exposed GPU bracket, and a hole in the backplate opposite the third fan to leave an open path for air to pass over the GPU cooler's heatsink fins to improve cooling efficiency.

An AMD Radeon RX 7900 GRE from PowerColor in a test bench

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Performance

Now we come to the heart of the matter. I can't tell if AMD was inspired by the release of the Nvidia RTX 4070 Super or not, but whatever convinced Team Red to bring the RX 7900 GRE out of China to the rest of the world, midrange gamers everywhere should be grateful because this is easily the best midrange graphics card on the market right now.

Image gallery (15 images): RX 7900 GRE synthetic benchmark results (Image credit: Future / Infogram)

Starting with synthetic benchmarks, the typical rasterization-ray tracing divide between AMD and Nvidia remains, but as we've seen with other cards this generation, the gap is narrowing. The Nvidia RTX 4070 and RTX 4070 Super definitely pull ahead in terms of raw compute performance, but overall, the RX 7900 GRE is the champ of the under-$600s.

Image gallery (6 images): RX 7900 GRE creative benchmark results (Image credit: Future / Infogram)

For creative use, the RX 7900 GRE is the strongest rasterizer, but it lags Nvidia in video editing and seriously stumbles when it comes to 3D rendering, as seen in Blender Benchmark 4.0.0.

Image gallery (39 images): RX 7900 GRE 1080p, 1440p, and 4K gaming benchmark results (Image credit: Future / Infogram)

When it comes to gaming, though, the RX 7900 GRE is the clear winner among midrange cards, delivering spectacular 1080p and 1440p performance with only slightly worse ray tracing than the RTX 4070 Super.

As a 4K graphics card, however, the RX 7900 GRE isn't that far behind the RTX 4070 Ti, with the former getting an average 55 fps (30 fps minimum) and the latter getting an average of 63 fps (minimum 42 fps). The RTX 4070 Super, meanwhile, only averages 41 fps at 4K, with a minimum of 28 fps. 

Ultimately, the RTX 4070 Super can't really be considered among the best 4K graphics cards, but the RX 7900 GRE definitely can, thanks to its wider memory pool and larger memory bus.

Image gallery (2 images): Power and temperature benchmarks for the RX 7900 GRE (Image credit: Future / Infogram)

Of course, this performance comes at the cost of power draw. You can throw the official 260W TGP right out the window here, with the RX 7900 GRE pulling down 302W, but the strong cooling performance on the PowerColor Hellhound card did manage to keep the RX 7900 GRE below 53 degrees Celsius.

Image gallery (8 images): Final and creative benchmark results for the RX 7900 GRE (Image credit: Future / Infogram)

Overall, then, there's just no getting around the fact that the RX 7900 GRE effectively outperforms any other card in the midrange. And while the RX 7900 GRE falls well short of the RTX 4070-series GPUs in the overall creative averages, it's worth keeping in mind that with Photoshop and similar rasterization-dependent programs, the RX 7900 GRE performs the best, and it doesn't fall too far behind the RTX cards when it comes to video editing.

The weakness of the RX 7900 GRE is that most, if not all, 3D modeling software relies so heavily on Nvidia's CUDA that it drags down the card's creative performance averages, which can be somewhat deceptive unless you actually need this graphics card for 3D modeling. If that's the case, nothing else matters, and you should go with an RTX 4070-class graphics card despite the RX 7900 GRE's superior performance everywhere else.

How many people will that stipulation ultimately apply to? Not enough to keep the RX 7900 GRE from claiming the crown as the best graphics card in the midrange, and since its final value score is just shy of the RX 7800 XT's, there really isn't any reason to opt for any other card right now. The RX 7900 GRE is honestly just that good.

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

Should you buy the AMD Radeon RX 7900 GRE?

Buy the AMD Radeon RX 7900 GRE if...

You want the best midrange graphics card
The AMD Radeon RX 7900 GRE is the best overall graphics card for under $600 you can get.

You want to game at 4K
Thanks to the RX 7900 GRE's 16GB VRAM and wide memory bus, you can absolutely game effectively at 4K with this card.

Don't buy it if...

You want the best ray-tracing graphics card
The AMD RX 7900 GRE is a good ray-tracing graphics card, but it's not as good as the RTX 4070 Super.

You do a lot of 3D modeling
If you're a 3D modeling professional (or even a passionate amateur), you need an RTX card, full stop.

AMD Radeon RX 7900 GRE: Also consider

If my AMD Radeon RX 7900 GRE review has you looking for other options, here are two more graphics cards to consider...

How I tested the AMD Radeon RX 7900 GRE

  • I spent about a week with the AMD Radeon RX 7900 GRE
  • I tested its synthetic, creative, and gaming performance
  • I used our standard suite of benchmarks
Test system specs

This is the system I used to test the AMD Radeon RX 7900 GRE

CPU: Intel Core i9-14900K
CPU Cooler: MSI MAG Coreliquid E360 AIO Cooler
RAM: 32GB Corsair Dominator Platinum RGB DDR5-6000
Motherboard: MSI MPG Z790 Tomahawk WiFi
SSD: Samsung 990 Pro 4TB NVMe M.2 SSD
Power Supply: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about a week testing the AMD Radeon RX 7900 GRE, including extensive benchmarking as well as general use of the card in my primary work PC.

I also made sure to benchmark other competing graphics cards in the same class with updated drivers to ensure correct comparable data, where necessary.

I've been reviewing computer hardware for years, and I have tested and retested all of the best graphics cards of the past two generations, so I know very well how a graphics card in a given class ought to perform.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed February 2024

Acer Predator BiFrost Arc A770 OC review: a flashy makeover for those who want that RGB
5:00 pm | February 25, 2024


Acer Predator BiFrost Arc A770 OC: Two-minute review

Following years of anticipation, Intel jumped into the GPU market dominated by AMD and NVIDIA with some respectable results last year. 

Both the Intel Arc A750 and the Intel Arc A770 showed real promise and managed to undercut the best graphics cards both chipmakers had to offer, at least on price if not necessarily on benchmark performance.

Regardless, the A770's price just kept it from being one of the best cheap graphics cards for those looking for a GPU that could provide good ray-tracing alongside hardware-accelerated AI upscaling. Though it couldn’t match the sheer raw 1440p power of an AMD Radeon RX 7700 XT or Nvidia RTX 4060 Ti, general performance was more than respectable for the $349 launch price. 

With third-party variants of the A770 available, the Acer BiFrost Arc A770 OC might be a more attractive buy, especially now that the Intel Limited Edition cards are no longer being manufactured. There are a few things that lean in its favor including customizable RGB lighting through the Predator BiFrost Utility and overclocking capabilities. 

Sure, the lighting that comes with the BiFrost Arc A770 OC looks more attractive than on the original A770, but that's pretty much the biggest plus this GPU has over the Intel reference card. Performance doesn't increase much even with overclocking, which means the dual 8-pin connection pulls even more power for no real reason, though you can adjust the power draw if that's an issue. Also make sure Resizable BAR is activated in your motherboard's BIOS settings, because performance will absolutely tank if you don't.
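Beyond the BIOS toggle itself, it is possible to verify from the operating system whether the capability is exposed at all; below is a hedged sketch for a Linux host with pciutils installed (the exact "Resizable BAR" string in lspci's verbose output is an assumption and may vary between versions, so treat this as a rough check rather than a definitive test).

```python
# Hedged sketch: check whether any PCI device advertises a Resizable BAR capability.
# Assumptions: a Linux host with pciutils installed, run with root privileges so
# lspci can dump the full capability list; the "Resizable BAR" match string is an
# assumption about lspci's verbose output format and may differ between versions.
import subprocess

def rebar_capability_present() -> bool:
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    return "Resizable BAR" in out

if __name__ == "__main__":
    print("Resizable BAR capability found:", rebar_capability_present())
```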

Image gallery (3 images): An Acer Predator BiFrost Arc A770 OC on a gray deskmat (Image credit: Future / John Loeffler)

As mentioned previously, the Acer BiFrost Arc A770 OC comes feature-packed with ray-tracing and AI upscaling capabilities. When it comes to ray tracing, it's not going to deliver performance that matches AMD, let alone Nvidia, but that doesn't mean ray-tracing performance is bad.

When tested with the Dead Space remake and Cyberpunk 2077, framerates stayed within the 30 fps ballpark. On the other hand, Intel's XeSS AI upscaling technology is as good as DLSS and AMD FidelityFX in games like Call of Duty: Modern Warfare III (2023), Forza Horizon 5, and Hi-Fi Rush. Though 1440p performance is generally great, bringing the resolution down to 1080p delivers better overall results when more fps is needed.

There are around 70 games that support XeSS so far, with more popular games like Fortnite, League of Legends, and Counter-Strike 2 missing from the list. During playtesting, some games performed horribly, including Crysis Remastered and Forza Motorsport (2023), even when dropped down to borderline potato settings.

An Acer Predator BiFrost Arc A770 OC on a gray deskmat

(Image credit: Future / John Loeffler)

As in TechRadar's original A770 review, older games may have performance issues due to driver compatibility, since games developed with DirectX 9 and DirectX 10 were not made with the Arc GPUs in mind. AMD and Nvidia drivers, by contrast, have over a decade of legacy support for these games built in, since earlier versions of those drivers were developed back when the games were first released. That said, DirectX 11 and DirectX 12 performance is much better, and Intel's drivers are being actively improved to support these games.

One thing that surprised me is that the A770 provides pretty decent performance when using Adobe software like Premiere Pro and Photoshop, as long as your project scope is kept reasonable. It'll be interesting to see whether Adobe provides official support for the graphics card in the future.

Acer does have a Predator BiFrost Utility that allows users to change the RGB lighting on the card, but outside of that, it's not as useful as Intel's own Arc Graphics utility. Both offer various system overlays alongside controls for overclocking, power limit, temperature limit, and fan speed. One thing's for sure: even when running at full power, the Acer BiFrost Arc A770 OC wasn't incredibly loud compared to other power-hungry GPUs.

An Acer Predator BiFrost Arc A770 OC on a gray deskmat

(Image credit: Future / John Loeffler)

Acer Predator BiFrost Arc A770 OC: PRICE & AVAILABILITY

  • How much does it cost? US MSRP $399 (about £320 / AU$560)
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The Acer Predator BiFrost Arc A770 OC is available now in the US, UK, and Australia. Right now, there are ways to get around the $399 MSRP, with some stores like Newegg selling the GPU for around $279. With the original A770 going for as high as $429, the BiFrost Arc A770 OC could be considered a better buy.

For gamers on a more restricted budget looking for the best 1440p graphics card capable of playing many of the best PC games of the past couple of years, the BiFrost Arc A770 is definitely more accessible than comparable Nvidia and AMD graphics cards. Individuals working with a higher budget should definitely consider the AMD Radeon RX 7700 XT, which is just $50 more at $449 and provides much better 1440p performance.

An Acer Predator BiFrost Arc A770 OC on a gray deskmat

(Image credit: Future / John Loeffler)

Acer Predator BiFrost Arc A770 OC: Specs

An Acer Predator BiFrost Arc A770 OC on a gray deskmat

(Image credit: Future / John Loeffler)

Should you buy the Acer Predator BiFrost Arc A770 OC?

Buy the Acer Predator BiFrost Arc A770 OC if...

You need a budget-level price with nearly mid-tier performance
With solid ray tracing and AI upscaling capabilities, the 1440p performance on the BiFrost A770 OC is commendable.

You want a GPU to match your RGB-ready desktop's flair
The dual-fan design and RGB lighting do look cool compared to the original A770.

Don't buy it if...

You want the best midrange GPU
Due to the current state of developer support, the A770 lags behind AMD and Nvidia, which means performance won't be the best in many of the top-tier games.

You want a GPU that uses less power
The Acer BiFrost Arc A770 uses a lot of power, but the performance doesn't really reflect that.

Also consider

If my Acer Predator BiFrost Arc A770 OC review has you looking for other options, here are two more graphics cards to consider...

How I tested the Acer Predator BiFrost Arc A770 OC

  • I spent around two weeks with the Acer Predator BiFrost Arc A770 OC
  • I used the Acer Predator BiFrost Arc A770 OC for gaming and creative tests

Testing of the Acer Predator BiFrost Arc A770 OC happened over a two-week period on a second home computer, where I split my time between gaming and creative tasks. On the gaming side, titles played during testing included Crysis Remastered, Call of Duty: Modern Warfare III, Forza Horizon 5, Forza Motorsport (2023), and Dead Space (2023).

Creative usage was split between Premiere Pro and Photoshop. I've been testing gaming desktops alongside components for around three years for TechRadar and fully understand how GPUs are supposed to perform compared to similar tech.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed February 2024

Nvidia GeForce RTX 4080 Super review: second only to the RTX 4090, and finally worth buying
5:00 pm | January 31, 2024


Nvidia GeForce RTX 4080 Super: two minute review

The Nvidia GeForce RTX 4080 wasn't my favorite graphics card - by a long shot - when it was released in late 2022, so in a lot of ways the Nvidia GeForce RTX 4080 Super that's just gone on sale has a pretty easy bar to clear, and it does so by delivering better performance than its non-Super sibling at a much lower price.

To be clear, you are mostly getting a price cut on the RTX 4080 with this card, with some extra performance akin to some well-tuned overclocking. In a different universe, this card could have been released in 2022 and pretty much buried the rival AMD Radeon RX 7900 XTX out the gate, rather than give Team Red a chance to own the sub-$1000 premium tier essentially all on its own.

However, now that the Nvidia RTX 4080 Super is available for a US MSRP of $999.99 (about £800/AU$1,400), the market as it stands has just had a GPU whale-bellyflop its way into it, and there's only so much you can relitigate the past in a review like this.

An Nvidia GeForce RTX 4080 Super on a desk

(Image credit: Future / John Loeffler)

On its merits, the Nvidia GeForce RTX 4080 Super is essentially the best graphics card you can buy right now, other than the Nvidia GeForce RTX 4090. It provides a roughly 1.4% performance gain (geometric mean) on the Nvidia RTX 4080 with most of that coming in terms of synthetic benchmarks. 

On the other side, it manages to outperform the RX 7900 XTX by a geometric average of about 7% in both synthetic performance and gaming, across raster and ray-traced workloads, with both graphics upscaling and native resolutions.

It even comes within 18% overall of the RTX 4090, which is a hell of a thing considering it costs about 40% less at MSRP. In short, if you're an enthusiast looking for the best 4K graphics card but you're not a creative professional who needs the kind of power that an RTX 4090 brings to the table, then the RTX 4080 Super is the perfect card for you.
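Those percentages are geometric means across the individual tests rather than simple averages; here is a minimal sketch of how such a comparison is computed (the scores are made-up placeholders for illustration, not our actual benchmark results).

```python
# Minimal sketch: compare two cards via the geometric mean of per-test ratios.
# The scores below are made-up placeholders for illustration, not real test data.
from statistics import geometric_mean

card_a = {"Test 1": 100.0, "Test 2": 140.0, "Test 3": 90.0}  # e.g. RTX 4080 Super
card_b = {"Test 1": 95.0, "Test 2": 125.0, "Test 3": 88.0}   # e.g. RX 7900 XTX

ratios = [card_a[test] / card_b[test] for test in card_a]
advantage = geometric_mean(ratios) - 1.0

print(f"Card A leads Card B by {advantage:.1%} (geometric mean of per-test ratios)")
```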

An Nvidia GeForce RTX 4080 Super on a desk

(Image credit: Future / John Loeffler)

If you're lucky enough to pick up a Founders Edition of the RTX 4080 Super, you'll get the same gorgeous black finish that we saw on the Nvidia GeForce RTX 4070 Super, and considering how quickly these cards are probably going to sell out, these might be something of a showpiece for those wanting to flex.

That said, if you're the type of gamer who lives and breathes RGB, then one of Nvidia's AIB partners will likely offer a better card for your build. Expect them to run higher than MSRP, however, which unfortunately negates the key advantage of this card.

There's no doubt that this card is a seriously premium purchase no matter which way you go though, so someone looking for the best cheap graphics card or the best 1440p graphics card to satisfy a midrange price point will still only be window shopping the Nvidia GeForce RTX 4080 Super, but at least with the Super, it might not be so out of your reach as it once was.

An Nvidia GeForce RTX 4080 Super on a desk

(Image credit: Future / John Loeffler)

Nvidia GeForce RTX 4080 Super: Specs & features

While you are getting more from the Nvidia GeForce RTX 4080 Super in terms of specs — including more compute units, faster clock speeds, and faster memory — the TGP of the card is still 320W, so even though you have more of everything, everything ends up just a bit underpowered as a result, leading to a card that is slightly faster in terms of performance, but not by much.

You would need about 336W to power all of the added hardware in the RTX 4080 Super to the same extent that the base RTX 4080's hardware was, so if you have an overclocking target, you know where you need to be. Thankfully, Nvidia does let you overclock its cards, so you do have some headroom if you want to squeeze the most out of those extra shaders.
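That 336W figure follows from scaling the RTX 4080's 320W TGP by the increase in shader count; the CUDA core counts below are Nvidia's published figures, quoted here only as the inputs for the worked example.

```python
# Worked example: scale the RTX 4080's 320W TGP by the RTX 4080 Super's larger
# shader count. Core counts are Nvidia's published figures, quoted here only to
# reproduce the "about 336W" estimate above.
rtx_4080_tgp_w = 320
rtx_4080_cores = 9_728
rtx_4080_super_cores = 10_240

equivalent_tgp_w = rtx_4080_tgp_w * rtx_4080_super_cores / rtx_4080_cores
print(f"TGP needed to feed the extra shaders at the same rate: ~{equivalent_tgp_w:.0f} W")
```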

An Nvidia GeForce RTX 4080 Super on a desk

(Image credit: Future / John Loeffler)

Nvidia GeForce RTX 4080 Super: Price & availability

  • How much is it? US MSRP listed at $999.99 (about £800, AU$1,400)
  • When is it out? It was released on January 31, 2024
  • Where can you get it? You can buy it in the US, UK, and Australia

The Nvidia GeForce RTX 4080 Super is available in the US, UK, and Australia (among other markets) from January 31, 2024, with a US MSRP of $999.99 (about £800 / AU$1,400).

This puts it directly against the AMD Radeon RX 7900 XTX in terms of price, and effectively takes the lead as the best value among the top-tier premium cards. It's also about 17% cheaper than the RTX 4080 it effectively replaces. 

Considering that the original launch price of the RTX 4080 essentially knocked 1.5 stars off its final score from me, that $200 price difference really is a big deal at this stage, since it puts enough room between this card and the $1,600 price point of the RTX 4090 to make the RTX 4080 Super worth buying.

Nvidia GeForce RTX 4080 Super: Performance

In terms of performance, the Nvidia GeForce RTX 4080 Super offers the best bang for your buck of any of the top-tier premium cards, rivaling the performance of even the RTX 4090.

Of course, this was true of the base RTX 4080, but where the RTX 4080 Super really delivers is bringing this level of performance below the $1,000 price point that separates the enthusiast segment from the creative professional class who have company money to throw around.

Image gallery (15 images): Synthetic benchmark results for the Nvidia GeForce RTX 4080 Super (Image credit: Future / InfoGram)

On synthetic performance, the RTX 4080 Super is second only to the RTX 4090 overall, and where it falters against the RX 7900 XTX is mostly when dealing with 3DMark Fire Strike and Time Spy workloads (both regular as well as Extreme and Ultra versions), but as expected, whenever ray tracing is involved, the RTX 4080 Super shines. 

The RTX 4080 Super also outperforms the RX 7900 XTX when it comes to raw compute performance, so machine learning workloads will run much better on the RTX 4080 Super than the RX 7900 XTX.

Image gallery (6 images): Creative benchmark results for the Nvidia GeForce RTX 4080 Super (Image credit: Future / InfoGram)

In terms of creative performance, the RTX 4080 Super walks away the winner against the RX 7900 XTX, even if you don't factor in the fact that Blender Benchmark 4.0.0 workloads wouldn't even run on the RX 7900 XTX (though the RX 7900 XT was able to run them, just not nearly as well).

The RTX 4090 is still the card you'll want for creative workloads, 100%. But if money is a bit tighter than it used to be now that interest rates are non-zero and money means something again to VCs, the RTX 4080 Super is a good compromise.

Image gallery (27 images): Gaming benchmark results for the Nvidia GeForce RTX 4080 Super (Image credit: Future / InfoGram)

The gaming performance of the Nvidia RTX 4080 Super is likewise second only to the Nvidia RTX 4090, with strong native 4K ray tracing performance before you even factor in DLSS, which — thanks to Frame Generation and Nvidia Reflex — drives frame rates up even further on games that support it. 

While barely a blip higher than the RTX 4080 non-Super in gaming performance (about 0.4% better overall), it is about 7% better than the RX 7900 XTX, which retails for the same price currently.

Image gallery (9 images): Final performance figures for the Nvidia GeForce RTX 4080 Super (Image credit: Future / InfoGram)

With an average (geomean) frame rate of about 80 fps at 4K compared to the RTX 4090's 87 fps average, the RTX 4080 Super all but eliminates any real need to splurge on the RTX 4090 if you're just looking to game. 

The RTX 4090 might be the best graphics card for creatives or machine learning scientists out there, but for gamers, you are much better off with the Nvidia GeForce RTX 4080 Super, since that extra $600 can buy you a whole lot of other components like the best SSD or best processor for gaming, something even the PC gaming enthusiasts out there can appreciate.

A masculine hand holding the Nvidia GeForce RTX 4080 Super

(Image credit: Future / John Loeffler)

Should you buy the Nvidia GeForce RTX 4080 Super?

Buy the Nvidia GeForce RTX 4080 Super if…

Don’t buy it if…

Also consider

How I tested the Nvidia GeForce RTX 4080 Super

I spent about a week and a half with the Nvidia GeForce RTX 4080 Super all together, with much of that time spent benchmarking the card on our test bench.

I did use the card as my primary workstation GPU for content creation, and in addition to my standard battery of benchmark tests, I also used the card for high-end 4K gaming on titles like Cyberpunk 2077: Phantom Liberty.

I've been reviewing computer hardware for years now, and having extensively tested and retested every major graphics card release of the last two generations, with nearly two dozen graphics card reviews completed in the last 18 months, I know how a card like this is supposed to perform given its price and target audience, so I'm well-equipped to give you the buying advice you need to help you make the right decision with your money.

First reviewed in January 2024
