Gadget news
Acer Predator Triton 17 X: a premium gaming laptop that packs a punch
2:59 pm | June 4, 2024

Acer Predator Triton 17 X: Two-minute review

There's an argument to be made for packing in as much power as possible when it comes to the best gaming laptops, and that's the space the Acer Predator Triton 17 X occupies. For the most part, it forgoes being the sleekest and smallest of its kind to go all-in on pushing boundaries for those with deep enough pockets to take the plunge. 

Priced at $3,599.99 / £3,299.99 / AU$7,999, the Acer Predator Triton 17 X isn't a budget pick by any means, but that's the cost of packing in enough horsepower to give even the best gaming PCs a run for their money. While the mobile RTX 4090 doesn't exactly rival what its desktop counterpart can do, the performance is in the same ballpark; you can think of it as similar to an RTX 4080 desktop GPU.

Where this rig stands out from competitors is with its display. The Triton 17 X features a staggering 250Hz refresh rate with a 1600p resolution screen. That 16:10 aspect ratio means you get more real estate for gaming, and the results are impressive. Fortunately, the components inside this Predator laptop mean you'll be able to push even the latest and most demanding games to superfast frame rates.

No corners have been cut with the quality-of-life features here, either. This laptop is armed with a six-speaker setup, an excellent keyboard, and a healthy port selection, so even when you're not gaming, you'll have a good experience. Just keep in mind that the Triton 17 X is not the most practical notebook with its 3kg / 6.6lbs heft, so it might not be the machine you carry to work or school every day.

Compounding this is the majorly disappointing battery life. The Acer Predator Triton 17 X lasts around two hours at best when enjoying media playback or browsing the web, and about an hour when getting stuck into one of the latest games. You'll want the charger nearby, but if you can overlook these issues then there's a stellar machine underneath it all. 

Acer Predator Triton 17 X: Price and availability

Acer Predator Triton 17 X screen

(Image credit: Future)
  • How much does it cost? $3,599.99 / £3,299.99 / AU$7,999
  • When is it available? It's out now
  • Where can you get it? In the US, UK and Australia

The Acer Predator Triton 17 X is one of the pricier gaming laptops on the market, coming in above the $3,000 / £3,000 mark (and at AU$8,000). Considering the hardware inside, that shouldn't come as a huge surprise, though. Acer isn't pulling any punches from the choice of CPU and GPU, through to the display, RAM, and storage. Simply put, it's far from a cheap gaming laptop, but if you want to be on the bleeding edge and have the cash to splash then it could be worthwhile. 

As a frame of reference, the price of entry for the Predator Triton 17 X puts it in league with other top-end offerings such as the Origin EON 16SL when fully specced out, or the Alienware M16 and Razer Blade 16 (2023) in higher configurations. You aren't getting the best value for money on the market, nor the strongest price-to-performance ratio, but in terms of sheer raw power, the Triton 17 X has it in spades. 

  • Price: 3 / 5

Acer Predator Triton 17 X: Design

Design of the Acer Predator Triton 17 X

(Image credit: Future)
  • Stunning 250Hz mini-LED display 
  • Packed with ports 
  • A bit heavy at 3kg / 6.6lbs

Acer Predator Triton 17 X: Specs

Here's what's inside the Acer Predator Triton 17 X supplied to TechRadar. 

CPU: Intel Core i9-13900HX
GPU: Nvidia RTX 4090
RAM: 64GB LPDDR5
Storage: 2TB NVMe Gen 4.0
Display: WQXGA (2560 x 1600) 16:10 IPS 250Hz
Ports: 2x USB 3.2, 2x USB-C, HDMI 2.1, 2.5Gb Ethernet, 3.5mm audio jack, microSD card slot
Wireless: Wi-Fi 6E; Bluetooth 5.1
Weight: 3kg / 6.6lbs
Dimensions: 28 x 38.04 x 2.19cm (L x W x H)

The most notable thing about the Acer Predator Triton 17 X at first glance is the display, which is certainly a leading panel as far as gaming laptops go. This portable powerhouse packs in a 16:10 WQXGA (2,560 x 1,600 resolution) screen, meaning more real estate is available for gaming than a 16:9 panel can offer. It's bolstered by a 250Hz refresh rate and is Nvidia G-Sync compatible, so there's no screen tearing.
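
To put that extra real estate into numbers, here's a quick back-of-the-envelope comparison (a rough sketch in Python; the 2,560 x 1,440 figure is an assumed 16:9 panel at the same width, not a spec from this laptop):

```python
# Rough pixel-count comparison: the Triton's 16:10 WQXGA panel vs an
# assumed 16:9 panel of the same width (2,560 x 1,440 is illustrative).
wqxga = 2560 * 1600  # 16:10 panel in the Triton 17 X
qhd = 2560 * 1440    # a typical 16:9 panel at the same width

extra = (wqxga - qhd) / qhd
print(f"{wqxga:,} vs {qhd:,} pixels: {extra:.1%} more screen real estate")
# -> 4,096,000 vs 3,686,400 pixels: 11.1% more screen real estate
```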

It's not the first laptop to feature a mini-LED display, but it is an excellent example of the panel tech in action. While not quite as vivid as OLED, it is considerably brighter, and the 1,000 local dimming zones do a solid job of delivering comparably deep blacks. Considering the hardware inside, an RTX 4090 mobile GPU backed up by an Intel 13th-gen Core i9 processor, you'll be able to take advantage of that high refresh rate, too.

Acer's design philosophy for this machine is "excellent in excess" and that's clearly demonstrated with the hardware packed into a portable form factor. Mind you, this rig weighs in at 3kg / 6.6lbs, making it one of the heavier models on the market. With a 17-inch screen, it's fairly large as well, and while technically portable, the 17 X is unlikely to be something you'll commonly be slinging into a bag. It's more of an out-and-out desktop replacement.

While you're likely to plug in one of the best gaming keyboards and best gaming mice, the Acer Predator Triton 17 X features a solid keyboard and trackpad for casual web browsing and typing. It offers pleasant multi-zone RGB lighting which looks the part when playing in darker environments. The trackpad isn't as nice as some of the glass ones you'll find on a similarly priced Razer Blade, but it gets the job done. Again, a dedicated mouse will do the trick better.

No expense was spared on the connectivity front here, either. There are two USB-C ports, two USB 3.2 ports, HDMI 2.1, 2.5Gb Ethernet, a microSD card reader, and a 3.5mm audio jack. You'll have no shortage of options for either work or play, and it's good that the manufacturer chose function over form in this respect, as some thinner laptops can sacrifice port selection to achieve their svelte nature.

  • Design: 4 / 5

Acer Predator Triton 17 X: Performance

Keyboard of the Triton 17 X

(Image credit: Future)
  • Unparalleled 1080p and 1440p gaming performance 
  • Silky smooth refresh rate 
  • Gets very hot and loud

You won't be surprised to learn that a gaming laptop powered by the Intel Core i9-13900HX and Nvidia RTX 4090 with 64GB of LPDDR5 RAM absolutely mowed through our suite of benchmarks and games. The laptop's display tops out at 250Hz, and you'll have all the horsepower necessary to achieve those kinds of frame rates in 1080p, and drive very smooth gameplay at 1440p as well.
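
To put that 250Hz ceiling in perspective, each frame has to be rendered in 4ms or less to saturate the panel; here's a quick sketch of the frame-time budget at a few common refresh rates:

```python
# Frame-time budget needed to saturate a display at a given refresh rate.
for hz in (60, 144, 250):
    print(f"{hz:>3}Hz -> {1000 / hz:.2f}ms per frame")
# ->  60Hz -> 16.67ms per frame
#    144Hz -> 6.94ms per frame
#    250Hz -> 4.00ms per frame
```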

Acer Predator Triton 17 X benchmarks

Here's how the Acer Predator Triton 17 X got on in our game testing. 

Total War: Three Kingdoms (1080p) - 364fps (Low); 140fps (Ultra)
Total War: Three Kingdoms (1440p) - 290fps (Low); 92fps (Ultra)
Cyberpunk 2077 (1080p) - 118fps (Low); 107fps (Ultra)
Cyberpunk 2077 (1440p) - 129fps (Low); 89fps (Ultra)
Cyberpunk 2077 RT Ultra - 85fps (1080p); 83fps (1440p)
Red Dead Redemption 2 (1080p) - 147fps (Low); 128fps (Ultra)
Red Dead Redemption 2 (1440p) - 108fps (Low); 86fps (Ultra)
Geekbench 6:
Single - 2,720
Multi - 17,308
3DMark:
Night Raid - 72,575
Fire Strike - 31,498
Time Spy - 16,866
Port Royal - 11,261
PCMark10: 8,069
CrystalDiskMark: Read - 6,441.97; Write - 4,872.65
Cinebench R23:
Single - 1,941
Multi - 25,624
TechRadar battery test: 1 hour 8 minutes

It's comparable to what the MSI Titan 18 HX can do, albeit without the 4K resolution, not that you'll necessarily need 4K on a display this size anyway. It wasn't uncommon for the demanding games tested, such as Cyberpunk 2077 or Red Dead Redemption 2, to exceed 100fps when maxed out in 1440p. Even CPU-bound titles such as Total War: Three Kingdoms were no sweat for the 13900HX, as this game could exceed a lightning-fast 300fps.

Synthetic figures are equally strong, as evidenced by 3DMark's range of GPU benchmarks alongside PCMark 10. Acer hasn't skimped on the choice of Gen 4.0 NVMe SSD either, with a strong performance of 6,441MB/s for reads and 4,872MB/s for writes. All told, it's a very encouraging package showcasing the prowess of the hardware, but not one without a few drawbacks.

While the RTX 4090M is roughly equivalent to the desktop RTX 4080 with its 16GB GDDR6 VRAM and lower power draw, the combination of CPU and GPU here does result in excess heat and noise. It wasn't uncommon for the rig to reach upwards of 90 degrees Celsius when under stress, with the fans drowning out the otherwise impressive six-speaker surround setup. This could be counteracted by using one of the best gaming headsets, but it's worth noting all the same.

Using the HDMI 2.1 port, you'll be able to hook up the Acer Predator Triton 17 X to one of the best gaming monitors for that big-screen experience should the 17-inch display not be enough for you. You may also want to invest in a dedicated laptop riser to keep the machine elevated and aid cooling.

  • Performance: 4 / 5

Acer Predator Triton 17 X: Battery life

Closed lid of the Acer Predator Triton 17 X

(Image credit: Future)
  • Lasts around two hours when web browsing or for media playback
  • About an hour of gaming on battery power  

What's most disappointing about the Acer Predator Triton 17 X is the battery life, which just about manages two hours on a single charge with media playback or casual browsing. When gaming, you can expect about an hour or so, give or take, so you'll need to keep a charger handy if you want a full evening session.

Keeping the Acer Predator Triton 17 X plugged in at all times isn't ideal for portability, obviously, but as we already observed, it's a little too large and bulky for that anyway. The battery life is a shame considering there's a 99.98Wh four-cell power pack inside, but it's not too big a shock when you factor in that the RTX 4090M GPU alone draws 175W.
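
Those runtimes track with simple arithmetic on the pack's capacity; as a rough sketch (the sustained system draw figures below are illustrative assumptions, not measurements from our testing):

```python
# Back-of-the-envelope runtime: battery capacity (Wh) / sustained draw (W).
# The wattages are illustrative assumptions, not measured values.
battery_wh = 99.98

for scenario, watts in (("media playback / browsing", 45), ("gaming", 90)):
    print(f"{scenario}: ~{battery_wh / watts:.1f} hours")
# -> media playback / browsing: ~2.2 hours
#    gaming: ~1.1 hours
```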

Simply put, if you're after excellent battery life for a portable machine then the Acer Predator Triton 17 X won't be for you. Instead, we recommend considering one of the best Ultrabooks, even if you won't get anywhere near the same level of processing power.

  • Battery: 2 / 5

Should you buy the Acer Predator Triton 17 X?

Buy it if... 

You want a no-compromise gaming experience 

The Acer Predator Triton 17 X packs a punch with its RTX 4090 GPU and 13th-gen Core i9 CPU backed with a staggering 64GB of RAM. All that power translates to commonly getting over 100fps in 1440p with maxed out details. 

You want an out-and-out desktop replacement 

With its powerful hardware and generous port selection, you'll be able to hook up the Triton 17 X to an external monitor for a big screen gaming experience. 

You're in the market for a productivity powerhouse 

While the Acer Predator Triton 17 X is geared towards gamers, its 250Hz refresh rate and cutting-edge hardware make it a good choice for creatives who need all the VRAM and raw performance grunt they can get.

Don't buy it if... 

You want the best value for money 

There's no getting around the eye-watering MSRP of the Acer Predator Triton 17 X at $3,599.99 / £3,299.99 / AU$7,999. If you're on a tighter budget, you'll clearly want to consider a more mid-range model instead.

You want a laptop with a good battery life 

Despite its 99.98Wh battery, you can expect only around an hour of gaming when not plugged in. Media playback doubles that to around two hours based on our battery test (conducted at 50% battery with half max brightness). Whatever the case, don't expect much longevity with the Triton 17 X.

  • First reviewed June 2024
Apple: Performance claims about the iPad Air (2024) were accurate, despite GPU core count mistake
1:32 pm |

Apple has become aware of the mix-up with the new iPad Air GPU and says that while it does indeed have only 9 cores instead of 10 cores (as originally announced), the performance claims it made about the 2024 tablets are accurate. The original press release has been updated to reflect the true core count. Apple claims that the iPad Air 11 (2024) and iPad Air 13 (2024) with their M2 chipsets are notably faster than their M1-based predecessor from 2022. Specifically, the “Apple-designed M2 has a 15 percent faster CPU, 25 percent faster graphics, and 50 percent more memory bandwidth than the...

ZimaBlade review
9:35 am | May 15, 2024

The ZimaBlade single-board computer looks surprisingly similar to an old-school portable cassette player. 

Specifications

CPU: Entry-level Dual-Core N3350, High-Performance Quad-Core J3455

GPU: Entry-level 650MHz, High-performance 750MHz, 12 execution units

RAM: Upgradable up to 16GB of DDR3L, none supplied in the box

FLASH: 32GB eMMC

USB: 1 x Type-C, 1 x USB3.0, 2 x internal USB2.0

Display: 1 x 4K MiniDP, 1 x DP over Type-C, 1 x eDP internal

PCIe: PCIe 2.0 x4

SATA: 2 x SATA3.0

Ethernet: 1 x Gigabit LAN

Power Consumption: About 15W

Size: 107mm x 80mm x 32mm

It competes with the Raspberry Pi 4, sitting in the same price bracket while offering an Intel x86 architecture. The SBC has plenty of connectors, which makes this hacker-friendly platform versatile and unique. The built-in PCIe 2.0 x4 connector accepts various cards out of the box, and with two SATA3 ports, the board can morph into a portable NAS.
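
For a sense of what that slot can feed, PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding (8 usable bits for every 10 transferred), which works out to roughly 2 GB/s across four lanes; a quick sketch:

```python
# Theoretical throughput of a PCIe 2.0 x4 slot like the ZimaBlade's.
transfers_per_s = 5e9         # PCIe 2.0: 5 GT/s per lane
encoding_efficiency = 8 / 10  # 8b/10b: 8 usable bits per 10 transferred
lanes = 4

bytes_per_s = transfers_per_s * encoding_efficiency / 8 * lanes
print(f"~{bytes_per_s / 1e9:.1f} GB/s across {lanes} lanes")  # ~2.0 GB/s
```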

Since the ZimaBlade supports up to 16GB of DDR3L, it can run applications requiring large amounts of memory, such as databases and VMs. The main let-down is the outdated CPU, with the speediest version of the board based on a Quad-Core 2.3GHz Apollo Lake CPU. The SBC features a single USB Type-C, which supplies power and drives a DisplayPort output.

IceWhale Technology, the maker of the ZimaBlade, held a Crowd Supply campaign to finance the board's new version. Various perks are available; the most basic, containing a Dual-Core Intel Celeron N3350, is available for $64. The ZimaBlade 7700, built around a Quad-Core J3455 CPU, sells for $96. Except for the CPU, both have the same hardware and require a DDR3L memory module to boot.

ZimaBlade front view.

(Image credit: Future)

ZimaBlade: Design

The ZimaBlade computer comes with a male-to-male Type-C cable and one SATA cable. The passively cooled unit measures 107mm x 80mm x 32mm and weighs 175g. The small case sits perfectly flat on a desk, with no mounting holes and only four tiny rubber pads on the bottom. Because the unit is so light, connecting various cables can become problematic, as the case topples easily.

The ZimaBlade's designers have worked hard to produce an enclosure that showcases the computer's internal components. A transparent plastic top displays the SODIMM memory but not the CPU. With no power button available, the hardware turns on as soon as a Type-C cable is plugged in. A single status LED, barely visible from the side of the case, indicates whether the board is powered. The PCIe socket's location does not allow easy card insertion, and a card's metal bracket has to be removed before use.

Under the hood, the ZimaBlade sports a J3455 quad-core Intel Celeron CPU clocked at 2.3GHz in the highest-performance board variant. Geekbench shows the ZimaBlade handily outperforms the Cortex-A72 ARM CPU found in the Pi 4 but scores well below the newer Pi 5's Cortex-A76 CPU. One feature not found on similarly priced platforms is the ability to expand the memory to 16GB using a DDR3L SODIMM.

The ZimaBlade targets an audience that craves high-speed interfaces. Seven connectors provide connectivity for many use cases with throughputs above the gigabit mark. Two SATA 3.0 (6Gbps) ports and one Gigabit Ethernet socket turn the ZimaBlade into a redundant storage server. One USB3 port, a USB Type-C with DP, and a mini-DP connector capable of 4K at 60Hz complete the list of external ports. Three internal connectors, two USB 2.0 and one eDP socket, allow additional peripherals.

ZimaBlade side view.

(Image credit: Future)

ZimaBlade: In Use

The owner can use the ZimaBlade simply by plugging a USB Type-C cable into a display that supports DisplayPort over Type-C. The computer then boots CasaOS, a lightweight cloud-accessible platform with an ever-increasing number of applications. The ZimaBlade is extremely fast at booting, taking just five seconds to display the Linux login.

After entering the default username and password, the user has root access to the Linux-based OS stored on the 32GB eMMC storage, with 24GB left for user applications. A lean OS means a low 20% RAM utilization with an 8GB memory module. With the 1G LAN connected, software updates run automatically and keep the platform secure.

In addition to being affordable, the ZimaBlade builds on a user-friendly OS where the UI is viewed entirely through a web browser. This cloud concept could have been a miss, but thanks to modern technologies like Docker containers, using the desktop is very snappy. The installed software includes a file explorer and an app store containing forty applications ranging from a retro emulator to home automation. 

Running Geekbench 6 on the ZimaBlade involves installing it through SSH. The board's power consumption reaches 15W under load, with the case becoming hot at more than 60 degrees Celsius, and decreases to 6W when idle. With a score of 300 in single-core and 911 in multi-core benchmarks on Geekbench 6, the J3455 CPU won't blow you away with its computing prowess but will be sufficient for everyday basic tasks.

ZimaBlade top view.

(Image credit: Future)

ZimaBlade: The competition

Thanks to the ZimaBlade, finding an affordable x86 single-board computer with lots of connectivity and expandable memory has become easier. Hardkernel's Odroid H3+ is very similar to the ZimaBlade, being passively cooled and possessing various high-speed connectors. However, the H3+ costs more than twice as much, is bigger thanks to an oversized heatsink, and consumes more power. The quad-lane PCIe connector on the ZimaBlade also makes it a valuable testbed for PCIe cards, something the Odroid H3+ lacks.

ZimaBlade: Final verdict

IceWhale’s ZimaBlade makes a tremendous entry-level computer with many options for adding extra hardware. The PCIe slot is the product's standout feature, allowing the use of high-end gaming graphics cards, for example. The single SODIMM socket gives the user an upgrade path to more memory. The onboard eMMC storage memory turns the unit into a self-contained product. Finally, a price below $100 tilts the balance, making the ZimaBlade a must-have gadget this year. 

We've listed the best business computers.

The Snapdragon 8 Gen 4 to have impressive GPU performance
4:00 am | May 12, 2024

According to well-known tipster Digital Chat Station on Weibo, Qualcomm's upcoming flagship SoC, the Snapdragon 8 Gen 4, will likely impress with exceptional graphics performance. The post is quite vague, but gives an example with the GPU-intensive game Genshin Impact. The report claims the new Adreno GPU will run the title fluently at native 1080p resolution. The game is quite taxing even for modern high-end chipsets, and to run it smoothly, users need to lower the resolution. Additionally, the development of the SoC is going ahead of schedule, so Qualcomm is targeting a release...

Details for Snapdragon X Plus leak: 10-core CPU, same GPU and NPU
3:45 pm | April 24, 2024

The Snapdragon X Elite is so far the only product in Qualcomm’s new Windows-focused line of chipsets, but that won’t be forever. Details of a Plus model have surfaced, but before we get to those, we have to break down the Elite specs first. The Snapdragon X Elite name is actually used for three chips. All have 12 CPU cores, 42MB of cache and an NPU that delivers 45 TOPS of performance. However, the CPU and GPU are clocked differently. Check the table below for more details on the three versions. The Snapdragon X Plus is similar to the lowest of the Elite versions, except that it has 10...

Sony PlayStation 5 Pro is coming with better GPU, more memory bandwidth
8:01 pm | April 15, 2024

Last month a rumor did the rounds claiming a PlayStation 5 Pro was coming, and today a new report confirms this, while adding some more details about the console. Sony is apparently already asking developers to ensure their games are compatible with the upcoming PS5 Pro, highlighting a focus on improving ray tracing. The PS5 Pro is allegedly codenamed Trinity, and it's said to have a more powerful GPU and a slightly faster CPU mode. The changes in the Pro model are all supposed to make the console more capable of rendering games with ray tracing enabled, or hitting higher resolutions and...

PlayStation 5 Pro rumored to beef up GPU and ray tracing, bring AI acceleration
4:17 pm | March 18, 2024

The PlayStation 5 launched in late 2020, though it feels like it arrived later due to supply issues. A Pro model will reportedly arrive four years later with a much improved GPU, AI acceleration and other enhancements. The GPU will be the biggest upgrade on the PS5 Pro. Rumors claim up to 45% higher rasterization performance and 33.5 TFLOPs of compute power. Future SDK versions will support resolutions up to 8K and higher frame rates, with 4K @ 120fps and 8K @ 60fps being possible. Ray tracing performance is set to improve 2-3 times, even 4 times on some occasions. This is thanks to a...

Asus ROG G22CH review: the Intel NUC Extreme lives on, at least in spirit
7:00 pm | March 1, 2024

Asus ROG G22CH: One-minute review

As chipsets get smaller and more efficient, the past handful of years have seen a rise in smaller-form gaming PCs like the Asus ROG G22CH. 

Not only are they non-intrusive compared to the biggest and best gaming PCs, but they have a nice amount of portability as well. Most importantly, clever cooling and component management allow them to pack a nice performance punch at the cost of real upgradability. 

In the case of the ROG G22CH, the rig looks like a horizontally wider version of the Xbox Series X. There’s a sleek all-black look that’s accented by some nice angles with customizable RGB lighting. With that said, the performance specs are also miles ahead of a similar console. 

The ROG G22CH has an Intel Core i9-13900K CPU, Nvidia GeForce RTX 4070 GPU, 32GB DDR5 RAM, and a 1TB SSD. That's more than enough for some solid native 1440p gaming with the ability to hit 4K through DLSS upscaling.

Starting at $1,399.99 in the US (about £1,120 / AU$1,960), it can get expensive pretty quickly as you increase the specs, with UK and Australian buyers more restricted in the kinds of configurations they can buy.

This is a bit of an issue since upgradability down the line is likely going to be a problem due to the extremely tight chassis. When packing so much performance into such a small rig, efficient cooling is a must. There are two different cooling options, air and liquid, but both are loud during intensive tasks.

That said, potential buyers looking for a small-form gaming desktop should definitely keep the Asus ROG G22CH in mind, since it's one of the few available on the market now that Intel has retired its NUC Extreme line. Beyond its pretty aggressive styling, its performance prowess is where it matters the most, and it excels in this regard. The gaming desktop can run all of the most popular esports games, such as Fortnite and Valorant, at high frame rates while handling more visually demanding games like Alan Wake 2 without much fuss. If cost and upgradability are a problem, it might be best to go with a gaming rig that has a bigger case.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Price & availability

  •  How much does it cost? Between $1,399 and $2,499
  •  When is it available? It is available now in the US, UK, and Australia
  •  Where can you get it? From various stores depending on territory

The Asus ROG G22CH is relatively expensive regardless of what configuration one has. For gamers looking for some solid 1080p gaming, the $1,399 option comes with an Intel Core i5-13400F, Nvidia GeForce RTX 3060, 16GB DDR5 RAM, and a 512GB SSD. 

That’s definitely a solid choice for anyone looking to play some of the bigger esports games like Fortnite, Rocket League, Call of Duty, or Valorant. Our review configuration came to about $2,299 and for $200 more users can pump up to the Intel Core i9-14900KF, though this isn't necessarily a huge jump in CPU power. 

When it comes to the UK, there's only one option available, which includes an Intel Core i7, Nvidia GeForce RTX 4070, 16GB RAM, and 2TB SSD for £2,099. Australian buyers have two configurations they can buy. Both have an Nvidia GeForce RTX 4070, 32GB DDR5, and a 1TB SSD, but for AU$4,699 you can get an Intel Core i7-14700F configuration, or for AU$4,999 you can get an Intel Core i9-14900KF system.

For good measure, a mouse and keyboard come packed in with all configurations. Serious gamers will probably want to check out the best gaming keyboard and best gaming mouse options though, as the stock peripherals aren't spectacular.

Small form-factor gaming PCs are usually expensive and naturally face issues when it comes to upgrading. The Acer Predator Orion 3000 is the most approachable price-wise, and its lowest configuration is a bit more powerful than the ROG G22CH. Meanwhile, if performance is the main concern regardless of money, the Origin Chronos V3 offers a little upgrade wiggle room, while the Corsair One i300 has the best form factor.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Specs

 The Asus ROG G22CH currently comes in a variety of customizable configurations.  

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Design

  • The case measures 4.53 x 12.72 x 11.30 inches and weighs 18.52lbs
  • An all-black design is accented with two strips of RGB lighting    
  • There's not much room for GPU upgrading

Balancing form and functionality are the most important attributes of a small-sized gaming PC, and the Asus ROG G22CH does a solid job with both. When it comes to design, there’s much to appreciate in terms of the all-black chassis. Having two vertical strips of customizable RGB lighting on the front panel does lend the rig some personality. 

There’s one small stripe on the upper left side and a longer one on the lower right side. Between them is an angular cut alongside the ROG logo. When it comes to ventilation, there’s some form of it on all sides of the ROG G22CH.  Just looking from the front panel, the overall design is really sleek and could give the Xbox Series X a run for its money.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

There are plenty of ports available as well. The top IO panel features two USB-A ports alongside a single USB-C port, a 3.5mm combo jack, and a power button. Unfortunately, that USB-C port is the only one available on this PC. On the back are four USB-A ports split between 2.0 and 3.2, three audio jacks, and a gigabit Ethernet port. That should be more than enough for most PC gamers and creatives, though.

Though upgradability will be tough, the ROG G22CH does somewhat ease the process. Thanks to a tool-free design, a sliding latch allows both side and upper panels to be lifted for access to the internals. Going screwless helps a lot, but outside of possibly the RAM and SSD, fitting a large GPU or attempting to swap out the motherboard in the future is going to be difficult, if not impossible.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Asus ROG G22CH: Performance

  • 1440p performance is spectacular  
  • DLSS can do 4K when needed  
  • Fans will run at full volume   
Benchmarks

Here's how the Asus ROG G22CH performed in our series of benchmarks:

3DMark Speed Way: 4,404; Fire Strike: 34,340; Time Spy: 17,500
GeekBench 6 (single-core): 2,866; (multi-core): 17,650
Total War: Warhammer III (1080p, Ultra): 137 fps; (1080p, Low): 343 fps
Cyberpunk 2077 (1080p, Ultra): 123 fps; (1080p, Low): 162 fps
Dirt 5 (1080p, Ultra): 173 fps; (1080p, Low): 283 fps

Outside of gaming, the Asus ROG G22CH is a phenomenal workhorse for various general and creative tasks. Browsing in Google Chrome while listening to high-fidelity music through Tidal is a perfectly smooth working experience.

The Adobe suite worked totally fine on the G22CH as well. Photoshop was able to handle multi-layer projects with incredibly high-resolution photos without issue, and Premiere Pro made editing 4K videos easy, with speedy export times.

That said, this is a gaming desktop, and it's in gaming where the G22CH really shines.

When it comes to handling the top tier of high-fidelity visuals in gaming, the G22CH can run Cyberpunk 2077, Red Dead Redemption 2, Alan Wake 2, and the like at native 1440p at high frame rates without breaking a sweat. Our Cyberpunk 2077 tests produced an average 123 fps on Ultra settings at 1080p. Bumping up to 1440p with path tracing placed frame rates in the high 90s. With every setting maxed out, Alan Wake 2 ran in the high 60s.

If you want to go up to 4K, you're definitely going to have to rely on Nvidia's DLSS technology, but it's possible with the right settings tweaks.

When it comes to esports-level performance, users can enjoy a serious edge over the competition. Games like Call of Duty: Warzone, Valorant, Counter-Strike 2, and Fortnite were able to pump out frame rates well over 100 fps on high settings, which is more than enough for the best gaming monitors. With more competitive settings, it's easy enough to push past 200 fps.

Just understand that users will know when the G22CH is being pushed to the limit. When playing rounds of Helldivers 2 and Alan Wake 2, the noise from the PC's fans reached around the low 80-decibel mark. This means that headsets are going to be necessary when gaming.

An Asus ROG G22CH on a desk

(Image credit: Future / John Loeffler)

Should you buy the Asus ROG G22CH?

Buy the Asus ROG G22CH if...

Don't buy it if...

How I tested the Asus ROG G22CH

I tested the Asus ROG G22CH over two weeks. During the day, many general computing tasks were done including Google Chrome and Tidal. Having multiple Google Chrome tabs allowed me to use Asana, Google Docs, and Hootsuite. For creating graphics alongside short-form social media video content, I used Adobe Premiere and Photoshop. 

To test high frame rate possibilities, I played games including Call of Duty: Warzone, Valorant, and Fortnite. To see how hard I could push visual fidelity, I tried games including Cyberpunk 2077, Alan Wake 2, and Forza Motorsport (2023).

I've spent the past several years covering monitors alongside other PC components for TechRadar. Outside of gaming, I've been proficient in the Adobe suite for over a decade as well.

Read more about how we test

First reviewed March 2024

AMD Radeon RX 7900 GRE: AMD’s China-only card goes global—and upends the midrange market
7:42 pm | February 28, 2024

AMD Radeon RX 7900 GRE: Two-minute review

The AMD Radeon RX 7900 GRE was originally launched in China back in July 2023, and from what everyone was told, that card was going to be exclusive to that region. 

Well, following the launch of the RTX Super series of GPUs last month, AMD's decided to surprise everyone and launch the RX 7900 GRE globally, starting this week, and it looks primed to upend the midrange GPU market in a pretty huge way.

That's because the RX 7900 GRE (Golden Rabbit Edition) is going on sale starting at $549, which puts it directly in competition with the Nvidia RTX 4070 on price, and undercuts the Nvidia RTX 4070 Super by offering competitive performance for just over 90% of the cost.

To be clear, the card that is being released globally is the same card that has already been available for Chinese consumers, and so it has been extensively benchmarked for months, with much of that data freely available online for everyone to see. 

This has no doubt driven much of the global interest in the RX 7900 GRE since it originally launched back in July, and I fully expect this card to fly off the shelves since it is without question one of the best graphics cards for the midrange you're going to find.

In terms of raw synthetic performance, the RX 7900 GRE follows the familiar AMD-Nvidia pattern where the Radeon card is better at pure rasterization while the GeForce card is the better ray-tracer, but the difference between the RX 7900 GRE and the RTX 4070 Super in ray-tracing performance isn't as wide as it might have been last generation.

What's more, when it comes to gaming, Nvidia's advantage in native ray tracing is overcome by the RX 7900 GRE as soon as you bring upscaling into the mix, which you invariably have to do whenever ray tracing above 1080p is involved.

The RX 7900 GRE is even a much more capable creative card than I was expecting, so long as you're not working with CUDA; for graphic designers, photographers, and video editors, this is a surprisingly powerful GPU for a lot less money than its rivals.

Overall, the AMD Radeon RX 7900 GRE isn't so powerful that it completely knocks out Nvidia's RTX 4070 Super, but it's hitting Nvidia's newest GPU a lot harder than I think Nvidia was expecting so soon after launch. Unfortunately, this does put the only-slightly-cheaper-but-not-as-good AMD Radeon RX 7800 XT in a bit of an awkward position, but for gamers looking to get the best performance for their money, more options are better in this case.

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Price & availability

  • How much does it cost? $549 (about £440/AU$770)
  • When is it available? Available now
  • Where can you get it? Available in the US, UK, and Australia

The AMD Radeon RX 7900 GRE is available starting February 27, 2024, with a US MSRP of $549 (about £440/AU$770). This is the same price as the Nvidia RTX 4070, $50 less than the RTX 4070 Super, and $50 more than the RX 7800 XT.

This launch doesn't include an AMD reference card, so you will need to buy the RX 7900 GRE from third-party partners like ASRock, Gigabyte, Sapphire, and others. The sample I was sent for review is the PowerColor Hellhound RX 7900 GRE, a model line that typically sells for AMD's MSRP or below (when on sale).

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Features & specs

The AMD Radeon RX 7900 GRE is a modified Navi 31 GPU with four fewer compute units than the AMD Radeon RX 7900 XT, as well as slower clock speeds. Its power requirements are also officially lower, at a starting TGP of 260W, though this will vary by which card you go for.

The Radeon RX 7900 GRE also has 16GB GDDR6 VRAM to the RX 7900 XT's 20GB, and while the RX 7900 XT has a 320-bit memory bus, the RX 7900 GRE has a slimmer, but still sizeable, 256-bit bus. With a memory clock of 2,250 MHz (compared to the RX 7900 XT's 2,500 MHz), the RX 7900 GRE comes in with an effective memory speed of 18 Gbps and a memory bandwidth of 576 GB/s, a notable step down from the RX 7900 XT's 20 Gbps and 800 GB/s, respectively.
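
Those memory figures follow directly from the clock and bus width: GDDR6 transfers eight bits per pin per memory-clock cycle, so both the effective speed and the bandwidth can be reproduced in a couple of lines:

```python
# Reproducing the RX 7900 GRE's memory specs from clock and bus width.
# GDDR6 moves 8 bits per pin per memory-clock cycle.
mem_clock_mhz = 2250  # 2,500 for the RX 7900 XT
bus_bits = 256        # 320 for the RX 7900 XT

effective_gbps = mem_clock_mhz * 8 / 1000      # per-pin data rate
bandwidth_gbs = effective_gbps * bus_bits / 8  # whole-bus bandwidth
print(f"{effective_gbps:.0f} Gbps effective, {bandwidth_gbs:.0f} GB/s")
# -> 18 Gbps effective, 576 GB/s (the XT works out to 20 Gbps and 800 GB/s)
```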

Also notable are the two 8-pin power connectors, which won't require you to fuss around with a 16-pin connector like Nvidia's latest graphics cards require you to do, whether that's through an adapter or an ATX 3.0 power supply.

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Design

While there is an AMD reference card for the RX 7900 GRE, AMD has said that global availability will only come through AIB partners, so the design you get with your card will vary by manufacturer.

The card I tested, the PowerColor Hellhound RX 7900 GRE, sports a triple-fan cooler with RGB lighting in the fan. It's a long card to be sure, and even though it's technically a dual-slot card, the shroud makes for a tight fit.

The backplate of the Hellhound RX 7900 GRE has some notable features, like the Hellhound logo, the exposed GPU bracket, and a hole in the backplate opposite the third fan to leave an open path for air to pass over the GPU cooler's heatsink fins to improve cooling efficiency.

An AMD Radeon RX 7900 GRE from PowerColor in a test bench

(Image credit: Future / John Loeffler)

AMD Radeon RX 7900 GRE: Performance

Now we come to the heart of the matter. I can't tell if AMD was inspired by the release of the Nvidia RTX 4070 Super or not, but whatever convinced Team Red to bring the RX 7900 GRE out of China to the rest of the world, midrange gamers everywhere should be grateful because this is easily the best midrange graphics card on the market right now.

RX 7900 GRE synthetic benchmark results

(Image credit: Future / Infogram)

Starting with synthetic benchmarks, the typical rasterization vs ray tracing divide between AMD and Nvidia remains, but as we've seen with other cards this generation, the gap is narrowing. The Nvidia RTX 4070 and RTX 4070 Super definitely pull ahead in terms of raw compute performance, but overall, the RX 7900 GRE is the champ of the under-$600 class.

RX 7900 GRE creative benchmark results

(Image credit: Future / Infogram)

For creative use, the RX 7900 GRE is the strongest rasterizer, but it lags behind Nvidia in video editing and stumbles seriously when it comes to 3D rendering, as seen in Blender Benchmark 4.0.0.

RX 7900 GRE 1080p, 1440p, and 4K gaming benchmark results

(Image credit: Future / Infogram)

When it comes to gaming, though, the RX 7900 GRE is the clear winner among midrange cards, delivering spectacular 1080p and 1440p gaming performance with only slightly worse ray tracing performance than the RTX 4070 Super.

As a 4K graphics card, however, the RX 7900 GRE isn't that far behind the RTX 4070 Ti, with the former getting an average of 55 fps (30 fps minimum) and the latter an average of 63 fps (42 fps minimum). The RTX 4070 Super, meanwhile, only averages 41 fps at 4K, with a minimum of 28 fps.

Ultimately, the RTX 4070 Super can't really be considered among the best 4K graphics cards, but the RX 7900 GRE definitely can, thanks to its wider memory pool and larger memory bus.

Power and temperature benchmarks for the RX 7900 GRE

(Image credit: Future / Infogram)

Of course, this performance comes at the cost of power draw. You can throw the official 260W TGP right out the window here, with the RX 7900 GRE pulling down 302W, but the strong cooling performance on the PowerColor Hellhound card did manage to keep the RX 7900 GRE below 53 degrees Celsius.

Final benchmark results for the RX 7900 GRE

(Image credit: Future / Infogram)

Overall, then, there's just no getting around the fact that the RX 7900 GRE effectively outperforms any other card in the midrange. And despite the RX 7900 GRE falling well short of the RTX 4070-series GPUs in creative workloads overall, it's worth keeping in mind that with Photoshop and similar rasterization-dependent programs, the RX 7900 GRE performs the best, and it doesn't fall too far behind the RTX cards when it comes to video editing.

The weakness of the RX 7900 GRE is that most, if not all, 3D modeling software relies so heavily on Nvidia's CUDA that it badly skews the creative performance averages, which can be somewhat deceptive unless you genuinely need this graphics card for 3D modeling. If that's the case, nothing else matters, and you should go with an RTX 4070-class graphics card despite the RX 7900 GRE's superior performance everywhere else.

How many people will that stipulation ultimately apply to? Not enough to keep the RX 7900 GRE from claiming the crown as the best graphics card in the midrange, and since its final value score comes in just shy of the RX 7800 XT's, there really isn't any reason to opt for any other card right now. The RX 7900 GRE is honestly just that good.

An AMD Radeon RX 7900 GRE from PowerColor on a desk with its retail packaging

(Image credit: Future / John Loeffler)

Should you buy the AMD Radeon RX 7900 GRE?

Buy the AMD Radeon RX 7900 GRE if...

You want the best midrange graphics card
The AMD Radeon RX 7900 GRE is the best overall graphics card for under $600 you can get.

You want to game at 4K
Thanks to the RX 7900 GRE's 16GB VRAM and wide memory bus, you can absolutely game effectively at 4K with this card.

Don't buy it if...

You want the best ray-tracing graphics card
The AMD RX 7900 GRE is a good ray-tracing graphics card, but it's not as good as the RTX 4070 Super.

You do a lot of 3D modeling
If you're a 3D modeling professional (or even a passionate amateur), you need an RTX card, full stop.

AMD Radeon RX 7900 GRE: Also consider

If my AMD Radeon RX 7900 GRE review has you looking for other options, here are two more graphics cards to consider...

How I tested the AMD Radeon RX 7900 GRE

  • I spent about a week with the AMD Radeon RX 7900 GRE
  • I tested its synthetic, creative, and gaming performance
  • I used our standard suite of benchmarks

Test system specs

This is the system I used to test the AMD Radeon RX 7900 GRE

CPU: Intel Core i9-14900K
CPU Cooler: MSI MAG Coreliquid E360 AIO Cooler
RAM: 32GB Corsair Dominator Platinum RGB DDR5-6000
Motherboard: MSI MPG Z790 Tomahawk WiFi
SSD: Samsung 990 Pro 4TB NVMe M.2 SSD
Power Supply: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about a week testing the AMD Radeon RX 7900 GRE, including extensive benchmarking as well as general use of the card in my primary work PC.

I also made sure to benchmark other competing graphics cards in the same class with updated drivers to ensure correct comparable data, where necessary.

I've been reviewing computer hardware for years, and I have tested and retested all of the best graphics cards of the past two generations, so I know very well how a graphics card in a given class ought to perform.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed February 2024
