Huawei nova 14i debuts in Hong Kong
1:51 pm | January 27, 2026

Author: admin | Category: Mobile phones news | Tags: | Comments: Off

Following its announcement over three months ago, Huawei’s nova 14i has made its way to Hong Kong, and we finally have pricing details to go with it. The device will be available from February 6 and will retail for HKD 1,588 ($203) in its 8/256GB trim. The nova 14i comes in blue and black and is actually a rebadged version of the Huawei Enjoy 60X, aka nova Y91, both of which launched back in 2023. The device comes with a 6.95-inch LCD (FHD+, 120Hz), a Snapdragon 680 chipset and a 7,000mAh battery with 22.5W charging. The circular camera island...

I spent days testing Acer’s new 16-inch laptop — and sadly AI remains just a gimmick
12:50 pm |

Author: admin | Category: Computers Computing Gadgets Laptops Windows Laptops | Tags: , | Comments: Off

Acer Aspire 16 AI: Two-minute review

The Acer Aspire 16 AI is a large laptop promising powerful AI features in an elegant body. It certainly looks the part, thanks to the premium materials and finish, as well as the impressively thin chassis. It’s also surprisingly light for a laptop of this size, which further improves its portability.

However, the price paid for this litheness is the somewhat flimsy build quality, falling below the standards of the best laptop constructions. There’s a fair amount of flex to the chassis, while the lid hinge doesn’t offer the greatest stability – although it at least managed to stay planted while I typed.

There are a good number of ports on the Aspire 16 AI, including two USB-C and two USB-A ports. However, the former are located closest to you, a choice I usually lament since it means your power adapter has to cross over with any cable you have plugged in to the USB-A port. It’s also a shame that the card reader is only fit for microSDs.

For day-to-day use, the Aspire 16 AI is very capable. It can handle light productivity and 4K streaming without missing a beat. However, the included AI features are disappointing: they’re either too basic in their functionality or fail to work altogether.

Gaming also proved to be a lackluster experience. Its shared memory GPU can just about handle AAA titles on the lowest settings, and even then you won’t exactly be treated to the smoothest frame rates.

Close-up of camera on Acer Aspire 16 AI on pink background

(Image credit: Future)

Under these kinds of intensive workloads, the Aspire 16 AI can generate a fair amount of heat, but thankfully it’s concentrated underneath, towards the back. Coupled with the hushed fans, the Aspire 16 AI remains comfortable to use in such scenarios.

The display in my review unit, with its OLED technology and 2048 x 1280 resolution, provided a crystal-clear image, rendering colors vividly and delivering high brightness levels. This latter aspect is especially useful for combating reflections, which can be quite prominent.

Thanks to the spacing and satisfying feel of its keys, the keyboard on the Aspire 16 AI is easy to use. However, the number pad keys are too small for my liking, and I wished there was a right Control key, as I find this useful for productivity purposes.

The touchpad is smooth and large, which helps with navigation, but it can get in the way when typing. Also, the one in my review unit felt loose and rattled when clicking, making it awkward to use and suggesting poor quality control.

Battery life is somewhat disappointing, and isn’t a patch on that of the smaller 14 AI. In fact, many of its key rivals can outlast it. I only managed to get roughly nine hours from it when playing a movie on a continuous loop.

On the face of it, the Aspire 16 AI might look like good value, but it doesn’t deliver enough to justify its cost. Its slender form and mostly great display aren’t enough to make up for its drawbacks, while other laptops at this price point offer more complete packages.

Acer Aspire 16 AI review: Price & Availability

  • $649.99 / £799.99 / AU$1,499
  • Available now in various configurations
  • Better value rivals exist

The Aspire 16 AI starts from $649.99 / £799.99 / AU$1,499 and is available now. It can be configured with various processors, including Intel and Qualcomm (ARM) chips, with a couple of storage and RAM options to choose from.

Unfortunately, there are better value laptops out there with more power and performance, better suited to heavier workloads. The Apple MacBook Air 13-inch (M4) is one such example. Its starting price isn’t as low, but it’s similar to the higher-spec models of the Aspire 16 AI. It also has excellent build quality, making it a better value proposition all things considered.

If you want to stick with Windows, the Asus TUF Gaming A16 Advantage Edition is another alternative. Again, it’s similarly priced to the higher-spec variants of the Aspire 16 AI, but offers much better gaming performance, chiefly thanks to its AMD Radeon RX 7600S GPU. It’s no surprise we think it’s one of the best cheap gaming laptops around right now.

  • Value: 3 / 5

Acer Aspire 16 AI review: Specs

Acer Aspire 16 AI Specs (Base Config | Review Config)

Price: $649.99 / £799.99 / AU$1,499 | £949 (about $1,280, AU$1,960)

CPU: Qualcomm Snapdragon X X1-26-100 (8 cores), 3GHz | AMD Ryzen AI 7 350, 2.0GHz (8 cores)

GPU: Qualcomm Adreno GPU (shared memory) | AMD Radeon 860M (shared memory)

RAM: 16GB LPDDR5X (both)

Storage: 512GB PCI Express NVMe 4.0 (M.2) | 1TB PCI Express NVMe 4.0 (M.2)

Display: 16-inch WUXGA (1920 x 1200) 16:10 ComfyView (Matte) 120Hz, IPS | 16-inch WUXGA+ (2048 x 1280) OLED, 16:10, 120Hz

Ports and Connectivity: 2x USB-C (Thunderbolt 4), 2x USB-A, 1x HDMI 2.1, 1x headset jack, 1x microSD, Wi-Fi 7, Bluetooth 5.4 (both)

Battery: 65Wh (both)

Dimensions: 14 x 9.8 x 0.6 inch / 355 x 250 x 16mm (both)

Weight: 3.4lbs / 1.55kg (both)

Acer Aspire 16 AI review: Design

Close-up of keyboard on Acer Aspire 16 AI

(Image credit: Future)
  • Brilliantly thin and light
  • Not the sturdiest
  • Touchpad issues

Thanks to its minimal design, the Aspire 16 AI has sleek looks. The low-shine metallic lid also adds to its elegance, befitting its premium price tag.

It’s pleasingly light and slender, too, making it more portable than you might expect for a 16-inch laptop. The bezel around the display is minuscule as well, which helps maximize the usable screen area.

There’s a satisfying click when you close the lid on the Aspire 16 AI, something I haven’t encountered on any other laptop before. The hinge also allows the screen to recline all the way back by 180 degrees, something I’m always happy to see.

However, lid stability isn’t the best, as it’s prone to wobbling, although, thankfully, it remains stable while typing on the keyboard. The overall construction of the Aspire 16 AI isn’t especially impressive, either, with the chassis having a fair amount of flex.

Close-up of touchpad on Acer Aspire 16 AI

(Image credit: Future)

Worse still, the touchpad in my review unit had a horrible rattle, as if some part was loose at the bottom section. It’s possible this issue is confined to my review unit alone – perhaps it had been passed around several journalists before it got to me – but the issue still doesn’t speak highly of its build quality or Acer's quality control.

There’s a varied selection of ports on the Aspire 16 AI, spread evenly across both sides. On the left are two USB-C ports, one USB-A port, and an HDMI port. However, I found it inconvenient that the USB-C ports are placed nearest to you, since one has to be used for the power adapter; I much prefer the thick cable for this to trail from the back of the laptop, rather than from the middle, as it does with the Aspire 16 AI.

On the right you’ll find another USB-A port, followed by a combo audio jack and a microSD card reader. It’s a shame the latter can’t accommodate standard SD card sizes, but this is a small grievance.

  • Design: 3.5 / 5

Acer Aspire 16 AI review: Performance

Close-up of left-side ports on Acer Aspire 16 AI, on desk with pink background

(Image credit: Future)
  • Good productivity and streaming performance
  • Poor for gaming
  • Useless AI features

The Aspire 16 AI offers great general performance. It takes light productivity in its stride, from word processing to spreadsheet creation, and multiple browser tabs didn’t cause a problem for me, thanks to the 16GB of RAM in my review unit. Streaming 4K content is well within its grasp, too. I experienced little buffering or slow down, providing a seamless viewing experience in the main.

However, despite what Acer claims, the gaming performance of the Aspire 16 AI is quite poor. With its shared memory, the AMD Radeon GPU didn’t handle AAA titles very well. When I played Cyberpunk 2077 with the default Ray Tracing: Low preset and resolution scaling set to Performance mode, I was getting 20fps on average – not what you’d call playable.

The best I could achieve with the game was about 38fps, but that was at the lowest possible graphics preset and the resolution dropped to 1080p. This at least made it playable, but if you’re expecting to get even moderately close to the performance of the best gaming laptops, you’ll be sorely disappointed.

During my playtime, the Aspire 16 AI generated a fair amount of heat. Fortunately, this was heavily concentrated underneath and at the back, thus steering clear of any parts you might actually touch. Fan noise is also pleasantly subdued.

As when I tested the 14 AI, the AI features the Aspire 16 AI sports are disappointing. The centerpiece appears to be Acer LiveSense, a photo editing and webcam suite with very basic functionality, not to mention a poor UI and frequent glitches.

For more AI features, you’ll have to download Acer Intelligence Space, the brand’s hub. Contrary to when I tested the 14 AI, I managed to install it successfully. However, it didn’t get off to an auspicious start, as a dialog box warned me that I had insufficient memory resources, explaining that it needed 6.5GB free and a total of at least 16GB to execute smoothly.

Close-up of right-hand ports on Acer Aspire 16 AI on table

(Image credit: Future)

I proceeded anyway and was greeted with a clear user interface that revealed the various AI apps I could install. However, a large portion of them seem to be incompatible with the Aspire 16 AI, and those that are compatible were once more very limited in their functionality.

On a more positive note, the 2K OLED display in my review unit was as clear and as vibrant as you might expect. The very shiny coating can cause prominent reflections, but these can be mitigated by the screen’s brightness values (especially if you disable the ‘change brightness based on content’ setting).

The keyboard feels premium, too, thanks to the subtle texture and tight fit of the keys themselves. They’re also light, tactile, and reasonably spaced, although perhaps not to the extent of other laptop keyboards. I didn’t find this aspect to be a problem when typing, but I did while gaming, as it made adopting the WASD position more uncomfortable for me.

At least the number pad doesn’t eat into the layout space. However, contrary to many full-sized laptop keyboards I’ve experienced, it’s the number pad itself that feels cramped, with its keys being too narrow to be used easily. Another small but notable gripe I have with the keyboard is the absence of a right Control key, which can be frustrating when performing productivity tasks.

The touchpad performs well enough, with its large and smooth surface making for easy navigation. However, thanks to the aforementioned rattle in its bottom portion, clicks felt unpleasant. It can also get in the way while typing: on occasion, the base of my thumb would activate the cursor, although thankfully not clicks or taps.

  • Performance: 3.5 / 5

Acer Aspire 16 AI review: Battery Life

Back of Acer Aspire 16 AI open on table with pink background

(Image credit: Future)
  • Average battery life
  • 14 AI battery life much better
  • Other rivals are better, too

The battery life of the Aspire 16 AI isn’t particularly impressive. It lasted just over nine hours in our movie playback test, which is a middling result. This is a far cry from the time achieved by the 14 AI, which lasted over twice as long, making the Aspire 16 AI even more disappointing by comparison.

What’s more, plenty of its rivals can beat this score, including the Microsoft Surface Laptop 13-inch, which managed over 17 hours, and the Asus TUF Gaming A16 Advantage Edition, which lasted 11 hours.

  • Battery Life: 3.5 / 5

Should I buy the Acer Aspire 16 AI?

Acer Aspire 16 AI Scorecard

Value (3 / 5): Starting prices are low, but climb up the specs and the value starts to diminish.

Design (3.5 / 5): Build quality isn’t the best, but it’s impressively thin and light. It looks good, too.

Performance (3.5 / 5): Everyday tasks are dispatched without a hitch, but it can’t cope well with heavier demands, such as gaming. The display is very good, though.

Battery Life (3 / 5): Only average, and the smaller 14 AI absolutely obliterates it on this front.

Total (3 / 5): The Aspire 16 AI is a capable workhorse, but its poor GPU, underwhelming AI features, and suspect build quality result in a middling machine.

Buy the Acer Aspire 16 AI if...

You want a large and bright display
The 16-inch OLED on my model looked great, and its high brightness can overcome its reflective nature.

You want something portable
Despite its large size, the Aspire 16 AI is impressively light and thin, making it easy to carry around.

Don't buy it if...

You’ll be running graphics-intensive apps
The Aspire 16 AI could barely handle AAA gaming at modest settings, saddled as it is with a shared memory GPU.

You want a super-sturdy machine
There’s plenty of flex in the body, and the seemingly broken touchpad on my particular unit was disconcerting.

Acer Aspire 16 AI review: Also Consider

Asus TUF Gaming A16 Advantage Edition
If you’re after more graphical power but don’t want to spend more for it, the TUF Gaming A16 Advantage Edition might be the solution. It comes equipped with an AMD Radeon RX 7600S GPU, which is capable of handling AAA titles smoothly, although you may have to forgo Ray Tracing. Read our full Asus TUF Gaming A16 Advantage Edition review.

Apple MacBook Air 13-inch (M4)
Unusually for an Apple product, this MacBook Air is actually a great budget pick if you’re after a powerful machine, being among the best laptops for video editing for this reason. Its sumptuous design and display are additional feathers in its creative cap. Read our full Apple MacBook Air 13-inch (M4) review.

How I tested the Acer Aspire 16 AI

  • Tested for several days
  • Used for various tasks
  • Plentiful laptop reviewing experience

I tested the Aspire 16 AI for several days, during which time I used it for various tasks, from productivity and browsing to streaming and gaming.

I also ran our series of benchmark tests to assess its all-round performance more concretely, and played a movie on a continuous loop while unplugged to see how long its battery lasted.

I have been using laptops for decades, and have reviewed a large and varied selection of them too, ranging in their form factors, price points, and intended purposes.

  • First reviewed: January 2026
  • Read more about how we test
Galaxy A07 5G India pricing tipped ahead of launch
11:30 am |

Author: admin | Category: Mobile phones news | Comments: Off

Samsung is gearing up to unveil the Galaxy A07 5G, and a tipster has now revealed the phone’s expected pricing in India. The phone was recently spotted on the official Samsung website in Myanmar. According to the tipster, the Galaxy A07 5G will be sold in two variants in India. The base option with 4GB of RAM and 128GB storage is said to be priced at INR 15,999 ($175), whereas the 6GB/128GB model could carry a price tag of INR 17,999 ($195). The official listing for the handset revealed that it will sport a 6.7-inch LCD display with a 120Hz refresh rate, a...

iQOO 15R launch scheduled for next month
10:30 am |

Author: admin | Category: Mobile phones news | Comments: Off

iQOO has been teasing the 15R for the Indian market for the past few weeks, and the company has now officially confirmed the phone’s India launch date. The iQOO 15R will debut in India on February 24, the company confirmed in an X post. Although the brand hasn’t revealed any details about the upcoming phone, recent leaks have suggested that it could be powered by the Snapdragon 8 Gen 5 chipset. The phone will run Android 16-based OriginOS and offer a dual-rear camera setup. The handset appears to be a rebrand of the iQOO Z11 Turbo, which was unveiled in China earlier this...

vivo X200T is here with triple 50MP camera, Dimensity 9400+ SoC
9:30 am |

Author: admin | Category: Mobile phones news | Comments: Off

Vivo's X200 series isn't done yet - meet the vivo X200T, a more affordable, yet flagship-grade phone with proper Zeiss cameras, a high-end build, and a big battery with fast charging. The vivo X200T launches in India and comes in Stellar Black and Seaside Lilac. The phone's body pairs a premium metal frame with glass panels on either side, and offers both IP68 and IP69 dust and water protection. Camera-wise, the X200T has a triple 50MP Zeiss system. The main unit is a wide-angle with a 50MP 1/1.56-inch LYT702 sensor. Then, there's a 15mm autofocusing...

Oppo Find X9s’ live image surfaces
8:31 am |

Author: admin | Category: Mobile phones news | Comments: Off

Oppo's Find X9 series currently includes the Find X9 and Find X9 Pro. The brand is expected to add two more members to the lineup this year, dubbed the Find X9s and Find X9 Ultra. While Oppo hasn't announced the launch date of the Find X9s yet, a live image of the alleged Find X9s has surfaced online, showing a rear camera arrangement similar to the vanilla and Pro models. [#InlinePriceWidget,14094,1#] This picture, included below, was shared by a Weibo user. The Find X9s' design is concealed with a case, but it confirms the smartphone will feature three rear...

Meet the Motorola Edge 70 Fusion, starring in leaked renders
6:01 am |

Author: admin | Category: Mobile phones news | Comments: Off

The next member of Motorola's ever-expanding Edge 70 family will be the Edge 70 Fusion, which ran Geekbench last week and had its specs leaked before that. Today, it's the star of newly leaked official-looking renders. It is portrayed in two colorways: Pantone Country Air (the lighter one) and Pantone Silhouette (the darker one). Aside from these hues, the phone is also expected to be offered in Pantone Blue Surf, Pantone Orient Blue, and Pantone Sporting Green. The Edge 70 Fusion's back is apparently inspired by nylon and...

My hands-on experience of the Asus Ascent GX10 was a radical one that is only relevant to those actively engaged in AI development
4:05 am |

Author: admin | Category: Computers Computing Gadgets | Tags: , | Comments: Off

Rather than a review, this is a ‘hands-on’ in which I’ve explored what the Asus Ascent GX10 offers, providing information that might be critical to those considering purchasing one.

The first important piece of information about this hardware is that this isn’t a PC, or rather, it's not an Intel, AMD or x86-compatible platform that can run Windows.

It’s built around ARM technology, much like modern phones and tablets, although its ARM technology has been scaled up to work with massively powerful Nvidia Blackwell technology that is capable of 1 petaFLOP of AI performance using FP4.

This has all been shoehorned into a gorgeously engineered 150mm-square, 51mm-high form factor that resembles an oversized NUC.

The system can be used directly by attaching a mouse, keyboard, and screen, but it’s also intended to be used in a headless mode from another system, which might explain why it comes with relatively modest onboard storage.

What this system doesn’t allow for is much expansion, at least internally. The inclusion of a special networking connection, the Nvidia ConnectX-7 port, does allow another Ascent GX10 node to be stacked on top, doubling the amount of processing power and the price.

The platform that runs the integrated AI software stack is Ubuntu Linux, so familiarity with that might be useful for those wanting to work directly on it.

As anyone working in AI can already attest, nothing to do with this type of development is cheap, and the Asus Ascent GX10 is more than $3000 for a single node.

But given the expertise needed to use this hardware and the associated developer costs, this AI-focused hardware might be the least expensive part of any project. And, with memory costs rising dramatically, a system with 128GB of LPDDR5X onboard might be significantly more expensive by the end of 2026 than it is at the start of it.

Asus Ascent GX10: Price and availability

  • How much does it cost? From $3090, £2800
  • When is it out? Available now
  • Where can you get it? From online retailers.

The ASUS Ascent GX10 isn’t available directly from Asus, but it's easy to find at many online retailers, including Amazon.

For US readers, the price on Amazon.com is $3099.99 for the 1TB storage SKU (GX10-GG0015BN), and $4,149.99 for the 4TB storage model (GX10-GG0016BN).

Given that a 4TB Gen 5 SSD is about $500, that is a remarkable price hike for the extra storage capacity.

For UK readers, on Amazon.co.uk the 1TB model price is £3,769, but I found it via online retailer SCAN for a more palatable £2,799.98. SCAN also carries a 2TB option for £3,199.99 and the 4TB model for £3,638.99.

An important detail about this platform is that the hardware inside the GX10 isn’t exclusive to Asus: Nvidia GPUs are (in theory) available across a number of brands, and Nvidia has its own model.

The Nvidia DGX Spark Personal AI Supercomputer, as the originator modestly calls it, costs £3699.98 in the UK, for a system with 128GB of RAM and 4TB of storage.

Acer offers the Veriton AI GN100, which bears an uncanny visual resemblance to the Asus but comes with 4TB of storage, like the Nvidia option. This is £3999.99 direct from Acer in the UK, but only $2999.99 from Acer in the US.

Another choice is the Gigabyte AI TOP ATOM Desktop Supercomputer, a 4TB storage model that sells for £3479.99 from SCAN in the UK, and can be found on Amazon.com for $3999.

And the final model with the same spec as most is the MSI EdgeXpert Desktop AI Supercomputer, selling for £3,598.99 from SCAN in the UK, and $3999 on Amazon.com for US customers.

Overall, the prices of all these products are roughly in the same ballpark, but the Asus in its 1TB configuration is one of the cheaper choices, especially for those in Europe.

Asus Ascent GX10 AI Supercomputer

(Image credit: Mark Pickavance)

Asus Ascent GX10: Specs

CPU: ARM v9.2-A (GB10), 20 ARM cores (10 Cortex-X925, 10 Cortex-A725)

GPU: Nvidia Blackwell GPU (GB10, integrated)

RAM: 128GB LPDDR5X, unified system memory

Storage: 1TB M.2 NVMe PCIe 4.0 SSD

Expansion: N/A

Ports: 3x USB 3.2 Gen 2x2 Type-C (20Gbps, DisplayPort 2.1 Alt Mode), 1x USB 3.2 Gen 2x2 Type-C with PD in (180W, EPR PD 3.1), 1x HDMI 2.1, 1x Nvidia ConnectX-7 SmartNIC

Networking: 10GbE LAN, AW-EM637 Wi-Fi 7 (Gig+), Bluetooth 5.4

OS: Nvidia DGX OS (Ubuntu Linux)

PSU: 48V 5A, 240W

Dimensions: 150 x 150 x 51mm (5.91 x 5.91 x 2.01 inch)

Weight: 1.48kg

Asus Ascent GX10: Design

  • Uber NUC
  • Connect-7 scalability
  • Limited internal access

Asus Ascent GX10 AI Supercomputer

(Image credit: Mark Pickavance)

While the GX10 looks like an oversized NUC mini PC, at 1.48kg it's heavier than any I’ve previously encountered. And that doesn’t include the substantial 240W PSU.

The front is an elegant grill with only the power button for company, and all the ports are on the rear. These include four USB-C ports, one of which is required for the PSU to connect, a single 10GbE LAN port and a single HDMI 2.1 video out.

You can connect more than one monitor by using the USB 3.2 Gen 2x2 ports in DP Alt mode, if you have the adapters to convert those into DisplayPort.

What seems mildly odd is that Asus went with three USB 3.2 Gen 2x2, a standard that was an effective dead end in USB development, and not USB4. And, there are no Type-A USB ports at all, forcing the buyer to use an adapter or hub to attach a mouse and keyboard to this system.

As mice and keyboards are still mostly USB-A, that’s slightly irritating.

But what makes this system truly interesting is the inclusion of a ConnectX-7 Smart NIC alongside the more conventional 10GbE Ethernet port.

The best the 10GbE LAN port can offer is a data transfer rate of around 840MB/s, which is technically slower than the USB ports, even if it's quick by networking standards.
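For context, a quick back-of-envelope conversion shows why that figure trails the USB ports. This is an illustrative Python sketch, not a benchmark: the 840MB/s value is the real-world figure observed above, while the other numbers are theoretical line rates.

```python
# Convert link speeds (Gbit/s) to MB/s for a rough comparison.
# These are theoretical maxima; real throughput is lower due to protocol
# overhead, which is why 10GbE lands at ~840 MB/s in practice.

def gbit_to_mb_per_s(gbit_per_s: float) -> float:
    """Convert gigabits per second to megabytes per second (1 MB = 10^6 bytes)."""
    return gbit_per_s * 1000 / 8

ten_gbe = gbit_to_mb_per_s(10)    # 10GbE LAN: 1250 MB/s theoretical
usb_20g = gbit_to_mb_per_s(20)    # USB 3.2 Gen 2x2: 2500 MB/s theoretical
observed_10gbe = 840              # MB/s, measured in testing

print(f"10GbE: {ten_gbe:.0f} MB/s theoretical, {observed_10gbe} MB/s observed")
print(f"USB 3.2 Gen 2x2: {usb_20g:.0f} MB/s theoretical")
```

Even the observed 10GbE figure sits well below the USB ports' 20Gbps ceiling, which is the point being made here.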

The ConnectX-7 port is a technology developed by Mellanox Technologies Ltd, an Israeli-American multinational supplier of computer networking products based on InfiniBand and Ethernet technology that was acquired by Nvidia in 2019.

In this context, ConnectX-7 provides a means to link a second GX10 directly over a 200 Gbit/s (25 GB/s) InfiniBand network, enabling performance scaling across the two systems.

There are parallels between this technology and the era when Nvidia enabled two GPUs to work in unison over a dedicated interconnect (SLI). The ConnectX-7 interface is a much more sophisticated option, however: both processing and memory can be pooled across nodes, enabling the handling of large-scale models with over 400 billion parameters. That's double the 200 billion that a single unit can cope with.
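Those parameter figures line up with simple memory arithmetic. At FP4 precision each weight occupies half a byte, so a rough sketch (ignoring activation and KV-cache overhead, which is why the practical limits quoted are lower than the raw ceilings) looks like this:

```python
# Back-of-envelope: how many FP4 weights fit in unified memory?
# Assumes 0.5 bytes per parameter; activations and KV cache are ignored,
# so the practical limits quoted in the text (~200B/~400B) sit below
# these raw ceilings.

BYTES_PER_FP4_PARAM = 0.5

def max_params_billions(memory_gb: float) -> float:
    """Raw parameter ceiling (in billions) for a given memory budget at FP4."""
    return memory_gb * 1e9 / BYTES_PER_FP4_PARAM / 1e9

single_node = max_params_billions(128)   # one GX10 with 128GB unified memory
dual_node = max_params_billions(256)     # two GX10s linked over ConnectX-7

print(f"Single node ceiling: {single_node:.0f}B params (practical: ~200B)")
print(f"Dual node ceiling: {dual_node:.0f}B params (practical: ~400B)")
```

Doubling the nodes doubles the memory pool, which is where the 200B-to-400B jump comes from.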

Asus Ascent GX10 AI Supercomputer

(Image credit: Mark Pickavance)

Mellanox does make ConnectX switches, but I’m not sure whether it's possible to connect more than two GX10 units via one of those. Being realistic, each system is still only capable of 200 Gbit/s communication, so adding nodes beyond two might offer diminishing returns. But this technology is utilised in switched fabrics for enterprise data centres and high-performance computing, and in these scenarios, the Mellanox Quantum family of InfiniBand switches supports up to 40 ports running at HDR 200 Gbit/s.

It may be that products like the GX10 will be the vanguard for the wider use and application of ConnectX technology, and a blueprint for easily expandable clusters.

However, the last aspect I looked at on the GX10 was a disappointment, and it was the only nod to upgradability that this system has, beyond adding a second machine.

On the underside of the GX10 is a small panel that can be removed to provide access to the one M.2 NVMe drive that this system supports.

In our review unit, the slot was occupied by a single 2242 M.2 PCIe 4.0 1TB drive, although you can also get this system with 4TB. The fact that there isn’t room for a 2280 drive is a shock, because it effectively limits the maximum internal storage to 4TB.

But conversely, the only other of these types of systems I’ve seen, the Acer GN100 AI Mini Workstation, has no access to the internal storage at all. So perhaps Asus Ascent GX10 owners should be thankful for small mercies.

Asus Ascent GX10: Features

  • ARM 20-core CPU
  • Grace Blackwell GB10
  • AI platforms compared

The Nvidia GB10 Grace Blackwell Superchip represents a significant leap in AI hardware, emerging from a collaborative effort between Nvidia and ARM. Its origins lie in the growing demand for specialised computing platforms capable of supporting the rapid development and deployment of artificial intelligence models. Unlike traditional x86-based systems, the GB10 is built around ARM v9.2-A architecture, featuring a combination of 20 ARM cores—specifically, 10 Cortex-X925 and 10 Cortex-A725 cores. This design choice reflects a broader industry trend towards ARM-based solutions, which offer improved efficiency and scalability for AI workloads.

The GB10’s capabilities are nothing short of remarkable. It integrates a powerful Nvidia Blackwell GPU paired with the ARM CPU, delivering up to a petaFLOP of AI performance using FP4 precision. This level of computational power is particularly suited to the training and inference of large language models (LLMs) and diffusion models, which underpin much of today’s generative AI. The system is further enhanced by 128GB of unified LPDDR5x memory, ensuring that even the most demanding AI tasks can be handled efficiently.

The GB10’s operating environment is based on Ubuntu Linux, specifically tailored with NVIDIA’s DGX OS, making it an ideal platform for developers familiar with open-source AI tools and workflows.

There is an exceptionally fine irony to this OS choice, since Nvidia’s hardly been a friend to Linux over the past three decades, and has actively obstructed its attempts to compete more widely with Microsoft Windows. If anyone doubts my opinion on the relationship between Linux and Nvidia, then search for “Linus Torvalds” and “Nvidia”. Recently, Linus has warmed to the company, but much less to Nvidia CEO Jensen Huang. And, he’s not a fan of the AI industry, which he described as "90% marketing and 10% reality".

Looking to the future, the evolution of the GB10 and similar superchips will likely be shaped by the ongoing arms race in AI hardware. As models grow ever larger and more complex, the need for even greater memory bandwidth, faster interconnects, and more efficient processing architectures will drive innovation. The modularity offered by technologies like ConnectX-7 hints at a future where AI systems can be scaled seamlessly by linking multiple nodes, enabling the handling of models with hundreds of billions of parameters.

In terms of raw AI performance, the GB10 delivers up to 1 petaFLOP at FP4 precision, which is heavily optimised for quantised AI workloads. While this is less than the multi-petaFLOP performance of NVIDIA’s flagship data centre chips (such as the Blackwell B200 or GB200), the GB10’s power efficiency is a standout. It operates at around 140W TDP, far lower than the 250W or more seen in GPUs like the RTX 5070, yet offers vastly more memory (128GB vs 12GB on the 5070). This makes the GB10 especially suitable for developers and researchers who need to work with large models locally, without the need for a full server rack.
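That efficiency claim can be made concrete with some rough arithmetic using only the headline numbers quoted above; treat this as an illustrative sketch rather than a measured comparison.

```python
# Rough compute-per-watt and memory figures from the numbers in the text:
# GB10: ~1 petaFLOP FP4 at ~140W TDP, 128GB unified memory.
# RTX 5070 (used here only for the memory and power comparison): ~250W, 12GB.

gb10_tflops = 1000.0   # 1 petaFLOP = 1000 TFLOPS (FP4)
gb10_watts = 140.0
gb10_mem_gb = 128
rtx5070_watts = 250.0
rtx5070_mem_gb = 12

flops_per_watt = gb10_tflops / gb10_watts   # ~7.1 TFLOPS/W at FP4
mem_ratio = gb10_mem_gb / rtx5070_mem_gb    # ~10.7x the RTX 5070's memory

print(f"GB10: {flops_per_watt:.1f} TFLOPS/W (FP4), "
      f"{mem_ratio:.1f}x the RTX 5070's memory at "
      f"{gb10_watts / rtx5070_watts:.0%} of its power")
```

The takeaway is the same as in the paragraph above: the GB10 trades peak throughput for a far larger memory pool at a much lower power budget.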

Asus Ascent GX10 AI Supercomputer

(Image credit: Mark Pickavance)

While there are some other players hidden in the shadows, mostly Chinese, the key AI players are Nvidia, AMD, Google and Apple.

NVIDIA has the Blackwell B200/GB200 products for datacenter flagships, offering up to 20 petaFLOPS of sparse FP4 compute and massive HBM3e memory bandwidth. These are massively expensive enterprise products, and the GB10, by contrast, is a scaled-down, more accessible version for desktop and edge use, trading some peak performance for efficiency and compactness.

AMD's line of AI accelerators is the Instinct MI300/MI350 series. These are competitive in raw compute and memory bandwidth, with the MI350X offering up to 288GB of HBM3e and strong FP4/FP6 performance, but they don’t offer the same level of flexibility as the GB10, even if they’re better suited to inference tasks. The same can be said for Google’s TPU v6/v7, a technology that is highly efficient for large-scale inference but optimised for Google’s own cloud and AI services.

Apple’s M3/M4/M5 and edge AI chips, meanwhile, are optimised for on-device AI in consumer products, with impressive efficiency and integrated neural engines. However, these chips are not designed for large-scale model training or inference, and their memory and compute capabilities are far below what the GB10 offers for professional AI development.

The NVIDIA GB10 Grace Blackwell Superchip stands out as a bridge between consumer AI hardware and data centre accelerators. It offers a unique blend of high memory capacity, power efficiency, and local accessibility, making it ideal for developers and researchers who need serious AI capability without the scale or cost of a full server. While it cannot match the absolute peak performance of the largest data centre chips, its unified memory, advanced interconnects, and software support make it a compelling choice for cutting-edge AI work at the desktop.

However, that statement does assume that current AI is a path worth taking.

Asus Ascent GX10: AI Reality Check

Asus Ascent GX10 AI Supercomputer

(Image credit: Mark Pickavance)

Looking at the specifications of the Asus Ascent GX10, it's easy to be impressed by how much computing power Asus, with the help of Nvidia, has squeezed into a tiny computer, and by its ability to scale.

However, there are three practical little pigs living in this AI straw house, and in this story, I’m the wolf.

Those researching AI might think I’m referring to the three issues that confront all public AI implementations: algorithmic bias, lack of transparency (aka explainability), and the significant ethical and societal risks associated with the spread of misinformation. But I’m not, because those are at least potentially fixable to a degree.

Instead, I’m talking about the three unfixable issues with current models.

Almost every AI platform is based on a concept called the Deep Neural Net, and under that umbrella sit two approaches: LLMs (Large Language Models) and Diffusion models, the latter being the ones that generate images and video.

What both sides of the Deep Neural Net coin share is a pattern-matching approach to problems, as if the computer were playing a complex version of the children’s card game Snap. The results are coloured by the scale of the data and how quickly the routines and hardware platforms find the patterns.

Before IBM made computers, it sold punched-card filing systems, built on the idea that cards made it quicker to navigate to the information you wanted.

It’s a generalisation, but these models are essentially more sophisticated versions of that card file: if the pattern they’re looking for doesn’t exist in the data, the routine can’t conjure it from inspiration.

To make the results seem less random, model designers have tried to specialise their AI constructs on narrower criteria, but the nirvana of AGI (Artificial General Intelligence) is an AI that is generally applicable to almost any problem.

How this issue manifests in AI responses is that, when confronted with a pattern the routine can’t match accurately, it simply offers up the partial matches it found, which may or may not be related at all.

These ‘hallucinations’, as they’re often called, reflect a choice model makers face: either the AI admits it has no idea what the answer is, or it delivers a response with a remarkably low probability of being correct. Given that AI companies don’t like the idea of their models admitting they haven’t a clue, hallucinations are deemed preferable.

Asus Ascent GX10 AI Supercomputer

(Image credit: Mark Pickavance)

Perhaps part of the problem here is not the AI, but that users aren’t trained to check what the AI produces, an argument that isn’t entirely spurious.

The next issue is what might be called conversational drift (often conflated with ‘prompt injection’, though that term properly describes a security attack): you ask a question and then, often based on the response, realise you asked the wrong one and proceed in an entirely different direction. The AI doesn’t recognise this pivot, tries to apply its previous pattern constructions to the new problem, and becomes entirely confused.

And the final piglet, where current AI falls down entirely, might be classed as original thinking: what the user wants is a new approach to a problem that hasn’t been documented before. What makes humans especially impressive thinkers is their ability to abstract, and that is something current AI doesn’t do, even modestly.

While the conversational-pivot problem can probably be solved, the other two issues, generalisation and abstraction, are unlikely to be fixed within the Deep Neural Net paradigm; they need a radically new approach, and ironically, not one that AI is likely to come up with.

Some of you reading this will be wondering why I’ve inserted this information into this product reveal, but the whole purpose of the Asus Ascent GX10 is to facilitate the design and testing of LLMs and Diffusion models, and at this time, these have significant limitations.

But critically, the development of the whole Deep Neural Net approach doesn’t appear to offer a resolution to some of its more problematic issues, which suggests it might ultimately be a dead end.

It might turn out to be useful for lots of problems, but it's not the AI we’re looking for, and the likelihood of it evolving into that true artificial intelligence is extremely low.

This is especially relevant to the Asus Ascent GX10, since it doesn’t have a practical purpose beyond the creation of models, as it’s not a PC.

These aren’t all the issues associated with AI, but they’re some of the ones that might directly impact those buying the GX10, at some point or another.

Asus Ascent GX10: Early verdict

Asus Ascent GX10 AI Supercomputer

(Image credit: Mark Pickavance)

It’s exciting to see Asus make something this radical, showing that it truly believes in a post-Windows, post-PC future where hardware is purely specified for a specific task, and in the case of the Asus Ascent GX10, that’s AI model development.

I’ve already covered the caveats regarding that subject, so for the purpose of this conclusion, let's pretend that AI is the solid bet some think it is, and not the underachieving dead end others believe it to be.

For corporations, the cost of this hardware won’t be an obstacle to letting their IT people gain experience building AI models and evaluating their worth.

The beauty of a system like the GX10 is that it’s a finite cost, unlike buying access to an AI data centre cluster, which is an ongoing expense that’s likely to rise if demand is high. While the data centre might still be needed for the biggest projects, or for deployment, the GX10 provides a first rung on the ladder for any proof of concept.
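That finite-versus-ongoing cost argument is easy to sketch as a break-even calculation. Both figures below are hypothetical placeholders, since neither the GX10's street price nor any specific cloud rental rate is stated here; substitute real quotes before drawing conclusions:

```python
# Rough break-even sketch: a one-off GX10 purchase vs renting comparable
# cloud GPU time. Both prices are HYPOTHETICAL placeholders.

gx10_price_usd = 3000.0        # hypothetical one-off hardware cost
cloud_rate_usd_per_hour = 2.0  # hypothetical comparable GPU rental rate

# Hours of cloud rental that would cost as much as buying outright.
break_even_hours = gx10_price_usd / cloud_rate_usd_per_hour
break_even_workdays = break_even_hours / 8  # assuming 8-hour workdays

print(f"Break-even after {break_even_hours:.0f} GPU-hours "
      f"(~{break_even_workdays:.0f} workdays of continuous use)")
```

On these placeholder numbers the hardware pays for itself after 1,500 GPU-hours; teams iterating on models daily cross that threshold quickly, while occasional experimenters may be better served by rental.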

However, if the AI path is not the one that is ultimately taken, this machine becomes mostly a beautifully engineered paperweight.

For more compact computing, see our guide to the best mini PCs you can buy

Apple releases iOS 26.2.1 and iPadOS 26.2.1
4:01 am |

Author: admin | Category: Mobile phones news | Comments: Off

Today, Apple has released iOS 26.2.1 and iPadOS 26.2.1. While these are minor incremental versions of the software as the numbers imply, they do bring one significant new feature to both iPhones and iPads. That is support for Apple's second-generation AirTag tracker, which was announced today. This comes with a new UWB chip, which supports Precision Finding. That can guide you to your tracked item from 50% farther away. The new AirTag also has an upgraded Bluetooth chip, with expanded range, and a 50% louder speaker. And with iOS 26.2.1 and iPadOS 26.2.1, you can now control the...
