Apple’s new M3 chips are built on the 3 nm process, major GPU improvements in tow
8:00 am | October 31, 2023

Apple’s "Scary Fast" event saw the arrival of the company’s new chips for personal computers - the M3, M3 Pro, and M3 Max. They will be available in the 14” and 16” MacBook Pro laptops, while the entry-level M3 also comes to the 24” iMac. The biggest improvement is the GPU, which aims to boost the performance of professional apps and games. It supports hardware-accelerated ray tracing and mesh shading, a first for Apple silicon. Another first is the 3 nm process, as Cupertino is the first to implement the technology in chips for personal computers. The story is developing…

Intel Arc A770 review: a great 1440p graphics card for those on a budget
4:00 pm | October 16, 2023

Intel Arc A770: One-minute review

The Intel Arc A770 has had quite a journey since its release back on October 12, 2022, and fortunately, it has been a positive one for Intel despite a somewhat rocky start.

Right out of the gate, I'll say that if you are looking for one of the best cheap graphics cards for 1440p gaming, this card definitely needs to be on your list. It offers great 1440p performance in most of the modern PC titles most of us are going to be playing, and it's priced very competitively against its rivals.

Where the card falters, much like with my Intel Arc A750 review earlier this year, is with older DirectX 9 and DirectX 10 titles, and this really does hurt its overall score in the end. That's a shame, because for games released in the last five or six years, this card is going to surprise a lot of people who might have written it off even six months ago.

Intel's discrete graphics unit has been working overtime on its driver for this card, providing regular updates that continue to improve performance across the board, though some games benefit more than others. 

Naturally, a lot of emphasis is going to be put on more recently released titles. And even though Intel has been paying attention to shoring up support for older games as well, if you're someone with an extensive back catalog of DX9 and DX10 titles from the mid-2000s that you regularly return to, then this is not the best graphics card for your needs. Nvidia and AMD drivers carry a long legacy of support for older titles that Intel will honestly never be able to match.

But if you're looking for the best 1440p graphics card to play the best PC games of the modern era and you're not about to plop down half a grand on a new GPU, then the Intel Arc A770 is going to be a very solid pick with a lot more to offer than many will probably realize.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Price & availability

  • How much is it? US MSRP for 16GB card: $349 (about £280/AU$510); for 8GB card: $329 (about £265/AU$475)
  • When was it released? It went on sale on October 12, 2022
  • Where can you buy it? Available in the US, UK, and Australia

The Intel Arc A770 is available now in the US, UK, and Australia, with two variants: one with 16GB GDDR6 VRAM and an official US MSRP of $349 (about £280/AU$510), and one with 8GB GDDR6 VRAM and an official MSRP of $329 (about £265/AU$475).

Those are the launch MSRPs from October 2022, of course, and the cards have come down considerably in price in the year since their release; you can now find either card for about 20% to 25% less than that. This is important, since the Nvidia GeForce RTX 4060 and AMD Radeon RX 7600 are very close to the 16GB Arc A770 cards in terms of current prices, and they offer distinct advantages that may sway potential buyers toward those cards instead.

But those decisions are not as cut and dried as you might think, and Intel's Arc A770 holds up very well against modern midrange offerings, despite really being a last-gen card. And, currently, the 16GB variant is the only 1440p card that you're going to find at this price, even among Nvidia and AMD's last-gen offerings like the RTX 3060 Ti and AMD Radeon RX 6750 XT. So for 1440p gamers on a very tight budget, this card fills a vital niche, and it's really the only card that does so.

  • Price score: 4/5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Design

  • Intel's Limited Edition reference card is gorgeous
  • Will fit most gaming PC cases easily
Intel Arc A770 Limited Edition Design Specs

Slot size: Dual slot
Length: 11.02 inches | 280mm
Height: 4.53 inches | 115mm
Cooling: Dual fan
Power Connection: 1 x 8-pin and 1 x 6-pin
Video outputs: 3 x DisplayPort 2.0, 1 x HDMI 2.1

The Intel Arc A770 Limited Edition that I'm reviewing is Intel's reference model that is no longer being manufactured, but you can still find some stock online (though at what price is a whole other question). 

Third-party partners include ASRock, Sparkle, and Gunnir. Interestingly, Acer also makes its own version of the A770 (the Acer Predator BiFrost Arc A770), the first time the company has dipped its toe into the discrete graphics card market.

All of these cards will obviously differ in terms of their shrouds, cooling solutions, and overall size, but as far as Intel's Limited Edition card goes, it's one of my favorite graphics cards ever in terms of aesthetics. If it were still easily available, I'd give this design five out of five, hands down, but most purchasers will have to opt for third-party cards which aren't nearly as good-looking, as far as I'm concerned, so I have to dock a point for that.

It's hard to convey from just the photos of the card, but the black finish on the plastic shroud of the card has a lovely textured feel to it. It's not quite velvety, but you know it's different the second you touch it, and it's something that really stands out from every other card I've reviewed.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

The silver trim on the card and the more subtle RGB lighting against a matte black shroud and fans really bring a bit of class compared to the RGB-heavy graphics cards I typically see. The twin fans aren't especially loud (not any more so than other dual-fan cards, at least), and the card feels thinner than most other similar cards I've reviewed and used, whether or not it actually is.

The power connector is an 8-pin and 6-pin combo, so you'll have a pair of cables dangling from the card which may or may not affect the aesthetic of your case, but at least you won't need to worry about a 12VHPWR or 12-pin adapter like you do with Nvidia's RTX 4000-series and 3000-series cards.

You're also getting three DisplayPort 2.0 outputs and an HDMI 2.1 output, which puts it in the same camp as Nvidia's recent GPUs, but it can't match AMD's recent move to DisplayPort 2.1, which will enable faster 8K video output. As it stands, the Intel Arc A770 is limited to 8K@60Hz, just like Nvidia. Will you be doing much 8K gaming on a 16GB card? Absolutely not, but as more 8K monitors arrive next year, it'd be nice to have an 8K desktop running at 165Hz. That's a very speculative prospect at this point, though, so it's probably not anything anyone looking at the Arc A770 needs to be concerned about.

  • Design Score: 4 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Specs & features

  • Good hardware AI cores for better XeSS upscaling
  • Fast memory for better 1440p performance

Intel's Xe HPG architecture inside the Arc A770 introduces a whole other way to arrange the various co-processors that make up a GPU, adding a third, not very easily comparable set of specs to the already head-scratching differences between Nvidia and AMD architectures.

Intel breaks its architecture up into "render slices", each containing four Xe Cores. Each Xe Core in turn packs 128 shaders, a ray tracing unit, and 16 matrix processors (the latter being directly comparable to Nvidia's vaunted tensor cores), which handle graphics upscaling and machine learning workloads. Both the 8GB and 16GB versions of the A770 contain eight render slices, for a total of 4,096 shaders, 32 ray tracing units, and 512 matrix processors.
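
For anyone who wants to double-check those totals, here's a quick back-of-the-envelope sketch in Python using only the per-slice figures quoted above; it's just the arithmetic spelled out, not anything from Intel's own tooling:

    # Xe HPG layout as described above (same for the 8GB and 16GB A770)
    render_slices = 8
    xe_cores_per_slice = 4
    shaders_per_xe_core = 128
    ray_units_per_xe_core = 1
    matrix_units_per_xe_core = 16

    xe_cores = render_slices * xe_cores_per_slice      # 32 Xe Cores in total
    print(xe_cores * shaders_per_xe_core)               # 4096 shaders
    print(xe_cores * ray_units_per_xe_core)             # 32 ray tracing units
    print(xe_cores * matrix_units_per_xe_core)          # 512 matrix processors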

The ACM-G10 GPU in the A770 runs at 2,100MHz base frequency with a 2,400MHz boost frequency, with a slightly faster memory clock speed (2,184MHz) for the 16GB variant than the 8GB variant's 2,000MHz. This leads to an effective memory speed of 16 Gbps for the 8GB card and 17.5 Gbps for the 16GB.

With a 256-bit memory bus, this gives the Arc A770 a much wider lane for high-resolution textures to be processed through, reducing bottlenecks and enabling faster performance when gaming at 1440p and higher resolutions thanks to a 512 GB/s and 559.9 GB/s memory bandwidth for the 8GB and 16GB cards, respectively.
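
If you want to see where those bandwidth numbers come from, here's a minimal sketch of the math in Python; the only assumption beyond the figures quoted above is GDDR6's 8x multiplier between memory clock and effective per-pin data rate:

    # Bandwidth (GB/s) = effective data rate (Gbps) x bus width (bits) / 8
    def bandwidth_gbs(memory_clock_mhz, bus_width_bits=256):
        effective_gbps = memory_clock_mhz * 8 / 1000   # GDDR6 effective rate is 8x the memory clock
        return effective_gbps * bus_width_bits / 8

    print(bandwidth_gbs(2000))   # 8GB card:  512.0 GB/s (16 Gbps effective)
    print(bandwidth_gbs(2184))   # 16GB card: ~559 GB/s (~17.5 Gbps effective)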

All of this does require a good bit of power, though, and the Arc A770 has a TDP of 225W, which is higher than most 1440p cards on the market today.

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

As for the features all this hardware empowers, there's a lot to like here. The matrix cores are leveraged to great effect by Intel's XeSS graphics upscaling tech found in a growing number of games, and that hardware advantage generally lets XeSS outperform AMD's FSR 2.0, which is strictly a software-based upscaler.

XeSS does not have frame generation though, and the matrix processors in the Arc A770 are not nearly as mature as Nvidia's 3rd and 4th generation tensor cores found in the RTX 3000-series and RTX 4000-series, respectively.

The Arc A770 also has AV1 hardware-accelerated encoding support, meaning that streaming videos will look far better than those with only software encoding at the same bitrate, making this a compelling alternative for video creators who don't have the money to invest in one of Nvidia's 4000-series GPUs.

  • Specs & features: 3.5 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Intel Arc A770: Performance

  • Great 1440p performance
  • Intel XeSS even allows for some 4K gaming
  • DirectX 9 and DirectX 10 support lacking, so older games will run poorly
  • Resizable BAR is pretty much a must

At the time of this writing, Intel's Arc A770 has been on the market for about a year, and I have to admit, had I gotten the chance to review this card at launch, I would probably have been as unkind as many other reviewers were.

As it stands though, the Intel Arc A770 fixes many of the issues I found when I reviewed the A750, but some issues still hold this card back somewhat. For starters, if you don't enable Resizable BAR in your BIOS settings, don't expect this card to perform well at all. It's an easy enough fix, but one that is likely to be overlooked, so it's important to know that going in.

Synthetic benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

In synthetic benchmarks, the A770 performed fairly well against the current crop of graphics cards, despite effectively being a last-gen card. It puts up particularly strong competition against the Nvidia RTX 4060 Ti across multiple workloads, and it even beats the 4060 Ti in a couple of tests.

Its Achilles' heel, though, is revealed in the PassMark 3D Graphics test. Whereas 3DMark tests DirectX 11 and DirectX 12 workloads, PassMark's test also runs DirectX 9 and DirectX 10 workloads, and here the Intel Arc A770 simply can't keep up with AMD and Nvidia.

Non-ray traced, non-upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

In non-ray-traced and native-resolution gaming benchmarks, the Intel Arc A770 managed to put up some decent numbers against the competition. At 1080p, the Arc A770 manages an average of 103 fps with an average minimum fps of 54. At 1440p, it averages 78 fps, with an average minimum of 47, and even at 4K, the A770 manages an average of 46 fps, with an average minimum of 27 fps.

Ray-traced gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

Turn on ray tracing, however, and these numbers understandably tank, as they do for just about every card below the RTX 4070 Ti and RX 7900 XT. Still, even here, the A770 does manage an average of 41 fps (with an average minimum of 32 fps) at 1080p with ray tracing enabled, which is technically still playable performance. Once you move up to 1440p and 4K, however, your average title isn't going to be playable at native resolution with ray tracing enabled.

Ray-traced and balanced upscaled gaming benchmark results for the Intel Arc A770

(Image credit: Future / Infogram)

Enter Intel XeSS. When set to "Balanced", XeSS turns out to be a game changer for the A770, netting it an average framerate of 66 fps (with an average minimum of 46 fps) at 1080p, an average of 51 fps (with an average minimum of 38 fps) at 1440p, and an average of 33 fps (average minimum 26 fps) at 4K with ray tracing maxed out.

While the 26 fps average minimum at 4K means it's really not playable at that resolution even with XeSS turned on, settings tweaks or more modest ray tracing could probably bring that up into the low-to-high 30s, making 4K games playable on this card with ray tracing turned on.

That's something the RTX 4060 Ti can't manage thanks to its smaller frame buffer (8GB VRAM), and while the 16GB RTX 4060 Ti could theoretically perform better (I have not tested the 16GB so I cannot say for certain), it still has half the memory bus width of the A770, leading to a much lower bandwidth for larger texture files to pass through.

This creates an inescapable bottleneck that the RTX 4060 Ti's much larger L2 cache can't adequately compensate for, and so takes it out of the running as a 4K card. When tested, very few games managed to maintain playable frame rates even without ray tracing unless you dropped the settings so low as to not make it worth the effort. The A770 16GB, meanwhile, isn't technically a 4K card, but it can still dabble at that resolution with the right settings tweaks and still look reasonably good.
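
To put rough numbers on that bus-width argument, here's the same back-of-the-envelope formula applied to both cards. The RTX 4060 Ti figures (128-bit bus, 18 Gbps GDDR6) are assumptions based on Nvidia's published specs rather than anything tested for this review:

    # Bandwidth (GB/s) = per-pin data rate (Gbps) x bus width (bits) / 8
    def bandwidth_gbs(data_rate_gbps, bus_width_bits):
        return data_rate_gbps * bus_width_bits / 8

    print(bandwidth_gbs(17.5, 256))   # Arc A770 16GB: 560.0 GB/s
    print(bandwidth_gbs(18.0, 128))   # RTX 4060 Ti (assumed specs): 288.0 GB/s

Even allowing for the 4060 Ti's much larger L2 cache, that's roughly half the raw bandwidth available for moving large textures around.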

The final average performance benchmark scores for the Intel Arc A770

(Image credit: Future / Infogram)

All told, then, the Intel Arc A770 turns out to be a surprisingly good graphics card for modern gaming titles that can sometimes even hold its own against the Nvidia RTX 4060 Ti. It can't hold a candle to the RX 7700 XT or RTX 4070, but it was never meant to, and given that those cards cost substantially more than the Arc A770, this is entirely expected.

Its maximum observed power draw of 191.9W is pretty high for the kind of card the A770 is, but it's not the most egregious offender in that regard. All this power meant that keeping it cool was a struggle, with its maximum observed temperature hitting about 74°C.

Among all the cards tested, the Intel Arc A770 sat near the bottom of the list alongside the RX 6700 XT, so the picture for this card might have been very different had it launched three years ago, when it would have competed with the RTX 3000-series and RX 6000-series exclusively. In the end, this card performs like a last-gen card, because it is.

Despite that, it still manages to be a fantastic value on the market right now given its low MSRP and fairly solid performance, rivaling the RTX 4060 Ti on the numbers. In reality though, with this card selling for significantly less than its MSRP, it is inarguably the best value among midrange cards right now, and it's not even close.

  • Performance score: 3.5 / 5

An Intel Arc A770 LE graphics card on a table with a pink desk mat

(Image credit: Future / John Loeffler)

Should you buy the Intel Arc A770?

Buy the Intel Arc A770 if...

Don't buy it if...

Also Consider

If my Intel Arc A770 review has you considering other options, here are two more graphics cards worth a look.

How I tested the Intel Arc A770

  • I spent several days benchmarking the card, with an additional week using it as my primary GPU
  • I ran our standard battery of synthetic and gaming benchmarks 
Test Bench

These are the specs for the test system used for this review:
CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench

I spent about two weeks with the Intel Arc A770 in total, with a little over half that time using it as my main GPU on my personal PC. I used it for gaming, content creation, and other general-purpose use with varying demands on the card.

I focused mostly on synthetic and gaming benchmarks since this card is overwhelmingly a gaming graphics card. Though it does have some video content creation potential, it's not enough to dethrone Nvidia's 4000-series GPUs, so it isn't a viable rival in that sense and wasn't tested as such.

I've been reviewing computer hardware for years now, with an extensive computer science background as well, so I know how graphics cards like this should perform at this tier.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

  • First reviewed October 2023
Asus Tuf RX 7900 XTX Gaming OC review
2:47 am | August 8, 2023

Asus Tuf RX 7900 XTX Gaming OC: 30 second review

The Asus Tuf RX 7900 XTX Gaming OC is billed as being "optimized inside and out for lower temps and durability", so this partner card features a massive heatsink and a quad-slot design made for overclocking, provided case space is no object.

Despite its slightly higher price and bigger overall size, I am comfortable considering this variant as not only one of the best 4K graphics cards but also one of the best graphics cards currently available. 

It's not as much of a slam dunk as the AMD Radeon RX 7900 XTX reference card, which delivers phenomenal price-to-performance in a sleeker package. However, for those wanting to overclock and really push RDNA 3 to its limits, the Asus Tuf RX 7900 XTX Gaming OC excels in all the right places.

That shouldn't be entirely surprising considering it is built on the foundations of the best AMD graphics card. Given the extra bells and whistles, you're going to pay slightly more than the base AMD MSRP for the 7900 XTX, with the MSRP for Asus's variant running $1,099 / £1,249.98 /  AU$1,439. But, if you've got a sizable case and a little extra to spend, Asus's latest offering is something that you should seriously consider.

Asus Tuf RX 7900 XTX Gaming OC: Price & Availability

  • How much is it? $1,099 / £1,249.98 /  AU$1,439
  • When was it released? November 11, 2022
  • Where can you get it? The US, the UK, Australia, and other regions

The Asus Tuf RX 7900 XTX Gaming OC was released back in November 2022, around a month after AMD released its RX 7900 XTX reference card. While the RDNA 3 flagship was priced at $999 / £899 / AU$1,789 at launch, Asus' variant ran a little higher in regions such as the US and the UK at the end of last year; however, that has softened in the months since.

It's been possible to find the Tuf RX 7900 XTX Gaming OC available in the US and the UK matching the MSRP of the reference model at retailers such as Amazon and Ebuyer. In the UK especially, Asus' model is actually one of the cheapest of the competition, but in the US, you're still paying around $100 more when not discounted. 

However, considering that this is a 24GB GDDR6 graphics card built for 4K, and even 8K, gaming, the price point still undercuts the Nvidia RTX 4090 significantly. Still, if you're on a tighter budget, then you may be better off with AMD's reference card if you can find it.

  • Value: 4 / 5

Asus Tuf RX 7900 XTX Gaming OC: Design & Features

Asus TUF RX 7900 XTX Gaming OC

(Image credit: Future)
  • One of the largest XTX models available
  • Excellent heatsink for overclocking 
  • 3x 8-pin connectors 
Asus Tuf RX 7900 XTX Gaming OC specs

GPU: Navi 31
Stream Processors: 6,144
AI Accelerators: 192
Ray Accelerators: 96
Power draw (TGP): 355W
Base clock: 1,895 MHz
Boost clock: 2,615 MHz
VRAM: 24GB GDDR6
Bandwidth: 960GB/s
Bus interface: PCIe 4.0 x16
Output: 1x HDMI 2.1; 3x DisplayPort 2.1
Power connector: 3x 8-pin

Without question, the Asus Tuf RX 7900 XTX Gaming OC is one of, if not the, largest of the current line-up of flagship RDNA 3 graphics cards. Where AMD won favor with its base model (a dual-slot GPU that only used 2x 8-pin connectors), that's no longer the case with this partner card.

That's because Asus' version uses 3x 8-pin connectors and is actually a quad-slot card thanks to its sizable cooler, which makes it more comparable in size to Nvidia's RTX 4090 than to the original GPU.

The heatsink itself is truly stellar, though. It comes complete with a triple-fan setup armed with a vented exoskeleton to really keep those core temps down. It also has RGB lighting with a front badge on the side, which looks nice if you've got a see-through case such as my NZXT H9 Flow (an easy contender for one of the best PC cases available, by the way).

That added bulk and height does have an impact, though. Even with a sizable mid-tower like mine, one of the biggest available in terms of raw building space, the power connectors nearly bulge up against the tempered glass, which surprised me given the reference card's far leaner foundations.

This supercharged heatsink isn't for nothing. As an overclocked card, the Asus Tuf RX 7900 XTX Gaming OC features a base clock speed of 1,895 MHz, around a 2% increase over the reference card. It's the fastest base-clocked XTX on the market, as the vast majority of the competition, from the likes of ASRock, PowerColor, Gigabyte, Sapphire, XFX, and Biostar, doesn't come close. That may not sound too impressive, but it's just the start. There's an OC mode of up to a 2,615 MHz boost clock, and you're also getting a Game Clock of 2,395 MHz and a Shader Clock of 2,395 MHz for a 6% increase, and that's just out of the box. If you're someone who really wants to push what the latest RDNA 3 frontrunner can do, then you'll be in good hands here.

Speaking of the overclocking functionality itself, this is where Asus' GPU Tweak III software comes in, making it easy for you to push the card as you see fit. The free program, tailor-made for the company's video cards, has profiles for "OC Mode" and "Silent Mode" as well as user-activated sliders to control the GPU voltage, boost and memory clock, and the fans' speed. 

  • Design and Features: 4.5 / 5

Asus Tuf RX 7900 XTX Gaming OC: Performance

Asus TUF RX 7900 XTX Gaming OC in case

(Image credit: Future)

We've gone into what the 7900 XTX can do extensively in our reference card review, so you can check that out for more detailed data on frame rates and benchmark scores. What I will say here is that this is some of the most consistent 4K performance I've seen from a leading video card. The Asus Tuf RX 7900 XTX Gaming OC made short work of our benchmark titles such as Total War: Warhammer III, Cyberpunk 2077, and Metro Exodus at native 2160p resolution, with framerates outclassing the previous 24GB leader, the RTX 3090, and the current Ada high-end RTX 4080.

Realistically, you're looking at around a 2-5fps boost over the original AMD Radeon RX 7900 XTX when everything is dialed up, which could help with some of the more demanding titles on the market. The crux here is Team Red's choice to go with GDDR6 instead of the newer GDDR6X memory, which offers around a 43% higher data rate; it means you won't quite be on the bleeding edge, as you would be with the RTX 4090. And considering that GDDR6X also runs hotter inside a card, the extra heatsink here on the Tuf RX 7900 XTX Gaming OC seems like overkill.

The card's 24GB of VRAM is going to give you a significant amount of overhead for the next few years when gaming in 4K, though, even if it's slower. We're now at a point where many titles maxed out at 2160p require serious amounts of memory, as is the case with Diablo 4, which needs over 16GB of memory for Ultra Textures at that resolution. You should have a decent amount of headroom to keep maxing games out, with the overclocking potential to squeeze out those precious few extra frames, too.

  • Performance: 4.5 / 5

Asus TUF RX 7900 XTX Gaming OC

(Image credit: Future)

Should you buy the Asus Tuf RX 7900 XTX Gaming OC?

Buy it if... 

Don't buy it if...

Dell Latitude 9440 Business Laptop Review
12:45 pm | July 18, 2023

The Latitude 9440 2-in-1 from Dell is an outstanding laptop. It might just be the best business laptop available right now. Everything I would use it for as a business user, the Latitude 9440 handles with flying colors. The computer is beautiful, the speeds are quick, and the overall experience is excellent.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

Unboxing and First Impressions

Unboxing the laptop was nothing exciting until I pulled the wrapper off the computer. That is when I first felt the matte finish on the 9440 2-in-1, and let me say, I love it. I am a massive fan of matte black and dark greys, so this laptop's finish is a dream. While signing in, I noticed the keyboard and the touchpad. The touchpad, first of all, feels enormous. After doing some research, I found out the touchpad is, in fact, over 15% larger than the previous model's. Second, the keyboard immediately felt comfortable, which says a lot about a keyboard. It felt natural to type on from the first word I wrote.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

The last thing I noticed right away was the shortage of ports. If you have fully moved over to the new USB-C standard with your devices, or if you plug into one of the best Thunderbolt docks at your desk, you're golden. If not, you'll run into the same problem MacBook Air users have where no legacy ports are available, so you'll need to resort to an adapter, dongle, or dock.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

Design and Build Quality

Specs

*as tested

Dimensions: 12.20 x 8.46 x 0.64in
CPU: 13th-generation Intel Core processors
GPU: Intel Iris Xe Graphics
RAM: Up to 64GB
Display: 14-inch, 16:10
Resolution: 2560 x 1600
Storage: Up to 2TB
Weight: 3.38 lb

The Latitude 9440 2-in-1 from Dell has a screen that measures 14 inches but feels gigantic. This phenomenon is partially because of the high-resolution screen and partially because of the near bezel-less borders.

The touchpad, as mentioned, is significantly larger than the last generation of Latitude laptops. While not entirely necessary because this is a 2-in-1 laptop and has a full touchscreen, the larger trackpad is greatly appreciated when you need to get things done with a trackpad like a standard laptop.

The keyboard above the trackpad is quite comfortable to type on. It has the same matte finish as the laptop's case while remaining very easy to use. While writing this review, my fingers don't feel any sense of discomfort or unfamiliarity, which tells me the keys are spaced out well.

As mentioned, this laptop is almost entirely made of a matte dark grey material. Around the computer's edges, a band of slightly shiny material helps it pop visually and gives this computer a bit of a fancy look.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

There are not a ton of ports on this laptop; outside of the three Thunderbolt/USB-C ports, there is only a single headphone jack. I have fully moved over to the USB-C and Thunderbolt life, so I keep an adapter in my laptop backpack at all times just in case I need it, and my desk setups have Thunderbolt docks. If you don't have a system like that, you should pick up a Thunderbolt dock unless all your peripherals are USB-C or non-existent.

The last thing about this laptop is that the 16:10 aspect ratio is warmly welcomed. I love having more vertical screen real estate, which boosts business productivity quite a bit.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

In Use

Using this laptop for the last few weeks has been fantastic. I love this laptop. It hits all my marks in what I would want in a professional business laptop, and it looks good while doing it. Dell's Latitude line has been high-ranking on our lists for quite a while, and with good reason.

The 14-inch screen, as mentioned, feels massive. I can fit plenty of reference documents, websites, productivity tool windows, and so on without feeling like I want more, even while on the go.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

Whenever I grab this laptop, I love feeling the matte texture on my fingers. It's soft yet rugged while feeling premium. It's hard to describe in words, but it's incredible. I've already mentioned that the keyboard and touchpad are both excellent. The touchpad has integrated collaboration features which sadly only work for Zoom. However, when I have been able to use them, having soft buttons pop out of a touchpad feels like something out of a movie.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

One more remarkably impressive part of this laptop is that it can actively be connected to two networks at once and switch between them as needed to keep the strongest and fastest connection. This feature is impressive, especially for power business users who take vital calls and can't risk losing connection. The way the business world is going, dropping a call is as good as losing a sale, contract, or business sometimes. So, being constantly connected to two networks with one as an always-ready, redundant network is incredible.

Dell Latitude 9440

(Image credit: Collin Probst // Future)

Final Verdict

All in all, this laptop is nearly perfect. If the price were lower, it would be perfect. However, some elements make it worth the cost. Regardless, this is an astounding laptop with great features, high build quality, and one of my favorite finishes in a computer to date. That's why I will happily still give this laptop a near-perfect rating.

Origin Chronos V3 review: big performance, small package
9:02 pm | July 13, 2023

Origin Chronos V3: One-minute review

Going as far back as 2014, the Origin Chronos line of gaming desktops has earned a reputation for providing incredible performance in a relatively compact shell. Over the years, the design has gotten smaller as chipsets have become more and more powerful, and the same goes for the Origin Chronos V3.

Featuring a mini-ITX case setup that's 11 inches tall and 7 inches wide, the gaming desktop is small yet still supports dozens of configuration combinations across various CPUs, GPUs, motherboards, RAM, and SSD storage.

It doesn't matter which side of the Intel, Nvidia, and AMD fence one stands on, the number of personalization options is remarkable. There's even plenty of ventilation through its steel mesh panels, which also accommodate up to six 120mm fans. This means high-end gaming performance doesn't become uncomfortably noisy when the system is pushed to the max.

However, this impressive package does come with some issues. Regardless of which configuration one finds themselves choosing, the Chronos V3 is going to cost a pretty penny. Starting at $1,501 for a build with an AMD Ryzen 5 7600X CPU and no discrete graphics, you can customize your Chronos V3 to the tune of more than $5,500. Of course, the max configuration is a beast of a machine, so the price is absolutely in line with what you're getting.

Meanwhile, the smaller design means port access located at the top instead of the rear may be awkward. Most importantly, upgrading various parts over time may be problematic due to its mini-ITX case. These restrictions won't make the Chronos V3 any less desirable, but they are something potential buyers should take into consideration given how much money they're likely to drop on this bad boy.

Origin Chronos V3: Price & availability

A Origin Chronos V3 gaming PC shipping crate

Yes, they ship this thing out in a crate like it's the Ark of the Covenant (if you're into that kind of thing) (Image credit: Future / John Loeffler)
  • How much does it cost? Depending on the configuration, expect to spend between $1,785 and $5,695  
  •  When is it available? It is available now in the US only 
  •  Where can you get it? From Origin’s online store 

Currently only available stateside through Origin's online store, the Chronos V3 gaming desktop comes in a variety of spec configurations split between white and black colorways.

Our review setup runs about $3,050 (about £2,830/AU$4,560) and came packed with an Intel Core i7-13700K, Nvidia GeForce RTX 4080, 32 GB of DDR5 RAM, and a 1TB SSD with an additional 2 TB SSD storage. 

The Chronos V3 is going to be an all around expensive purchase regardless of what options you go for, but it is still reasonable on the lower end and not out of step with the best gaming PCs from manufacturers like Dell or Lenovo. 

Still, if you're looking for something rather more affordable, do check out our best budget gaming PC page for alternatives.

Though the front of the case comes with two USB-A ports, one USB-C port, and a 3.5 mm headset jack across all configurations, the rest of the port selection may differ due to the number of motherboards available. This review configuration used an MSI MPG Z790I Edge Wifi, which granted four USB-A ports, a single USB-C port, Realtek 7.1 audio out, and a 2.5Gb LAN port, in addition to Intel Wi-Fi 6E and Bluetooth 5.3.

At the cheaper end of the spectrum, users can get a viable build with an AMD Ryzen 5 7600X CPU, Nvidia RTX 4060 GPU, 32GB DDR5 RAM and a 500GB SSD. That’ll cost around $1,785 (about £1,425, AU$2,675). 

On the high end, for around $5,521 (about £4,420/AU$8,280), individuals can blow their specs up to a 24-core Intel Core i9-13900KS, an Nvidia GeForce RTX 4080, 64GB of DDR5 RAM, and 8TB of PCIe SSD storage alongside an extra 8TB SATA SSD and a bay-mounted, low-profile Blu-ray writer, because why the hell not?

  • Value score: 3.5 / 5

Origin Chronos V3: Specs

A Origin Chronos V3 gaming PC on a desk

(Image credit: Future / John Loeffler)

The Origin Chronos V3 currently comes in any number of configurations, letting the number of potential builds run well over 100. 

Origin Chronos V3: Design

A Origin Chronos V3 gaming PC on a desk

(Image credit: Future / John Loeffler)
  • Has a very small footprint  
  • Ports are arranged at the top of the unit near a ventilation fan  
  • Design isn’t completely future proof

The Origin Chronos V3's design is rather svelte, meaning it won't take up much space, and it may remind many of the Xbox Series X. The case alone weighs around 5 lbs, and the additional components shouldn't make the gaming desktop a heavy lift. Moving the Chronos V3 around didn't take much effort at all.

When it comes to aesthetics, the desktop looks great while offering a premium design. It doesn’t matter if buyers go with the white or black colorway either as the customizable RGB lighting makes it visually pop. Despite the small design and power it contains, there’s plenty of ventilation through the steel mesh panels that can easily be removed for cleaning eventual dust build-up. 

A Origin Chronos V3 gaming PC on a desk

(Image credit: Future / John Loeffler)

Due to the design of the Chronos V3, ports are placed at the top instead of the rear. To keep things cleaner, they are accessible through a removable panel with an opening at the rear for cable management.

Some may have an issue with the ports being placed at the top instead of the back, as well as being so close to a ventilation fan. As mentioned previously, there are various motherboard options which will lead to different port configurations, but our review setup had enough ports at the top, alongside the two additional USB-A ports and the single USB-C port near the power button on the front panel's lower portion.

The biggest issue with the design will be upgradability. Replacing the CPU, RAM, and storage won't be much of an issue, but the cramped space is going to make upgrading the GPU and motherboard in the future a problem. At the very least, Origin does offer the option to send the gaming desktop back to have them upgraded if it becomes too much of a hassle.

  • Design score: 4 / 5

Origin Chronos V3: Performance

A Origin Chronos V3 gaming PC on a desk

(Image credit: Future / John Loeffler)
  • Our review configuration provided respectable 1440p gaming at max settings  
  • Quiet fans despite the performance specs and small design
  • Lack of flagship GPU options limiting native 4K performance  

Considering the smaller case design of the Origin Chronos V3, there's some serious horsepower packed in. During testing, our Intel Core i7-13700K and RTX 4080 combo provided great native 1440p performance at high frame rates. Games including Cyberpunk 2077, Star Wars Jedi: Survivor, Need for Speed Unbound, and Diablo IV ran buttery smooth without issue.

At those settings, there wasn't a game the Chronos V3 couldn't handle, even with the addition of ray tracing. Our standard tests in Total War: Warhammer III and Dirt 5 produced frame rates that all went above 200 fps at Ultra settings. If 1440p gameplay is all one is concerned about, this gaming desktop is more than enough.

Just be mindful that the case will limit which GPUs the Chronos V3 can hold, and there aren't any options to preconfigure the PC with AMD and Nvidia's flagship GPUs, the AMD Radeon RX 7900 XTX and Nvidia GeForce RTX 4090. This means that native 4K performance will be a problem depending on the game, but thankfully, upscaling measures like Nvidia DLSS or AMD FSR can deliver 4K resolutions at high frame rates with some of the GPU options for the build. Those can come with issues like loss of visual details and input delay, though, so it's not a perfect substitute.

Individuals who want native 4K or even 8K performance may want to stay clear of this particular gaming desktop as the case understandably prevents bigger GPU sizes. One thing that is consistent is that fan cooling doesn’t get very loud during intense performance.

We also found the Chronos V3 to be a great workstation for creative tasks. Our PugetBench tests for Adobe Photoshop and Premiere Pro delivered fantastic results as well. Running Photoshop with high-resolution photo files and multiple layers wasn't a problem at all, while 4K video exports were relatively snappy.

  • Performance score: 4.5/ 5

Should you buy the Origin Chronos V3?

A Origin Chronos V3 gaming PC on a desk

(Image credit: Future / John Loeffler)

Buy it if...

Don't buy it if...

How I tested the Origin Chronos V3

I spent two weeks with the Origin Chronos V3, playing the latest PC games, using it for general computing tasks, and running various creative apps like Adobe Photoshop.

Pushing the compact gaming desktop to its limits, I played games including Cyberpunk 2077, Need for Speed Unbound, Forza Horizon 5, and Star Wars Jedi: Survivor.

More general computing use included using Google Chrome for various tasks ranging from Google Docs to various social media platforms. Outside of the PugetBench tests, we also used Adobe Photoshop and Premiere Pro.

Read more about how we test

First reviewed July 2023

iQoo 11 review: a speedy phone that’s hard to find
3:00 pm | June 7, 2023

iQoo 11: Two-minute review

The iQoo 11 probably won't be launching in the west, but as one of the first ever phones to run on the Snapdragon 8 Gen 2 chipset, it's worth acknowledging, even now that some more widely available handsets use the chipset too.

While this Vivo sub-brand sells largely to the Asian market, it’s clearly mulling over a push into new territories at some point. There aren’t many Chinese brands that bother to supply their pre-launch test handsets with a UK power adapter, but iQoo did, which has to signify something.

Given the strength of the iQoo 11 package, we would welcome it (or a future handset) to our shores. Maybe the brand could take up OnePlus’s former role as market disruptor and flagship killer-in-chief.

An iQoo 11 from the front

(Image credit: TechRadar)

iQoo’s customary team-up with BMW sees a fairly generic design lifted by a white vegan leather back with a three-stripe decal. There’s a black option, but this themed one is way more fun.

The headline feature here, however, is that cutting-edge chipset, which instantly makes the iQoo 11 one of the most powerful smartphones on the market. The iQoo brand presents itself as gamer-friendly, and its latest phone certainly flies through demanding games like Wreckfest and Genshin Impact on maxed-out settings.

This gaming-friendly status is further enhanced by a 6.78-inch AMOLED display that’s similarly cutting-edge. Besides using the very latest Samsung E6 panel, it sports a rare combination of QHD sharpness and a 144Hz refresh rate, as well as getting really bright.

While the iQoo 11 won’t be joining the iPhone 14 Pro, the Samsung Galaxy S23 Ultra, and the Google Pixel 7 Pro at the top of the camera phone tree, it does a creditable job of turning out bright, balanced shots across its three cameras. It could use a little work when the light drops, though.

Throw in a full day of battery life from its 5,000mAh cell and rapid 120W wired charging (but not wireless, alas), and you have a fine entry-level flagship phone.

iQoo 11 review: price and availability

An iQoo 11 from the back, in someone's hand

(Image credit: TechRadar)
  • Arrived December 8, 2022
  • No western availability
  • Price works out to around $700 / £580 / AU$1,050

The iQoo 11 hit Indonesia and Malaysia on December 8, 2022, and Thailand on December 15, followed by India on January 13, 2023.

There are no plans for the iQoo 11 to hit the US, UK, or Australia, which is a real shame. With an Indonesian launch price of IDR10,999,000 working out to around $700 / £580 / AU$1,050, it could very well have undercut – or at least provided genuine competition for – the Pixel 7 and the OnePlus 10T.

We’re hopeful of a wider rollout for the brand in future. The company supplied a UK power brick with our test iQoo 11, so it’s clearly thinking about branching out.

  • Value score: 4.5 / 5

iQoo 11 review: specs

An iQoo 11 from the front, in someone's hand

(Image credit: TechRadar)

iQoo 11 review: design

An iQoo 11 from the back, in someone's hand

(Image credit: TechRadar)
  • Typical design, lifted by vegan leather/three stripe finish
  • No IP rating
  • Camera module scratches easily

The iQoo 11 has a pretty generic Android design, with a flat display, a subtly curved back, and a metal frame.

However, it’s rescued from bland uniformity by the Legend model we’ve been sent. This sports a mixture of fiberglass and vegan leather on the rear, rendered in brilliant white, and with a colorful triple stripe running down the entire length of the phone.

The latter is courtesy of a longstanding brand partnership with BMW M Motorsport. It’s just as effective a design flourish here as it was on the likes of the iQoo 7.

If you opt for the Alpha edition, you’ll receive an altogether more sober black shade and a glass back. iQoo calls this a “classic, premium aesthetic”, but others might call it boring.

An iQOO 11 from the side, from the back, and the bottom half from the back

(Image credit: TechRadar)

One negative point we did notice towards the end of our time with the phone was that the black paint coating the thin metal frame surrounding the camera module had begun to scratch off along the bottom edge. Presumably this was where the phone made contact with whatever surface it was lying on, but it started to give the phone a somewhat scruffy edge after just a week or two of usage.

This isn’t a small phone at 8.7mm thick and 205g, but nor is it distractingly hefty. We found it very easy to live with, and that vegan leather has proved both grippy and mercifully non-freezing-to-the-touch when taken out on cold days.

The lack of an IP rating – meaning no official water resistance – is a bit of a downer, and one sign that we’re not dealing with an out and out flagship phone here. So too is a chin bezel that’s slightly thicker than the forehead, which is always a dead giveaway that a phone isn’t gunning for the elite league, regardless of what its spec sheet might say.

There’s stereo sound provided by a pair of speakers, but one is positioned on the bottom edge of the phone, and proves a little too easy to cover during landscape gaming. This is a common concession on phones of all price ranges, but when a phone claims to be geared towards gamers it’s worth calling out.

  • Design score: 4 / 5

iQoo 11 review: display

An iQoo 11 from the front

(Image credit: TechRadar)
  • 6.78-inch AMOLED screen
  • Next-gen Samsung E6 panel 
  • QHD+ and 144Hz in one package

While the iQoo 11’s Snapdragon 8 Gen 2 chip is getting most of the headlines, its display is similarly cutting-edge and just as worthy of attention.

Where most flagship Android phones in 2022 featured Samsung’s E5 panel, this phone switched up to the E6.

The baseline stats are strong. It’s a 6.78-inch AMOLED with a QHD+ (1440 x 3200) resolution, though you’ll need to activate that in the settings menu. We experienced some issues with font sizing following this switch, but that’s an issue with iQoo’s software.

Two other specs stand out here. One is a higher-than-usual 144Hz maximum refresh rate, though again, you’ll need to crank this up in the settings. It really is very responsive indeed.

An iQoo 11 from the front, in someone's hand

(Image credit: TechRadar)

We’ve seen 144Hz (and higher) refresh rates before, but never in conjunction with a QHD resolution.

The other standout spec is a peak brightness of 1,800 nits. That's beyond even the mighty Samsung Galaxy S23 Ultra, and just a little shy of the iPhone 14 Pro Max.

One other gaming-focused feature is a pressure-sensitive screen, which can be mapped to controls in certain games. Pressing both sides firmly in landscape serves as a shortcut to booting up the phone’s Game Space gaming UI, which is a nice touch.

We haven’t seen too much of this pressure-sensing technology since Apple removed it from its iPhones, so it’s good to see it implemented here – even if it’s not as deeply integrated into the UI as Apple’s 3D Touch was to iOS.

  • Display score: 4.5 / 5

iQoo 11 review: software

An iQoo 11 from the front, in someone's hand

(Image credit: TechRadar)
  • Funtouch 13 is busy and full of bloat for local markets
  • Somewhat buggy UI
  • Only 2 years of Android updates

Software is arguably the weak point with the iQoo 11, with Funtouch 13 proving to be a rather busy custom UI layered over Android 13.

It’s worth mentioning that there are mitigating circumstances here. As discussed, this is a phone that’s intended for the Indonesian market, which explains why it comes laden with so much bloatware, including local apps like Lazada and Viu.

Even setting the matter of pre-installed apps aside, though, Funtouch 13 feels somewhat buggy and unfinished. There’s the UI’s apparent inability to adjust to bumping up the display resolution to full QHD+, resulting in comically small text in the Messages app and the clock widget. Adjusting the system font size didn’t seem to help here.

An iQoo 11 from the front, in someone's hand

(Image credit: TechRadar)

Then there’s the fact that WhatsApp notifications continued to break through for us when the phone was in Do Not Disturb mode, which spoiled a couple of attempted weekend lie-ins.

All of this can be fixed in future software updates, of course. However, that just brings into focus iQoo’s two-year Android update promise, which is looking rather stingy and outdated compared to many other high-end handsets.

  • Software score: 2.5 / 5

iQoo 11 review: cameras

The camera block on an iQoo 11

(Image credit: TechRadar)
  • Same 50MP main camera sensor as Galaxy S22
  • 13MP telephoto, 8MP ultra-wide
  • Slightly artificial but even tone across the three cameras

If the iQoo 11 is mixing it with the big boys in terms of power and display technology, then it steps back into the second tier with its camera offering.

That still makes it a decent photography tool, however, and it does some things we like a lot. We particularly appreciate the provision of a dedicated telephoto camera to accompany the wide and ultra-wide. That’s often one of the first features on the chopping block when putting together a more affordable flagship.

This is a nicely balanced setup too. The main camera uses the same 50MP sensor as you’ll find in the Samsung Galaxy S22 and Galaxy S22 Plus, as well as a number of previous iQoo models. It’s not exactly a cutting-edge component, but it’s a decent-sized 1/1.57" sensor, and it produces punchy shots in good lighting.

This is accompanied by an 8MP ultra-wide and a 13MP telephoto sensor, both also from Samsung. These supporting sensors aren’t up to the standard of the main sensor in terms of color depth, detail, or dynamic range, but they’re perfectly serviceable – especially that telephoto.

iQoo 11 camera samples

Notes on our 18 camera samples (Image credit: TechRadar):
  • The overall tone can be rather cool.
  • The telephoto does a good job matching the main sensor’s tone.
  • The ultrawide lacks detail, but again matches the tone of the others.
  • Not much in the way of noise or artifacts here.
  • Again, the telephoto matches up well.
  • Edge distortion, but a consistent tone.
  • Indoor shots can look a little murky.
  • A reasonably sharp, if dark, food shot.
  • Shots can look a little washed out.
  • The dedicated telephoto is way better than cropping in.
  • The ultrawide struggles for detail.
  • Night mode isn’t up there with the best.
  • Good low light shots are possible if you keep movement to a minimum.
  • A shot showing the iQoo 11’s performance in low indoor lighting.
  • Showing the main camera’s cool tone and exaggerated greens.
  • The 2x telephoto is a solid performer.
  • Selfies are fine, once you deactivate beautification.
  • The camera is least impressive in middling/indoor lighting situations.

One of the best things about this camera setup is that the tone of the shots stays relatively consistent across all three sensors. Take three shots of the same scene, one with each camera, and they all look more or less of a piece.

True, that shared tone can be a little too cool and icy for our liking, with slightly punched-up greens and a hint of overexposure. But that’s a color science choice that some will be more fond of than others, especially if you’re mainly in the business of sharing your shots on social media.

What’s more, there’s a toggle in the main camera UI that switches to a more natural, restrained look should you prefer it, which we did. It’s a shame this isn’t the default selection, but at least the camera app remembers your choice once you’ve made it.

Low light performance is decent, if not among the best. Taking pictures of static scenes with a suitably steady hand yielded some clear results, but we also shot the odd dud that didn’t quite lock on properly, or which yielded excessive noise, while any movement in the frame tended to come out blurred.

Shots in artificially lit indoor environments, and those in that murky zone between light and dark, could be a little hit and miss on the focusing front, and sometimes looked a little flat and washed out. This is a camera that rewards a steady hand and a little patience once the light drops.

The 16MP front camera does a reasonable job with selfie skin tones, though you’ll want to turn off the default beautifying effect, which smooshes and smudges facial textures in that disconcerting way that certain manufacturers seem to favor.

Overall, this is a solid camera setup, and none of the traits we mention are egregious given what appears to be competitive pricing for the device. They merely illustrate that the iQoo 11 isn’t in the conversation with the genuinely top-tier camera phones in the way that it is in other departments. With the Google Pixel 7 and Pixel 6a both available for very reasonable prices, that’s worth bearing in mind.

  • Camera score: 3.5 / 5

iQoo 11 review: performance

An iQoo 11 from the front, in someone's hand

(Image credit: TechRadar)
  • One of the first Snapdragon 8 Gen 2 phones
  • Extremely impressive CPU and GPU performance
  • Stays cool and fast under gaming load

While iQoo doesn’t make gaming phones in the strictest sense, gaming prowess is undoubtedly a core component of the brand. The iQoo 11 is no different, standing as one of the very first phones to ship with the flagship Snapdragon 8 Gen 2 chip.

The resulting benchmarks are suitably impressive, with an average Geekbench 5 single-core score of 1,462 and a multi-core score of 4,855. That beats a Snapdragon 8 Gen 1 phone like the Samsung Galaxy S22 by around 200 points in single-core and a whopping 1,500 points in multi-core.

The Asus Zenfone 9 with its Snapdragon 8 Plus Gen 1 chipset gets a little closer, but still falls short by around 100 points single-core and 500 points multi-core.
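
For a sense of scale, here’s a quick Python sketch that converts those point gaps into rough percentage uplifts. The rival scores below are back-calculated from the gaps quoted above, so treat them as approximations rather than our measured figures.

    # Rough Geekbench 5 comparison. The iQoo 11 numbers are the averages quoted
    # above; the rival scores are approximated from the stated point gaps.
    iqoo_11 = {"single": 1462, "multi": 4855}
    rivals = {
        "Galaxy S22 (8 Gen 1)": {"single": 1262, "multi": 3355},
        "Zenfone 9 (8+ Gen 1)": {"single": 1362, "multi": 4355},
    }

    for name, scores in rivals.items():
        for core in ("single", "multi"):
            uplift = (iqoo_11[core] - scores[core]) / scores[core] * 100
            print(f"{name}: iQoo 11 is ~{uplift:.0f}% faster ({core}-core)")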

On the GPU front, an average Wild Life Extreme score of around 3,750 (with an average frame rate of 22.50fps) is very strong indeed, outgunning even the mighty iPhone 14 Pro. Just as impressive is the fact that this level of graphical performance remains relatively stable over time in the extended Wild Life Extreme Stress Test, which runs the same high-intensity GPU workout 20 times in a row.

There’s a slight dip for the final few loops, but not by much, indicating that the iQoo 11 has its thermals in order. That can be attributed to the efficiency of the Snapdragon 8 Gen 2, but also to a multi-layer vapor chamber that iQoo has implemented.
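
If you want to quantify that kind of sustained-load behaviour yourself, a simple approach is to compare the worst loop against the best one. Here’s a minimal Python sketch; the loop scores are illustrative numbers mirroring the pattern we describe, not our raw data.

    # Illustrative per-loop scores for a 20-loop GPU stress test, made up to
    # mirror the pattern above: stable results with a small dip at the end.
    loop_scores = [3750] * 16 + [3700, 3680, 3660, 3640]

    # Stability expressed as worst loop / best loop; closer to 100% means
    # less thermal throttling over the run.
    stability = min(loop_scores) / max(loop_scores) * 100
    print(f"Stability: {stability:.1f}% across {len(loop_scores)} loops")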

In practical terms, we were able to play Genshin Impact and console-quality racer Wreckfest on maxed out graphical settings, with performance remaining silky smooth. We didn’t observe any serious thermal build-up or throttling over the space of a 20-minute gaming session, either.

Another performance point we should note is the inclusion of iQoo’s V2 chip, which apparently inserts frames to keep gameplay nice and smooth, even when the game itself doesn’t support higher frame rates of 90 or 120fps (as most games don’t). All in all, the iQoo 11’s level of performance is extremely impressive.

  • Performance score: 5 / 5

iQoo 11 review: battery

The bottom edge of an iQoo 11

(Image credit: TechRadar)
  • 5,000mAh battery
  • Comfortable all-day battery life, even with heavy usage
  • Rapid 120W wired charging but no wireless

The iQoo 11 has been fitted out with a 5,000mAh battery. That’s not an uncommon sight in an Android flagship, but it’s reassuring nonetheless, especially when combined with that efficient Snapdragon 8 Gen 2 chipset.

We were able to get through a full 15-hour day of heavy usage (just shy of 6 hours of screen-on time) with the screen set to QHD and 144Hz, and the iQoo 11 still had around 30% left in the tank.

On days with more moderate usage, it wasn’t uncommon to be left with around half a tank. That’s a very solid showing.

Charging is also extremely rapid, with a 120W charging brick bundled in. We found that a 15-minute charge would get the phone from empty to 74%, while it hit 100% in around 25 minutes.
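
As a back-of-the-envelope check on those numbers, the sketch below estimates the average charging power implied by a 0-74% top-up in 15 minutes. The ~3.85V nominal cell voltage is our assumption (typical for lithium-ion phone batteries), not a published iQoo spec, and the sum ignores charging losses.

    # Rough average charging power implied by the observed 15-minute charge.
    capacity_mah = 5000        # iQoo 11 battery capacity
    nominal_voltage = 3.85     # assumed typical Li-ion nominal voltage (V)
    charged_fraction = 0.74    # 0% -> 74% in 15 minutes
    hours = 15 / 60

    energy_wh = capacity_mah / 1000 * nominal_voltage    # ~19.3 Wh total
    avg_power_w = energy_wh * charged_fraction / hours   # ~57 W average
    print(f"Average charging power: ~{avg_power_w:.0f} W (peak rating is 120 W)")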

The only real disappointment here is that the iQoo 11 doesn’t support wireless charging. This isn’t a given at less-than-flagship prices, of course, but the Pixel 7 and Nothing Phone 1 show that such an inclusion isn’t outside the realms of possibility.

  • Battery score: 4 / 5

Should you buy the iQoo 11?

Buy it if...

You want top power for a reasonable price
The iQoo 11 packs the latest Snapdragon 8 Gen 2 chipset, and it knows what to do with it, while undercutting many rivals in terms of price.

You want an outstanding display for less
The iQoo 11 uses the latest Samsung E6 AMOLED panel - it’s big, bright, sharp, and at 144Hz it’s also unusually fluid.

You don’t mind importing
The iQoo 11 isn’t coming to western markets, so you’ll need to be comfortable with importing if you want to buy it.

Don't buy it if...

You highly value the camera
The iQoo 11’s main camera isn’t bad by any means, but nor is it the best you can get for the money, and it struggles with indoor lighting.

You like a clean or stock Android experience
The Funtouch 13 overlay here is far from unusable, but it is busy, buggy, and bloated. It's one of the very weakest points of the iQoo 11.

You want high-end extras
Water resistance and wireless charging are fairly standard high-end smartphone features, yet neither is present here.

iQoo 11: Also consider

The iQoo 11 isn't the easiest phone to get hold of in most regions, so for alternatives, consider the following options.

Google Pixel 7

Available for about the same price – and in more markets – the Pixel 7 might not have the impressive performance and display specs of the iQoo 11, but it’s an altogether classier phone with a superior main camera.

OnePlus 10T

Also similarly priced, the OnePlus 10T lacks the cutting-edge specs of the iQoo 11, but isn’t lacking in the performance or display stakes. It’s also available in more markets, features cleaner software, and charges even faster.

Nubia Red Magic 7

If it’s a pure-bred gaming phone you’re after for this sort of money, then the Nubia Red Magic 7 goes above and beyond what the iQoo 11 has to offer. While its processor isn’t quite as impressive, its thermal system is even more extensive and its display is even more fluid, while dedicated physical controls will further enhance your gaming performance.

First reviewed: April 2023

Intel Arc A750 review: a great budget graphics card with major caveats
9:48 pm | May 22, 2023

Author: admin | Category: Computers Gadgets | Tags: , , , | Comments: Off

Intel Arc A750: Two-minute review

The Intel Arc A750 is probably the one graphics card I've most wanted to get my hands on this year, and now that I've put it through a fairly rigorous testing regime, I can honestly say I am very impressed with Intel's first effort at a discrete GPU. At the same time, it's also not an easy card to recommend right now, which is a tragedy.

First, to the good, namely the great price and stylish look of the Intel Limited Edition reference card. The Intel Arc A750 Limited Edition card has an MSRP of just $249.99 (about £200 / AU$375), and the limited number of third-party cards out there are retailing at roughly the same price. 

The Arc A750 I tested also looks spectacular compared to the reference cards from Nvidia and AMD, thanks to its matte black look, subtle lighting, and silver trim along the edge of the card. It will look great in a case, especially for those who don't need their PCs to look like a carnival.

When it comes to performance, I was most surprised by how the Arc A750 handled modern AAA games like Cyberpunk 2077 and Returnal, both of which put a lot of demands on a graphics card in order to maintain a stable frame rate. The Arc A750 handled them much better than the RTX 3050 it is ostensibly competing against. It even outperformed the RTX 3060 in many cases, putting it just under halfway between the RTX 3060 and the RTX 3060 Ti, two of the best graphics cards ever made.

The Arc A750 manages to pull this off while costing substantially less, which is definitely a huge point in its column.

An Intel Arc A750 running on a test bench

(Image credit: Future / John Loeffler)
Test system specs

This is the system we used to test the Intel Arc A750:

CPU: AMD Ryzen 9 7950X3D
CPU Cooler: Cougar Poseidon GT 360 AIO
RAM: 32GB Corsair Dominator Platinum @ 5,200MHz & 32GB G.Skill Trident Z5 Neo @ 5,200MHz
Motherboard: ASRock X670E Taichi
SSD: Samsung 980 Pro 1TB NVMe M.2 SSD
Power Supply: Corsair AX1000
Case: Praxis Wetbench

The thing about the Arc A750 is that the things it does well, it does really well, but in the areas where it flounders, like older DirectX 9 and DirectX 10 workloads, it flounders pretty badly.

It's a tale of two halves, really. Nothing exposes the issues with the Arc A750 more than its synthetic performance scores, which on average trounce the RTX 3060, 23,924 to 20,216. In that average though is its PassMark 3D score, a good measure of the card's ability to render content that wasn't just put out within the last couple of years. Here, the Arc A750 scored a dismal 9,766 to the RTX 3060's 20,786 - a 10,000 point deficit.

The story is similar when gaming, where the Arc A750 generally outperforms its rival cards, even in ray tracing, where Intel is the newcomer behind mature leader Nvidia and feisty, determined AMD. In fact, when gaming with ray tracing at 1080p, the Intel Arc A750 comes in a close second behind Nvidia's RTX 3060 8GB, 37fps on average to the 3060's 44fps.

Bump that up to 1440p, however, and the Intel Arc A750 actually does better than the RTX 3060 8GB - 33fps on average to the 3060's 29fps average. When running Intel XeSS and Nvidia DLSS, the Arc A750 averages about 56fps on max settings with full ray tracing at 1080p, while the RX 6600 can only muster 46fps on average.

These are much lower than the RTX 3060's 77fps, thanks to DLSS, but getting close to 60fps gaming with full ray tracing and max settings at 1080p is a hell of an accomplishment for the first generation of Intel discrete graphics. The Arc A750 can even keep pace with the AMD Radeon RX 6650 XT in ray tracing performance with upscaling at 1440p, getting 42fps on average.

If only this performance were consistent across every game, then there would be no question that the Intel Arc A750 is the best cheap graphics card on the market. But it is exactly that inconsistency that drags this card down. Some games, like Tiny Tina's Wonderlands, won't even run on the Arc A750, and they really, really should. How many other games out there are like Tiny Tina's? It's impossible to say, which is the heartbreaking thing about this card.

I really can't recommend people drop $250 on a graphics card that might not play their favorite games. That is simply not a problem that AMD or Nvidia have. Their performance might be rough for a few days or weeks after a game launches, but the game plays. The same can't be said of the A750, and only you, the buyer, can decide if that is worth the risk.

In the end, the Intel Arc A750 is a journeyman blacksmith's work: showing enormous potential but not of enough quality to merit selling in the shop. Those pieces are how craftspeople learn to become great, and I can very clearly see the greatness that future Arc cards can achieve as Intel continues to work on lingering issues and partners with more game developers.

It's just not there yet. As Intel's drivers improve, a lot of these issues might fade away, and the Intel Arc A750 will grow into the formidable card it seems like it should be. If you're comfortable dropping this kind of cash and taking that chance, you will still find this card does a lot of things great and can serve as a bridge to Intel's next generation of cards, Arc Battlemage, due out in 2024. 

Intel Arc A750 Price & availability

An Intel Arc A750 graphics card on a pink desk mat next to its retail packaging

(Image credit: Future / John Loeffler)
  • How much does it cost? MSRP of $249.99 (about £200 / AU$375)
  • When can you get it? It is available now
  • Where can you get it? It is available in the US, UK, and Australia, but stock may be an issue

The Intel Arc A750 is available now, starting at $249.99 (about £200 / AU$375). There are a limited number of third-party partners who also make the A750, though these tend to sell at or very close to Intel's MSRP from what I've seen.

This puts the Arc A750 on the same level price-wise as the Nvidia RTX 3050, but it definitely offers better performance, making it a better value so long as you're ok with the varying compatibility of the Arc A750 with some PC games out there.

  • Value: 4 / 5

Intel Arc A750 Specs

An Intel Arc A750 graphics card on a pink desk mat next to its retail packaging

(Image credit: Future / John Loeffler)

Should you buy the Intel Arc A750?

Buy it if...

You're looking for a cheap GPU
At $249.99, this is one of the best cheap GPUs you're going to find.

You want a stylish looking card
This card is very cool looking in a way that Nvidia and AMD reference cards simply aren't.

You want strong ray tracing and upscaling
Not only do Intel's AI cores make XeSS upscaling a serious contender, the Arc A750's ray tracing performance is quite strong.

Don't buy it if...

You are concerned about compatibility
While only one game I tested wouldn't work, that's one game too many for many gamers out there.

You're concerned about power consumption
At 225W TGP, this card soaks up way more power than a card in this class reasonably should.

Intel Arc A750: Also consider

If my Intel Arc A750 review has you considering other options, here are two more cards to consider...

How I tested the Intel Arc A750

An Intel Arc A750 graphics card on a pink desk mat next to its retail packaging

(Image credit: Future / John Loeffler)
  • I spent several days with the Intel Arc A750
  • I used the A750 in my personal PC playing games and doing creative work
  • I ran our standard battery of tests on the Arc A750

I spent several days with the Intel Arc A750 to test its gaming and creative performance, including at 1080p and 1440p. In addition to gaming, I ran our standard suite of GPU tests on it using the same system setup I use for all our graphics card tests.

Beyond my extensive computer science education and practical experience, I have been a hardware reviewer for a few years now, and a PC gamer for even longer, so I know how well graphics cards are supposed to perform with a given set of specs.

Read more about how we test

First reviewed May 2023

Dimensity 9200+ brings higher CPU and GPU clocks, promises lower power usage
2:31 pm | May 10, 2023

Author: admin | Category: Mobile phones news | Tags: , | Comments: Off

MediaTek is on a roll and has unveiled its third chipset this month. Today’s item is the Dimensity 9200+, which schooled the Snapdragon 8 Gen 2 in preliminary Geekbench 6 tests. As the name suggests, this is a boosted version of the Dimensity 9200 chipset from last year. It’s still fabbed on TSMC’s N4P node (4nm, second gen) but is able to run its CPU and GPU at higher clock speeds. This includes all three CPU clusters, which promises a 10% uplift over the original version of the chip.

The new Dimensity 9200+ chipset at a glance

As for the ARM Immortalis G715 GPU, MediaTek...

Acer Chromebook Enterprise Vero 514 Review
2:31 pm | May 2, 2023

Author: admin | Category: Computers Gadgets | Tags: , , | Comments: Off

The Chromebook Vero 514 Enterprise edition is an impressive Chromebook with some serious power under the keyboard, hindered only by the limitations of ChromeOS. 

This Chromebook performs exceptionally well as a business laptop and has a keyboard we enjoy using for extended periods. Our model came with the Chrome Enterprise Upgrade, showcasing Acer's drive to make this a proper business computer, with fleet management and more included compared to the standard Chromebook Vero 514.

Unboxing and First Impressions

Initially, unboxing the computer was a typical process involving a box within a box and some packaging around the computer. However, upon examining the packaging, we discovered that all the packing materials were recyclable and made from recycled materials, which is excellent to see.

In the same vein, the box that Acer wraps around the power brick and charging cable can fold together to create a laptop riser, fitting wonderfully under the back of the Vero 514 and giving it a lift.

Another thing we noticed right away was the unique texture and coloring of the Chromebook Vero 514. The speckled grey color looks better in person than in pictures, and rather than looking cheap like we have seen with some past recycled computers, this one seems intentional. Acer calls this chassis its "Cobblestone Gray Finish," which includes 30% PCR plastic, and we don't mind it. Granted, it doesn't look like a luxury item, but it still looks well-built and of high quality.

Acer Chromebook Vero 514 Enterprise

Acer Chromebook Vero 514 Enterprise PCR (Post Consumer Recycled) badging (Image credit: Collin Probst // Future)

Design and Build Quality

The focus on recycled materials continues throughout the Vero 514. The keycaps are made of 50% PCR (post-consumer recycled) content, the screen itself is 99% recyclable, and the trackpad is made from 100% ocean-bound plastic. Acer calls this trackpad its OceanGlass touchpad, and we found it quite responsive and enjoyable to use. The overall chassis of this laptop is entirely paint-free, giving it a unique look and feel.

While we initially expected the build quality of the Vero 514 to allow for some flex and cheap-feeling materials, we are pleasantly surprised by just how sturdy this laptop is in daily use. It's a laptop we can actually use, rather than one we keep carefully placed on a desk, afraid to take it to work or on the go.

In Use

Having used this Chromebook for the last few weeks, we have been wildly impressed with its speed. While somewhat hindered by ChromeOS, the Vero 514 is snappy for those who only need a Chrome browser to complete their work. We were also pleasantly surprised by the battery. While we couldn't run our standard benchmark software of choice because this computer doesn't run a full version of Windows, we tested it through daily use and regularly achieved 8-10 hours with standard settings while doing basic work - nothing too demanding.

Specs

Display: 14-inch (1920 x 1080), 16:9

Brightness: 300 nits

CPU: 12th Generation Intel Core i7

GPU: Intel Iris Xe Graphics

Memory: 16GB

Storage: 256GB SSD

Ports: 2x USB-C 3.2 (10Gb/s), 1x USB-A 3.2, 1x HDMI, 1x 3.5mm headphone/speaker/line-out port

Battery: 56Wh (10hrs)

OS: ChromeOS

Weight: 3.09lb / 1.4kg

Dimensions (W x D x H): 12.81 x 8.83 x 0.80in / 325.4 x 224.3 x 20.4mm

We found the Vero 514's 14-inch screen to be an excellent size for an enterprise laptop. We could see all the content we wanted to (understanding that it's a laptop and not one of our large ultrawide monitors) and, at the same time, did not feel like we were carrying around anything that was ridiculously large.

While using the Vero 514, we noticed the audio quality could have been better. It got the job done for virtual meetings or the occasional quick video, but we frequently reached for headphones for music, or any time we were in a long meeting and there was a decent amount of background noise.

Acer Chromebook Vero 514 Enterprise

Acer Chromebook Vero 514 Enterprise Left side ports (Image credit: Collin Probst // Future)

We chose the Enterprise Vero model, which includes an i7 upgrade, 16GB RAM, an anti-glare Corning Gorilla Glass touch display, and more. This bump-up in specs shows that Acer is genuinely trying to make the Vero 514 a reasonable, quality option for a business fleet of computers - which it could well be, depending on the workforce and the employees' tasks.

Acer Chromebook Vero 514 Enterprise

Acer Chromebook Vero 514 Enterprise right side ports (Image credit: Collin Probst // Future)

The last thing we will mention about the Vero 514 is the overall display experience. It's good, but not great. It's a perfectly serviceable screen for indoor use, but the brightness can't quite keep up once you get outside, and it gets tough to see. Again, if you or your business use these indoors, the 1920 x 1080 display will be fine for most tasks.

Acer Chromebook Vero 514 Enterprise

Acer Chromebook Vero 514 Enterprise fully opened (Image credit: Collin Probst // Future)

Final Verdict

The Vero 514 is a snappy Chromebook model that is a solid contender for writers, web browsers, email responders, and Google Workspace lovers. Essentially, anyone who works from the web could look into this Chromebook as a wonderful option to upgrade their old laptop or find one that works better for what they do.

