ARM unveils the Cortex-X4, its fastest CPU yet, Cortex-A720 and A520 and 5th gen GPUs join it
1:19 pm | May 29, 2023


ARM’s big.LITTLE CPU configurations have been around for years and the standard practice so far has been to have as many – or more – little cores as there are big cores. This is now changing with the introduction of ARM’s new designs. The high-power Cortex-X4, middle Cortex-A720 and small Cortex-A520 were unveiled today and will work in tandem to balance between performance and efficiency. ARM also introduced the DSU-120, which is what drives the DynamIQ Shared Unit system that allows different CPU cores to work together. With the DSU-120, chipset makers can build designs with up...

AMD Radeon RX 7600: a major gift for gamers on a budget
4:00 pm | May 24, 2023


AMD Radeon RX 7600: Two minute review

The AMD Radeon RX 7600 is probably the best 1080p graphics card you can buy right now, and in all honesty, it should be the last of its kind.

Team Red has been a bit gun-shy of late with its graphics card offerings, with the last graphics card we saw being the AMD Radeon RX 7900 XT. While that was a great card, it launched almost half a year ago, and we haven't heard much from AMD since. 

Meanwhile, its rival has released a steady stream of cards, and at this rate, it's almost through its main GPU stack at this point, so it's interesting that AMD chose to release a very budget-friendly midrange card rather than go down the list of higher-end offerings the way Nvidia has.

In a way, it's a very smart strategy (and one I actually recommended back in February), and with the Radeon RX 7600 going on sale on May 25, 2023, for just $269 (about £215/AU$405), AMD manages to make it to market with its all-important midrange offering at least a full month ahead of Nvidia's competing RTX 4060 while also managing to undercut its rival on price.

In terms of performance, the RX 7600 is a major improvement over the AMD Radeon RX 6600 it replaces, while also generally outperforming the competing Intel Arc A750. It does fall short of the RTX 3060 overall, but not by much, and a lot of that is relative to ray tracing performance, which isn't great on either card to begin with, so this advantage looks bigger than it really is in practice.

If there is one knock against the RX 7600, it's power draw: at 165W TGP, it pulls more than the 8GB Nvidia GeForce RTX 4060 Ti and about 33W more than the RX 6600, so this is definitely the wrong direction for AMD to be going in, power-wise.

AMD also has to step up its game when it comes to FSR. Nvidia's most recent launch, the RTX 4060 Ti, was a fairly disappointing card in terms of baseline performance, but there's no denying that DLSS 3, especially with Frame Generation, is a huge value-add for Team Green. And while DLSS 3 is only available in about 50 games, FSR 2 is more widely adopted at about 120 games, but DLSS 2.0 is available in more than 200 games, so AMD has some catching up to do.

When it finally does, the RX 7600 will be an even better buy for midrange gamers, and while it's a sad state of affairs that $269 is about as "budget" as we can hope to see for a while, it's a substantially better value than just about any card on the market right now.

That might change when the RTX 4060 lands, but given that the baseline performance of the RTX 4060 is expected to be about 20% better than that of the RTX 3060, I expect it will fall in pretty close to where the RX 7600 currently is, only with a more expensive MSRP and no Founders Edition to keep third-party partners honest on price.

So unless the RTX 4060 pulls a rabbit out of a hat, I still expect the AMD Radeon RX 7600 to hold the edge over its rival on value, which at this price point is the only thing that really matters. As it stands, it is the best cheap graphics card you can buy right now, and I expect that will remain the case for the rest of this generation.

AMD Radeon RX 7600: Price & availability

An AMD Radeon RX 7600 on a desk

(Image credit: Future / John Loeffler)
  • How much is it? MSRP listed at $269 (about £215/AU$405)
  • When is it out? It goes on sale May 25, 2023
  • Where can you get it? You can buy it in the US, UK, and Australia

The AMD Radeon RX 7600 goes on sale on May 25, 2023, with an MSRP of $269 (about £215/AU$405), making it the cheapest card of this generation to launch. Not only that, it's a substantial price drop from the Radeon RX 6600, which launched at $329 (about £265/AU$495), so you're getting a much better graphics card for almost 20% less. This is more like it!

Ostensibly, the rival to the RX 7600 is the RTX 4060, but since that card has yet to launch, we can only really compare it to the last-gen midrange offerings from Nvidia and Intel.

The Nvidia RTX 4060 when it launches will sell for $299 (about £240/AU$450), which is 9% cheaper than the RTX 3060's official MSRP of $329. The RX 7600 has a cheaper MSRP than either of those, but I expect that the RTX 3060 especially will see some heavy discounting as a result of both the RTX 4060 and the RX 7600, so the value proposition of the RX 7600 might shift depending on what SKU you're looking at.

The RX 7600 does come in slightly more expensive than the Intel Arc A750, and while you might do a double-take at the mention of Intel, the Arc A750 can give the RX 7600 a run for its money at times, so you definitely can't write it off completely.

AMD Radeon RX 7600: Features and chipset

An AMD Radeon RX 7600 on a desk

(Image credit: Future / John Loeffler)
  • More ray tracing cores and new AI cores
  • Higher TGP

With the move to RDNA 3, the AMD Radeon RX 7600 starts off on a 6nm TSMC process over the RX 6600's 7nm, which gives the RX 7600 a roughly 20% jump in the number of transistors it has to work with (13.3 billion to 11.1 billion). And even though the actual GPU die on the RX 7600 is about 14% smaller than that of the RX 6600, it manages to pack in four additional compute units for a total of 32 compared to the RX 6600's 28.

This is also a more mature architecture, so the 2,048 stream processors (a roughly 14% increase over the RX 6600) are more performant, and the second-generation ray accelerators are a huge improvement over the first-gen RAs in the RX 6600.

The RX 7600 also has faster clocks than the RX 6600, with a boost clock improvement of about 6%, but the big improvement comes with the memory clock speed, which is 2,250MHz for the RX 7600 and 1,750MHz for the RX 6600. This means a nearly 30% boost to memory speed, so even though the RX 7600 is still rocking the same 8GB GDDR6 VRAM on a 128-bit bus as the RX 6600, it has an 18 Gbps effective memory speed compared to 14 Gbps for the RX 6600.
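
If you want to sanity-check those gen-on-gen claims, the arithmetic is simple. Here's a minimal Python sketch (the helper functions are my own, assuming the usual GDDR6 convention that the effective data rate is 8x the quoted memory clock):

    # Reproducing the RX 7600 vs RX 6600 figures quoted above.
    def gddr6_effective_gbps(mem_clock_mhz):
        # GDDR6 moves 8 bits per pin per memory-clock cycle
        return mem_clock_mhz * 8 / 1000

    def bandwidth_gbs(effective_gbps, bus_width_bits):
        # total bandwidth = per-pin rate x bus width, in bytes
        return effective_gbps * bus_width_bits / 8

    rx7600 = gddr6_effective_gbps(2250)   # 18.0 Gbps, as quoted
    rx6600 = gddr6_effective_gbps(1750)   # 14.0 Gbps, as quoted
    print(rx7600 / rx6600 - 1)            # ~0.286, the "nearly 30%" memory boost
    print(bandwidth_gbs(rx7600, 128))     # 288.0 GB/s on the 128-bit bus
    print(13.3e9 / 11.1e9 - 1)            # ~0.20, the "roughly 20%" transistor jump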

There is also the addition of 64 AI accelerators in the RX 7600, which the RX 6600 simply didn't have. This means that features like Radeon Super Resolution (RSR) will run better than they did on the RX 6600, and it enables more advanced AI workloads like generative AI content creation.

All this does come at a cost in power, though, as the RX 7600 has a 25% higher TGP than the RX 6600. That isn't good, and given that Nvidia's cards are typically getting better performance with less power gen-on-gen, it's definitely the wrong direction for AMD to be going in. The card is still "reasonable" when it comes to your PSU, though: AMD recommends at least a 550W PSU for the RX 7600, which still keeps most builds under 600W overall.
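
For what it's worth, the power figures quoted in this review are consistent with each other, as a quick check shows (my own arithmetic, not AMD's spec sheet):

    # The "25% higher TGP" and "about 33W more" claims describe the same gap.
    rx7600_tgp = 165                  # watts, as quoted above
    rx6600_tgp = rx7600_tgp / 1.25    # 132W, implied by the 25% figure
    print(rx7600_tgp - rx6600_tgp)    # 33.0, matching the "about 33W more" claim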

AMD Radeon RX 7600: design

An AMD Radeon RX 7600 on a desk

(Image credit: Future / John Loeffler)

The AMD reference card for the Radeon RX 7600 is a compact dual-fan number that will fit in just about any case. This is a dual-slot card, but it's just over eight inches long and a little over four inches tall, so it's great for mini-tower builds, and with just a single 8-pin power connector, you won't have any issues with cable management here.

In terms of outputs, we get three DisplayPort 2.1 ports and a single HDMI 2.1a port, though no USB-C output. Honestly, having DisplayPort 2.1 is nice but largely unnecessary: with just 8GB VRAM, there is no universe where this card outputs 8K video that doesn't devolve into a slow sequence of still images, so it's a nice-to-have that you are almost guaranteed never to use. Far be it from me to be a buzzkill, though, so if you want to push this card at 8K, do let me know how that turns out.

As for the lack of USB-C, this really isn't a creative card, so this isn't something that you should worry about unless you have one of the best USB-C monitors and nothing else. Even then, I recommend looking further up the stack (like the AMD Radeon RX 7900 XT), since USB-C monitors are almost universally for creative pros and this card isn't going to cut it for the kind of work you'll need to do with it.

In terms of its actual aesthetics, like the two RDNA 3 cards before it, the RX 7600 eschews RGB and features a matte black design with subtle accent touches, like the red stripes on the fins of the heat sink that would be visible in a case. Overall, it's a cool-looking card, especially for those not looking to have excessive RGB lighting up everything in their build.

AMD Radeon RX 7600: Performance

An AMD Radeon RX 7600 on a desk

(Image credit: Future / John Loeffler)
  • Best-in-class 1080p rasterization performance
  • Much improved ray tracing performance
  • Can manage some decent 1440p performance, especially without ray tracing

Given the missteps Nvidia has been making lately, AMD has a real shot of taking some market share if it can offer compelling performance for gamers. Fortunately for Team Red, the AMD Radeon RX 7600 manages to pull off quite a coup when it comes to gaming performance.

Test system specs

This is the system we used to test the AMD Radeon RX 7600:

CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO
RAM: 64GB Corsair Dominator Platinum RGB DDR5-6600MHz
Motherboard: MSI MAG Z790 Tomahawk Wifi
SSD: Samsung 990 Pro 2TB NVMe M.2 SSD
Power Supply: Corsair AX1000
Case: Praxis Wetbench

For the most part, the RTX 4060 will be the RX 7600's main competition, but with that card not yet out and the Nvidia RTX 4060 Ti just released, the Ti is the natural comparison at the moment. Is that necessarily fair? No, it's not, and the RX 7600 does lose out to the RTX 4060 Ti on just about every measure, but it really doesn't lose that badly.

In rasterized workloads at 1080p, the RX 7600 is only about 12% slower than the RTX 4060 Ti, and only about 13% slower at 1440p. This changes drastically as soon as you start factoring in ray tracing and upscaling, but it's something I definitely wasn't expecting. Against the RTX 3060 Ti, the RX 7600 fares better, obviously, and generally it outperforms the RTX 3060 in rasterization workloads.

Against its predecessor, the RX 7600 delivers the kind of gen-on-gen improvement I was really expecting to see from the RTX 4060 Ti and didn't get. The RX 7600's rasterization performance is great, but its improved ray accelerators really outshine what the RX 6600 is capable of, and they make ray tracing at this price point genuinely accessible to the midrange.

Synthetic Benchmarks

In synthetic benchmarks, the RX 7600 roundly beats its predecessor as well as the RTX 3060, outperforming the RX 6600 by about 19% and the RTX 3060 by about 18% overall.

Digging into the results a bit further though, we can see some of the biggest gains come in ray-traced workloads like Port Royal, where the RX 7600 saw a 33% improvement over the previous gen.

The only benchmark where the RX 7600 comes up short is Speed Way, a 1440p ray tracing benchmark. Here, the RTX 3060 barely edges out the RX 7600 by just 219 points, which is close enough to call it a wash.

Gaming Benchmarks

As you can see, when it comes to general rasterization performance at 1080p, the RX 7600 is the hands-down winner, falling to the RTX 3060 only in Counter-Strike: Global Offensive, and only then by the barest of margins. Everywhere else, you can expect roughly 15-20% better performance out of the RX 7600 overall.

Things take a bit of a turn when it comes to ray tracing performance, but the results here are a bit deceptive. Cyberpunk 2077 is Nvidia's major showcase game, and it is very well optimized for Nvidia cards, so the ray tracing performance of the RTX 3060 is substantially better than that of either AMD card. Take Cyberpunk 2077 out of the mix, however, and the RX 7600 actually outperforms the RTX 3060 in ray tracing performance.

It's not all good news for AMD, though: the minimum fps for the RX 7600 in both Returnal and Cyberpunk 2077 is in the single digits, and not just for a brief moment, but with fairly regular dips into slideshow territory, especially around volumetric fog with applied lighting effects.

It's a similar story when you apply upscaling to either Cyberpunk 2077 or Returnal: the RTX 3060's DLSS 2.0 is simply better optimized for the former, and the RX 7600 struggles on minimum fps in the latter. So even though the average fps in Returnal looks north of 60 fps, you'll dip as low as 6 fps on the Quality FSR preset or 15 fps on the Ultra Performance preset, and trust me, it's noticeable.

Of course, turn ray tracing off and you probably won't have this issue, but that's a series of settings compromises you'll have to weigh for yourself. Overall, though, the AMD Radeon RX 7600 manages to perform well above where you would expect from this generation at this price point. If you're looking for an outstanding and reasonably cheap 1080p graphics card, you can't go wrong with this one.

Should you buy the AMD Radeon RX 7600?

An AMD Radeon RX 7600 on a desk

(Image credit: Future / John Loeffler)

Buy it if…

Don’t buy it if…

Also consider

An AMD Radeon RX 7600 on a desk

(Image credit: Future / John Loeffler)

If my AMD Radeon RX 7600 review has you considering other options, here are two other graphics cards to consider.

How we test graphics cards

I spent several days with the RX 7600 running benchmarks, playing games, and generally measuring its performance against competing cards.

I paid special attention to its 1080p performance, since 1080p gamers are the main target audience for this card, while also stretching into 1440p gaming.

Having covered and tested many graphics cards in my career, I know how a graphics card should perform at this level and what you should be spending for this level of performance. 

Read more about how we test

Nvidia GeForce RTX 4060 Ti: A DLSS and ray tracing upgrade, but not much else
4:00 pm | May 23, 2023


Nvidia GeForce RTX 4060 Ti: two minute review

The Nvidia GeForce RTX 4060 Ti is without a doubt one of the most anticipated graphics card launches of this generation, and now that it's here, it should be an easy win for Nvidia over archrival AMD. I wish that were the case.

That's not to say that the RTX 4060 Ti doesn't hold up well against AMD's midrange offerings at the moment; it absolutely does, and there's no question that the features this card brings to the table are impressive, especially DLSS 3, which is appearing on a properly midrange GPU (under $500/£500/AU$750) for the first time.

It goes without saying that Nvidia is leaning into DLSS 3 as its biggest selling point, and as I'll get into later, it definitely delivers significantly better performance than the RTX 4060 Ti should otherwise be capable of given its specs — even factoring in the expanded cache, which widens the card's effective memory bandwidth despite its having just 8GB GDDR6 VRAM to work with.

The decision to go with 8GB VRAM for this card — a 16GB VRAM variant is going to be released in July 2023 for an MSRP of $499 (about £400/AU$750) — is probably the only thing that kept the price on this card under $400. With an MSRP of $399 (about £320/AU$600), the Nvidia Founders Edition RTX 4060 Ti 8GB is the same price as the Nvidia GeForce RTX 3060 Ti it is replacing, and generally, it offers a pretty good value for that money, with some caveats.

In terms of native, non-DLSS performance, there isn't a whole lot of improvement over the previous generation, which is definitely going to disappoint some, if not many. Given the kinds of performance advances we've seen with higher-end cards, we were hoping to see that extend down into the heart of the midrange, but it seems those benefits generally stop at the Nvidia GeForce RTX 4070.

Instead, you have a card that relies very heavily on DLSS to carry its performance over the line, and where it works, it is generally phenomenal, offering a real, playable 1440p gaming experience and even brushing up against some decent 4K performance with the right settings.

This is something AMD has really struggled to match with FSR, so Nvidia has a real chance to score a major blow against AMD here. But as we'll see, the best AMD graphics card in last generation's midrange, the AMD Radeon RX 6750 XT, actually outperforms the RTX 4060 Ti in non-ray-tracing workloads, including gaming, and that does not bode well for Nvidia once AMD releases its current-gen midrange cards.

This is somewhat exacerbated by the fact that the RTX 4060 Ti's ability to use its new features is fairly limited, and while features like DLSS 3 with Frame Generation are available on the best PC games like Cyberpunk 2077 and Returnal, as of the launch of the RTX 4060 Ti, there are only 50-ish games that support DLSS 3. 

This list will surely grow over time, but you certainly won't get this kind of support in games that are just recent enough to push the RTX 4060 Ti in terms of performance while being just old enough that you'll never see a DLSS 3 patch for them.

I can say that if you're coming from an RTX 2060 Super or older, then this card is absolutely going to blow your mind. It's effectively the RTX 3060 Ti's NG+, so if you missed what I consider to be the best graphics card of the last generation, you'll get all that and more with the RTX 4060 Ti. If you're coming from an Nvidia Ampere card, though, especially one higher up the stack than the RTX 3060 Ti, chances are you'll find this is really a downgrade with some neat features to soften the blow.

Nvidia GeForce RTX 4060 Ti: Price & availability

An Nvidia GeForce RTX 4060 Ti

(Image credit: Future / John Loeffler)
  • How much is it? MSRP listed at $399 (about £320/AU$600)
  • When is it out? It is available starting May 24, 2023
  • Where can you get it? You can buy it in the US, UK, and Australia

The Nvidia GeForce RTX 4060 Ti 8GB is available starting May 24, 2023, with an MSRP of $399 (about £320/AU$600). This is the same launch price as the Nvidia RTX 3060 Ti that this card is replacing, so we're glad to see that Nvidia didn't increase the price on this variant this generation.

This also puts it on par with the AMD Radeon RX 6750 XT, which initially launched for $549 (about £440/AU$825), but which you can find under $400 right now, even without discounts, at major retailers. AMD hasn't released an RX 7700 XT yet, which would be this card's more natural competition, so comparing the RX 6750 XT and the RTX 4060 Ti isn't really fair, but it's all we have for now until AMD launches its RTX 4060 Ti challenger.

Nvidia GeForce RTX 4060 Ti: Features and chipset

An Nvidia GeForce RTX 4060 Ti

(Image credit: Future / John Loeffler)
  • Only uses one 8-pin...but still requires a 16-pin converter?
  • 3rd-generation ray tracing and 4th-generation tensor cores
  • 288.0 GB/s memory bandwidth, but 32MB L2 cache boosts effective bandwidth to 554.0 GB/s (according to Nvidia)

The Nvidia RTX 4060 Ti is the first properly midrange Nvidia Lovelace graphics card, and so it is built on TSMC's 5nm process, with about 22.9 billion transistors across 34 streaming multiprocessors (SM), which come with 128 shader cores (CUDA), 4 fourth-generation tensor cores, and 1 third-generation ray tracing core per SM.

The base clock is a solid 2,310MHz, about a 64% improvement over the RTX 3060 Ti's 1,410MHz, while the boost clock of 2,535MHz is about 52% faster than the RTX 3060 Ti's 1,665MHz.

The biggest difference between the two cards is the memory bus. The Nvidia RTX 4060 Ti uses a 128-bit bus for its 8GB GDDR6 VRAM, while the RTX 3060 Ti uses a 256-bit bus for the same amount of VRAM. The RTX 4060 Ti's faster memory clock (2,250MHz) gives it a slightly faster effective memory speed of 18 Gbps to the RTX 3060 Ti's 15 Gbps, while the expanded 32MB L2 cache is what lets Nvidia claim a much higher effective memory bandwidth despite the narrower bus.
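
To put some numbers behind that, here's a minimal Python sketch (variable names are my own) deriving the headline figures quoted in this review from the published configuration. Note that the 554.0 GB/s "effective" bandwidth is Nvidia's own estimate of what the L2 cache hits are worth, and can't be derived from the bus alone:

    # Deriving the RTX 4060 Ti figures quoted in this review.
    sms = 34                        # streaming multiprocessors
    cuda_cores = sms * 128          # 4,352 shader cores
    tensor_cores = sms * 4          # 136 fourth-gen tensor cores
    rt_cores = sms * 1              # 34 third-gen ray tracing cores

    raw_bandwidth = 18 * 128 / 8    # 288.0 GB/s: 18 Gbps on a 128-bit bus
    base_uplift = 2310 / 1410 - 1   # ~0.64, the "about 64%" base clock jump
    boost_uplift = 2535 / 1665 - 1  # ~0.52, the "about 52%" boost clock jump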

Still, this really does smack of over-engineering. The move to a 128-bit bus doesn't seem necessary, and given what we've seen of other Lovelace cards, like the Nvidia GeForce RTX 4070, I definitely think Nvidia could have stuck with a higher bus width and it wouldn't be catching nearly the grief it is getting over this. 

What's more, even though the performance of the RTX 4060 Ti is better than the RTX 3060 Ti's, I really think that had this card kept the same bus width as the RTX 3060 Ti, it would absolutely demolish anything that approached it in the midrange. As it stands, the RTX 4060 Ti is great, but it fails to score the knockout it really needed.

It is worth mentioning, though, that this card also uses significantly less power than the RTX 3060 Ti. That card had a TGP of 200W, while the RTX 4060 Ti 8GB comes in at 160W, a 20% reduction in power draw. This is great for keeping your build under 600W, and it's a move in the right direction that deserves to be praised.

Nvidia GeForce RTX 4060 Ti: Design

An Nvidia GeForce RTX 4060 Ti

(Image credit: Future / John Loeffler)

The Nvidia RTX 4060 Ti Founders Edition keeps the same black heatsink with chrome trim as the other Founders Edition cards this generation, and — unfortunately — it also sticks with the 12VHPWR 16-pin power connector. Fortunately, you only need to plug a single 8-pin into it, so it is at least somewhat easier to manage in a case.

Also easier to manage is the size of the card. Using the same dual-fan design as previous Founders Edition cards, the RTX 4060 Ti shrinks things down a bit. While it's still a dual-slot card, it comes in at just under 9.5 inches long and 4.5 inches tall, making it the kind of card that will easily fit in your case.

There's not much flash here, but that's a given with the Founders Edition, so if you're looking for visual bells-and-whistles like RGB or super-cooling features like a triple fan design, you're going to want to look at any of the third-party cards that release alongside this one for those.

Nvidia GeForce RTX 4060 Ti: Performance

An Nvidia GeForce RTX 4060 Ti

(Image credit: Future / John Loeffler)
  • DLSS 3 is the real draw here
  • Improved ray tracing performance
  • Baseline performance not much improved over the RTX 3060 Ti

When it comes to performance, the Nvidia GeForce RTX 4060 Ti really leans on DLSS 3 for most of its major performance gains, and while this can be substantial, some are going to feel somewhat disappointed.

Test system specs

This is the system we used to test the Nvidia GeForce RTX 4060 Ti:

CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO
RAM: 64GB Corsair Dominator Platinum RGB DDR5-6600MHz
Motherboard: MSI MAG Z790 Tomahawk Wifi
SSD: Samsung 990 Pro 2TB NVMe M.2 SSD
Power Supply: Corsair AX1000
Case: Praxis Wetbench

This is largely because even with the introduction of some pretty advanced tech, there aren't a lot of games out right now that can really leverage the best features of this card. While three of the games I use as benchmarks — F1 2022, Cyberpunk 2077, and Returnal — all feature frame generation, these are also three of the latest games out there from major studios that have the time and staffing to implement DLSS 3 with Frame Generation in their games. 

There is a DLSS 3 plug-in coming for Unreal Engine, which should definitely expand the number of games that feature the tech, but that's still a ways off, and it will take time for it to trickle down to the gamers who will actually be using this card.

I'll get more into DLSS 3 and Frame Generation in a bit, but a quick glance over the underlying architecture for the RTX 4060 Ti tells something of a story here, as shown by synthetic benchmarks using 3DMark and Passmark 3D.

Synthetic Benchmarks

As you can see, the RTX 4060 Ti beats out the RTX 3060 Ti, but only just barely, getting about 11% better performance than the card it's replacing. That's okay, I guess, but hardly the generational leap that previous Lovelace cards have been making.

For example, the RTX 4070 offers a roughly 21% jump over the RTX 3070 on these same synthetic benchmarks. In fact, this puts the RTX 4060 Ti just ahead of the RX 6750 XT, and ultimately just behind the RTX 3070 in terms of raw performance.

As a gaming card, the performance outlook is better, but not by a whole lot overall.

Gaming Benchmarks

On games with heavy effects-based visuals like Metro: Exodus and Cyberpunk 2077 where the advanced architecture of the RTX 4060 Ti can be leveraged, it does edge out the competition, sometimes. The RX 6750 XT still manages a slightly better fps on Returnal at 1080p, on average, when not using ray tracing or upscaling tech, for example. 

The RTX 4060 Ti also gets crushed in CS:GO at 1080p, relatively speaking, which I chalk up entirely to pushing textures through the smaller memory bus of the RTX 4060 Ti. The 192-bit bus on the RX 6750 XT's 12GB GDDR6 VRAM and the 256-bit bus on the RTX 3060 Ti's 8GB GDDR6 really show up in cases like this.

Things do start to turn in the RTX 4060 Ti's favor once you start fiddling with ray tracing. The third-generation ray tracing cores on the RTX 4060 Ti are definitely much more capable than the RTX 3060 Ti's and especially more than the RX 6750 XT's, which are second-generation and first-generation cores, respectively.

The RTX 4060 Ti is the first midrange card I've tested that is going to give you playable native ray-traced gaming at 1080p consistently on max settings, though it will struggle to get to 60 fps on more demanding titles like Cyberpunk 2077.

But let's be honest: nobody is playing any of these games with native-resolution ray tracing; you're going to be using an upscaler (and if you aren't, then you really need to start).

Here, the performance of Nvidia's DLSS really shines over AMD's FSR, even without frame generation. In both Cyberpunk 2077 and Returnal, the RTX 4060 Ti can get you over 120 fps on average when using the DLSS Ultra Performance preset, and if you want things to look their best, you can easily get well north of 60 fps on average with every setting maxed out, even ray tracing.

Now, one of the things that the wider memory bus on the RTX 3060 Ti gave that card was faster throughput when gaming at 1440p. Not every game is going to run great at 1440p, but for a lot of them, you're going to be able to get a very playable frame rate.

The RTX 4060 Ti improves over the RTX 3060 Ti here, but not nearly as much as it should, and on games like F1 2022 and CS:GO where that memory bandwidth difference is going to show, well, it shows up here at 1440p, too.

Of course, once you turn on ray tracing, most games are going to turn into a slide show, but unsurprisingly, the RTX 4060 Ti manages to score a win here on every ray-traced game I tested.

That said, you are really pushing it here on these settings, and you're better off using upscaling if you're going to go for 1440p, especially with settings turned up.

The biggest win for the RTX 4060 Ti here is with Cyberpunk 2077, where it manages 67% better performance at max quality settings than the RX 6750 XT, but maddeningly, it's only about 13% better than the RTX 3060 Ti on the quality preset. On ultra performance, the RTX 4060 Ti is about 52% better than the RX 6750 XT, but again, only 13% better than the RTX 3060 Ti.

When it comes to Returnal, the RX 6750 XT is essentially tied with the RTX 4060 Ti on the quality preset for FSR 2.1 and DLSS, respectively. Bump this up to ultra performance, and the RTX 4060 Ti does better, beating out the RX 6750 XT by about 22% and the RTX 3060 Ti by about 17%.

I imagine the RTX 4060 Ti will perform more or less the same across most games that still rely on DLSS 2.0, which number more than 200. For those games that really leverage DLSS 3 with Frame Generation though, it really is another story entirely.

With Frame Generation, you can get a 40-60% performance improvement in games that support it. That isn't nothing, since it can even get Cyberpunk 2077 running at a playable frame rate at 4K on the ultra performance preset. The RTX 3060 Ti and RX 6750 XT really have no answer to this, so they are going to lag behind considerably in any game that gets DLSS 3 with Frame Generation.

Does Frame Generation increase latency in some titles, along with other issues? Sure. Will it matter to gamers who get to play Cyberpunk 2077, Returnal, and other titles that run like they're on an RTX 3080 Ti? Probably not.

Will any of this matter to anyone who doesn't play those games? Obviously not. And that is ultimately the issue with this card. For what it does well, it has no peer at this price, but if you already have an RTX 3060 Ti, then there is really very little reason to upgrade to this card. Hell, if you have an RX 6750 XT, you might feel like you're better off just waiting to see what AMD has in store for the RX 7700 XT, and I would not blame you in the slightest. 

This isn't a whiff by Team Green by any means, but there's no getting around the fact that the performance of the Nvidia GeForce RTX 4060 Ti absolutely leaves a massive opening for AMD to exploit in the coming months with the RX 7700 XT, or even the RX 7650 XT.

Should you buy the Nvidia GeForce RTX 4060 Ti?

An Nvidia GeForce RTX 4060 Ti

(Image credit: Future / John Loeffler)

Buy it if...

Don’t buy it if…

Also consider

How I tested the Nvidia GeForce RTX 4060 Ti

I spent several days with the RTX 4060 Ti running benchmarks, playing games, and generally measuring its performance against competing cards.

I paid special attention to its DLSS 3 Frame Generation technology, since this is one of the card's biggest selling points, and played several games at length with the tech turned on.

Having covered and tested many graphics cards in my career, I know how a graphics card should perform at this level.

Read more about how we test

Intel Arc A750 review: a great budget graphics card with major caveats
9:48 pm | May 22, 2023


Intel Arc A750: Two-minute review

The Intel Arc A750 is probably the one graphics card I've most wanted to get my hands on this year, and now that I've put it through a fairly rigorous testing regime, I can honestly say I am very impressed with Intel's first effort at a discrete GPU. At the same time, it's also not an easy card to recommend right now, which is a tragedy.

First, to the good, namely the great price and stylish look of the Intel Limited Edition reference card. The Intel Arc A750 Limited Edition card has an MSRP of just $249.99 (about £200 / AU$375), and the limited number of third-party cards out there are retailing at roughly the same price. 

The Arc A750 I tested also looks spectacular compared to the reference cards from Nvidia and AMD, thanks to its matte black look, subtle lighting, and silver trim along the edge of the card. It will look great in a case, especially for those who don't need their PCs to look like a carnival.

When it comes to performance, I was most surprised by how the Arc A750 handled modern AAA games like Cyberpunk 2077 and Returnal, both of which put a lot of demands on a graphics card in order to maintain a stable frame rate. The Arc A750 handled them much better than the RTX 3050 it is ostensibly competing against. It even outperformed the RTX 3060 in many cases, putting it just under halfway between the RTX 3060 and the RTX 3060 Ti, two of the best graphics cards ever made.

The Arc A750 manages to pull this off while costing substantially less, which is definitely a huge point in its column.

An Intel Arc A750 running on a test bench

(Image credit: Future / John Loeffler)
Test system specs

This is the system we used to test the Intel Arc A750:

CPU: AMD Ryzen 9 7950X3D
CPU Cooler: Cougar Poseidon GT 360 AIO
RAM: 32GB Corsair Dominator Platinum @ 5,200MHz & 32GB G.Skill Trident Z5 Neo @ 5,200MHz
Motherboard: ASRock X670E Taichi
SSD: Samsung 980 Pro 1TB NVMe M.2 SSD
Power Supply: Corsair AX1000
Case: Praxis Wetbench

The thing about the Arc A750 is that what it does well, it does really well, but where it flounders, like older DirectX 9 and DirectX 10 workloads, it flounders pretty badly.

It's a tale of two halves, really. Nothing exposes the issues with the Arc A750 more than its synthetic performance scores, which on average trounce the RTX 3060's, 23,924 to 20,216. Inside that average, though, is its PassMark 3D score, a good measure of a card's ability to render content that wasn't put out within the last couple of years. Here, the Arc A750 scored a dismal 9,766 to the RTX 3060's 20,786, a deficit of more than 10,000 points.
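
To see just how lopsided those two results are, here's a quick check in Python using only the scores quoted above:

    # How one weak component can hide inside a winning average.
    arc_avg, rtx_avg = 23924, 20216      # overall synthetic averages
    arc_pm3d, rtx_pm3d = 9766, 20786     # PassMark 3D component scores
    print(arc_avg / rtx_avg - 1)         # ~0.18: the Arc wins the average by 18%
    print(1 - arc_pm3d / rtx_pm3d)       # ~0.53: but loses PassMark 3D by 53%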

The story is similar when gaming, where the Arc A750 generally outperforms its rival cards, even in ray tracing, where Intel is the newcomer behind mature leader Nvidia and feisty, determined AMD. In fact, when gaming with ray tracing at 1080p, the Intel Arc A750 comes in a close second behind Nvidia's RTX 3060 8GB, 37fps on average to the 3060's 44fps.

Bump that up to 1440p, however, and the Intel Arc A750 actually does better than the RTX 3060 8GB: 33fps on average to the 3060's 29fps. When running upscaling (Intel XeSS in the Arc's case), the Arc A750 averages about 56fps on max settings with full ray tracing at 1080p, while the RX 6600 can only muster 46fps on average.

Both are much lower than the RTX 3060's 77fps with DLSS, but getting roughly 60fps with full ray tracing and max settings at 1080p is a hell of an accomplishment for the first generation of Intel discrete graphics. The Arc A750 can even run neck and neck with the AMD Radeon RX 6650 XT in upscaled ray tracing performance at 1440p, getting 42fps on average.

If only this performance were consistent across every game, there would be no question that the Intel Arc A750 is the best cheap graphics card on the market. But it is exactly that inconsistency that drags this card down. Some games, like Tiny Tina's Wonderlands, won't even run on the Arc A750, and they really, really should. How many games out there are like Tiny Tina's? It's impossible to say, which is the heartbreaking thing about this card.

I really can't recommend people drop $250 on a graphics card that might not play their favorite games. That is simply not a problem that AMD or Nvidia have. Their performance might be rough for a few days or weeks after a game launches, but the game plays. The same can't be said of the A750, and only you, the buyer, can decide if that is worth the risk.

In the end, the Intel Arc A750 is a journeyman blacksmith's work: showing enormous potential but not of enough quality to merit selling in the shop. Those pieces are how craftspeople learn to become great, and I can very clearly see the greatness that future Arc cards can achieve as Intel continues to work on lingering issues and partners with more game developers.

It's just not there yet. As Intel's drivers improve, a lot of these issues might fade away, and the Intel Arc A750 will grow into the formidable card it seems like it should be. If you're comfortable dropping this kind of cash and taking that chance, you will still find this card does a lot of things great and can serve as a bridge to Intel's next generation of cards, Arc Battlemage, due out in 2024. 

Intel Arc A750 Price & availability

An Intel Arc A750 graphics card on a pink desk mat next to its retail packaging

(Image credit: Future / John Loeffler)
  • How much does it cost? MSRP of $249.99 (about £200 / AU$375)
  • When can you get it? It is available now
  • Where can you get it? It is available in the US, UK, and Australia, but stock may be an issue

The Intel Arc A750 is available now, starting at $249.99 (about £200 / AU$375). There are a limited number of third-party partners who also make the A750, though these tend to sell at or very close to Intel's MSRP from what I've seen.

This puts the Arc A750 on the same level price-wise as the Nvidia RTX 3050, but it definitely offers better performance, making it a better value so long as you're OK with the Arc A750's varying compatibility with some PC games out there.

  • Value: 4 / 5

Intel Arc A750 Specs

An Intel Arc A750 graphics card on a pink desk mat next to its retail packaging

(Image credit: Future / John Loeffler)

Should you buy the Intel Arc A750?

Buy it if...

You're looking for a cheap GPU
At $249.99, this is one of the best cheap GPUs you're going to find.

You want a stylish looking card
This card is very cool looking in a way that Nvidia and AMD reference cards simply aren't.

You want strong ray tracing and upscaling
Not only do Intel's AI cores make XeSS upscaling a serious contender, but the Arc A750's ray tracing performance is also quite strong.

Don't buy it if...

You are concerned about compatibility
While only one game I tested wouldn't work, that's one game too many for many gamers out there.

You're concerned about power consumption
At 225W TGP, this card soaks up way more power than a card in this class reasonably should.

Intel Arc A750: Also consider

If my Intel Arc A750 review has you considering other options, here are two more cards to consider...

How I tested the Intel Arc A750

An Intel Arc A750 graphics card on a pink desk mat next to its retail packaging

(Image credit: Future / John Loeffler)
  • I spent several days with the Intel Arc A750
  • I used the A750 in my personal PC playing games and doing creative work
  • I ran our standard battery of tests on the Arc A750

I spent several days with the Intel Arc A750 to test its gaming and creative performance at 1080p and 1440p. In addition to gaming, I ran our standard suite of GPU tests on it using the same system setup I use for all our graphics card tests.

Besides my extensive computer science education and practical experience, I have been a hardware reviewer for a few years now, and a PC gamer for even longer, so I know how well graphics cards are supposed to perform with a given set of specs.

Read more about how we test

First reviewed May 2023

Lenovo Legion Pro 5i review: a solid gaming laptop
8:00 pm | May 19, 2023


Lenovo Legion Pro 5i: Two-minute review

Unlike the Lenovo Legion Pro 7i that was released earlier this year, the Lenovo Legion Pro 5i is a much more affordable gaming laptop that offers excellent performance for its price point. It comes in two separate types: the Legion Pro 5i outfitted with an Intel CPU and the Legion Pro 5 which comes with an AMD CPU. 

Both versions use an Nvidia 4000-series GPU. The cheapest configuration you can nab without having to customize one yourself is $1,259.99 / £1,410 (including VAT) / AU$2,499, which is far more affordable than most of the best gaming laptops on the market while still commanding respectable specs.

As with many of the other desktop replacements we've seen in 2023, this one comes in standard black, with nothing particularly interesting about its shape and design. The chassis feels decently sturdy, with a nice metal finish on the top of the laptop. Opening it, the keyboard keys are more uniquely shaped, with a roundness to them you normally don't see; it makes typing a bit weird at first until you adjust. The trackpad is pretty solid in terms of sensitivity, and I'm always a fan of mechanical feedback versus haptic feedback.

It has an excellent, well-balanced 16-inch WQXGA (2560x1600) display, with the option to upgrade to HDR and from a 165Hz to a 240Hz refresh rate. For most gamers, the difference doesn't matter, and if you're purchasing this laptop in particular to save money, you can do without the pricier upgrade. I also enjoy the fact that there's a manual switch to turn the webcam off and on, which the vast majority of laptops lack, though I wish it linked up to a physical shutter instead.

There are two downsides to the Legion Pro 5i, however. The first is the audio, specifically how quiet it is. You could mitigate this by simply using a headset, but the fact that the speakers are so much quieter than on every other laptop I've reviewed in 2023 so far is a huge inconvenience. The second issue is the placement of the keyboard and trackpad. Both feel like they're too far to the left, requiring you to adjust to the orientation; until you do, typos and mis-presses are commonplace. This isn't much of an issue if you use a controller or mouse, but for those who use a keyboard for gaming or productivity work, it could be a problem.

I received two review units — the first came with the Intel Core i7-13700HX and the second with an AMD Ryzen 7 7745HX. Both come with an Nvidia GPU, as AMD's current-gen mobile GPUs aren't an option here. The Intel and AMD CPUs are mostly comparable in theory, but in practice, the Ryzen 7 7745HX completely blows the i7-13700HX out of the water in every benchmark.

Despite these differences, the Lenovo Legion Pro 5i’s gaming performance is excellent, handling any of the best PC games easily, including Cyberpunk 2077 at max settings and ray tracing on while maintaining 60fps on average. Other titles like Final Fantasy VII Remake and Marvel’s Spider-Man Remastered also look and run great, with very little slowdown at max settings. The former is able to lock in at 60fps when the option is chosen, while the latter consistently stays above that with the proper frame rate settings.

Lenovo Legion Pro 5i: Price & availability

closeup of keys

(Image credit: Future)
  • Starting at $1,259.99 / £1,410 (including VAT) / AU$2,499
  • Available now 
  • Available in the US, UK, and Australia

The Lenovo Legion Pro 5i sits firmly in the affordable market of gaming laptops — never quite dipping down to budget levels but still a well-rounded choice for those wanting great gaming performance for a solid price. The price is especially impressive for a gaming laptop with a 4000-series GPU and a 13th Gen CPU.

It’s available in the US, UK, and Australia, with a nice range of configurations for each region. The US has the most choices, with several models available for purchase as well as an option to customize your laptop, while the UK and Australia only have the preset models.

It’s difficult to compare to other gaming laptops in the 2023 market, as many of them are meant to either be super expensive desktop replacements or ultra-cheap laptops. The Legion Pro 5i is meant as an affordable option that sits in mid-range pricing. The closest are the Alienware m18 and the Asus ROG Zephyrus M16, which are nearly double the price but feature the best specs in return.

  • Price score: 5 / 5

Lenovo Legion Pro 5i: Specs

closeup of stickers

(Image credit: Future)

The specs for the Lenovo Legion Pro 5i review unit sent to me are as follows: an Intel Core i7-13700HX CPU, an Nvidia GeForce RTX 4060 8GB GDDR6 GPU, 16GB DDR5 RAM, a 512GB SSD, and a 16-inch WQXGA (2560 x 1600) display with 100% sRGB, 300 nits, and 165Hz.

I was also sent a Lenovo Legion Pro 5 for comparison, which features the following specs: an AMD Ryzen 7 7745HX CPU, an Nvidia GeForce RTX 4070 8GB GDDR6 GPU, 16GB DDR5 RAM, a 512GB SSD, and a 16-inch WQXGA (2560 x 1600) display with 100% sRGB, 300 nits, and 165Hz.

Like the Pro 7, the Legion Pro 5 comes in two main types: the Legion Pro 5i outfitted with an Intel CPU, and the Legion Pro 5 with an AMD CPU. The Lenovo Legion Pro 5i comes in several configurations depending on the region, with the main choices being between the Intel Core i5-13500HX and i9-13900HX CPUs, between the Nvidia GeForce RTX 4050 and RTX 4070 GPUs, and between different RAM, storage, and display options.

Only those in the US can configure their laptop based on several specs. Those in the UK and Australia can only choose from the available models, with no customization options.

  • Specs score: 4.5 / 5

a closed black laptop

(Image credit: Future)

Lenovo Legion Pro 5i: Design

  • Plain looks
  • Great port selection
  • Great display, average keyboard and touchpad
  • The sound is very low

Like many affordably priced gaming laptops, the Lenovo Legion Pro 5i is rather plain looking from the outside. Its chassis has a nice metal finish, and its weight is hefty but not unmanageable. The size of the display makes it a little tricky to carry around in bags, but a large enough bag won't struggle with the roughly five-pound weight.

It has an excellent port selection, including four USB Type-A ports, two USB Type-C ports (both with DisplayPort 1.4), one HDMI port, one ethernet port, one headphone/microphone combo jack, and one power connector.

Many of the ports are located in the back but are thankfully labeled, making it a breeze to know which port is which without having to turn the laptop around. Unfortunately, there's no SD card reader, which is a shame, as that's one of the most useful ports for a laptop to have.

closeup of ports

(Image credit: Future)

closeup of keyboard

(Image credit: Future)

The display is a thin-bezel, 16-inch WQXGA (2560 x 1600) beauty, with brightness between 300 and 500 nits, 100% sRGB coverage for creatives and editors (which pairs perfectly with the gaming-level GPU and CPU), a choice of either a 165Hz or 240Hz refresh rate, and a screen that supports HDR.

It would be nice if HDR support were included by default, and if there were a choice of an OLED screen instead, which many other gaming laptops have been offering. Rounding that out is a handy manual switch on the side for the largely average webcam, which is always preferable to a key press, but a physical shutter for the camera would have made things even sweeter.

Though I always appreciate the RGB backlighting of the keyboard and the unique shape of the keys that afford more space to type on, the keyboard and touchpad are positioned in an odd way. They’re a little more to the left than normal, which requires a period of adjustment that can cause mistyping and missed presses on the touchpad in the meantime. This could be an issue for those who heavily rely on both for work and gaming. Otherwise, feedback from the keys and pad is perfectly serviceable and shouldn’t hamper gamers who use a controller and gaming mouse instead.

Ventilation is probably the biggest issue with this laptop, which is strange considering it has more than enough vents. The largest ones are located at the bottom – a standard gaming laptop design, but for some reason, they aren’t quite up to snuff when it comes to encouraging airflow properly. I found myself having to prop up the laptop using the cable itself, giving it just enough wiggle room to cool down. You may have to invest in a cooling pad or prop if you have these issues too.

  • Design score: 4 / 5

a black laptop

(Image credit: Future)

Lenovo Legion Pro 5i: Performance

  • Gaming performance is excellent
  • CPU performance is fine 
  • But underperforms in benchmarks
Lenovo Legion Pro 5i: Benchmarks

Here's how the Lenovo Legion Pro 5i performed in our suite of benchmark tests:

3DMark: Night Raid: 52,244; Fire Strike: 21,729; Time Spy: 8,869; Port Royal: 4,834
GeekBench 5: 1,825 (single-core); 8,126 (multi-core)
Cinebench R23 Multi-core: 10,450 points
Total War: Warhammer III (1080p, Ultra): 80 fps; (1080p, Low): 227 fps
Cyberpunk 2077 (1080p, Ultra): 44 fps; (1080p, Low): 63 fps
Dirt 5 (1080p, Ultra): 40 fps; (1080p, Low): 131 fps
25GB File Copy: 15.0
Handbrake 1.6: 3:37
CrossMark: Overall: 2,017; Productivity: 1,916; Creativity: 2,148; Responsiveness: 1,945
PCMark 10 (Home Test): 6,854 points
Battery Life (TechRadar movie test): 3 hours, 28 minutes

At this point, it's difficult to directly compare the Lenovo Legion Pro 5i to other gaming laptops in 2023, as most of them are running on high-end GPUs like the RTX 4070, 4080, and 4090. However, comparing benchmark scores between those and last-gen laptops gives us a better sense of how well the Legion Pro 5i performs. I've found that the RTX 4060 scores quite high on its own merits, far surpassing the 3000-series and sitting only about 30,000 points behind the two most powerful GPUs.

This, in turn, is well reflected in its general gaming performance and frame rate stability. When maxing out Cyberpunk 2077's settings and turning on both ray tracing and DLSS 3, the laptop was able to maintain a stable 60fps. In Final Fantasy VII Remake, I chose to lock gameplay at 60fps while maxing out the graphics settings, and it ran beautifully with no slowdown to speak of. And Marvel's Spider-Man Remastered also runs incredibly well, staying above 60fps at all times, even during the more intensive web-swinging sections.

Testing out the Intel Core i7-13700HX CPU, however, the results aren't nearly as impressive. Benchmark scores across the board for the 13th-gen Core i7 are far lower than those of laptops with a 13th-gen Core i9. Even worse, many of the scores are comparable to 12th-gen CPUs. I also tested the AMD Ryzen 7 7745HX CPU in my other review unit, and in benchmarks like Geekbench, Cinebench, and PCMark 10, the results were definitively superior to the i7-13700HX.

When testing how this translated to creative and productivity performance, however, I found no slowdown or sluggishness in either model, with responsiveness that never wavered no matter how many tasks were running at once. But if you're looking for the better-performing processor, the AMD version is the way to go. Unfortunately, I wasn't able to determine which version has the better-performing Nvidia 4000-series GPU, since the Legion Pro 5 uses an RTX 4070 instead of the RTX 4060 in the Legion Pro 5i.

  • Performance score: 4 / 5

Lenovo Legion Pro 5i: Battery

closeup of time and battery display

(Image credit: Future)
  • Terrible battery life
  • Charges fast

I regret to inform you that, as with most other desktop replacement gaming laptops, the Lenovo Legion Pro 5i’s battery life is horrible. At most, it lasts about three and a half hours for productivity and creative work, less than half a standard workday and about the same amount of time if you’re streaming videos or movies instead. 

Keeping this baby plugged in at all times is the way to go, especially for intensive gaming sessions that will drain the power even faster. Its saving grace is the fast charge time, which will give you a full battery in about an hour. 

  • Battery score: 2.5 / 5

Should you buy the Lenovo Legion Pro 5i?

Buy it if...

You need great gaming performance
For its pricing, the gaming performance is quite great, maintaining a solid framerate even on the highest settings.

Don't buy it if...

You need a high-volume audio system
For some reason, the audio is extremely low and you'll most likely have to purchase a headset or headphones to get anything adequate.

Lenovo Legion Pro 5i: Also consider

If the Lenovo Legion Pro 5i has you considering other options, here are two more laptops to consider...

How I tested the Lenovo Legion Pro 5i

  • I tested two models of the Lenovo Legion Pro 5 for several weeks
  • I tested it using both benchmark tests and video game benchmarks
  • I stress-tested the battery using the TechRadar movie test

First, I tested the general weight and portability of the Lenovo Legion Pro 5i by carrying it around in a laptop bag for a day. After I set it up, I ran several CPU and GPU benchmarks to thoroughly test out the graphics card's performance and how much it affected processing performance. Finally, I used a variety of programs and applications to test out both battery life and general performance during work-like conditions, as well as gaming benchmarks to test the RTX 4060 GPU.

The Lenovo Legion Pro 5i is an average desktop replacement for gaming, meaning it's meant to be used for hardcore gaming sessions. I made sure to thoroughly test out this laptop in that regard, to make sure it reached certain levels of performance. I also tested out the CPU to see how it fared against the current competition.

I've tested plenty of gaming PCs and laptops, making me more than qualified to understand benchmark test results and how to properly stress test machines to see how well they perform as a work machine.

Read more about how we test

First reviewed May 2023

Dimensity 9200+ brings higher CPU and GPU clocks, promises lower power usage
2:31 pm | May 10, 2023


MediaTek is on a roll and has unveiled its third chipset this month. Today’s item is the Dimensity 9200+, which schooled the Snapdragon 8 Gen 2 in preliminary Geekbench 6 tests. As the name suggests, this is a boosted version of the Dimensity 9200 chipset from last year. It’s still fabbed on TSMC’s N4P node (4nm, second gen) but is able to run its CPU and GPU at higher clock speeds. This includes all three CPU clusters, which promises a 10% uplift over the original version of the chip. The new Dimensity 9200+ chipset at a glance As for the ARM Immortalis G715 GPU, MediaTek...

MediaTek Dimensity 8050 official with 3GHz prime CPU core
12:49 pm | May 9, 2023


MediaTek just announced the Dimensity 8050 - a new chip in name only, as it's basically identical to the Dimensity 1300 and 1200. The Dimensity 8050 is built on TSMC's N6 (6nm) process and comes with a 5G modem. The octa-core processor has four fast Cortex-A78 cores and four efficient Cortex-A55 cores. The Cortex-A78 units are actually divided between one Super Core running at up to 3GHz and three Performance Cores ticking at up to 2.6GHz. The chipset supports up to 16GB of LPDDR4x RAM and UFS 3.1 storage. An Arm Mali-G77 with nine cores is in charge of graphics. The Dimensity 8050...

Intel Core i5-13600K: the best everyday CPU around
1:00 am | May 6, 2023


Intel Core i5-13600K: Two-minute review

The Intel Core i5-13600K follows up one of the top budget chips ever and manages to improve on just about everything across the board, except for the price.

When Intel announced its Raptor Lake processors, a lot of us were a bit dismayed that the price of the Core i5 went up by nearly 15% over the Intel Core i5-12600K that preceded it. That chip was arguably the best processor ever made for budget gaming PCs and those who need good performance without a whole lot of extras at a fair price.

At $329 (about £280 / AU$475), the Intel Core i5-13600K puts itself just outside of the budget class of processors. And that's a shame because otherwise, this is the best processor for the vast majority of people and even for a lot of those who tell themselves that they absolutely must have something more powerful like the Intel Core i7-13700K.

Across the general lineup of performance tests I threw at this chip, it came out on top in nearly every one, beating the competing AMD Ryzen 5 7600X and substantially outperforming the Core i5-12600K. Getting into the nitty-gritty, though, the Ryzen 5 7600X puts up a much better fight against the i5-13600K than I was expecting, fighting it to a rough draw in gaming by the end.

That does mean that if you're looking for a budget gaming CPU, you're probably going to be better off with the Ryzen 5 7600X since you can save a bit of money in the process. But that savings can easily be gobbled up and then some by the extra cost to upgrade to DDR5 RAM, which the i5-13600K still lets you skip in favor of the aging DDR4 RAM that most people still have. So there is definitely a trade-off to be made in either case.

Ultimately though, there's just no denying that the Intel Core i5-13600K has better specs and performance at this price range, give or take a little spare change. So this is a very easy processor to recommend to just about anybody who isn't a gamer or creative professional.

Intel Core i5-13600K: Price & availability

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)
  • MSRP: $329 (about £280 / AU$475)
  • More expensive than competing Ryzen 5 7600X

The Intel Core i5-13600K is on sale now for $329 (about £280 / AU$475). This puts it at about 10% more expensive than the competing AMD Ryzen 5 7600X and about 14% more expensive than the Core i5-12600K.
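
For transparency, here's how those percentages pencil out - a minimal sketch assuming the commonly cited launch MSRPs of $299 for the Ryzen 5 7600X and $289 for the Core i5-12600K (neither figure is quoted in this review):

  # Relative price premium of the i5-13600K at MSRP.
  msrp_13600k = 329
  msrp_7600x = 299    # assumed launch MSRP
  msrp_12600k = 289   # assumed launch MSRP

  print(f"{(msrp_13600k / msrp_7600x - 1) * 100:.1f}% over the 7600X")    # ~10.0%
  print(f"{(msrp_13600k / msrp_12600k - 1) * 100:.1f}% over the 12600K")  # ~13.8%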

Considering that the Intel Core i9-13900K didn't get a price increase over its 12th-gen counterpart, the price hike here is probably the biggest disappointment with this chip. Enthusiast users are used to spending extra money to have the best right out of the gate, so they could have absorbed some of the price inflation rather than letting it fall squarely on the one chip that most people are going to buy.

This is especially bad considering that AMD's competing chip is right there for a good bit less. There are performance considerations here, obviously, and we'll get to those soon. Still, at this level, the performance difference is not so great as to really justify taking the best Intel processor in the budget class and pushing it into the lower mid-range for a few extra bucks.

  • Price score: 3.5 / 5

Intel Core i5-13600K: Chipset & features

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)
  • Overclockable
  • Supports DDR4 and DDR5

The Intel Core i5-13600K is Intel's second-gen big.LITTLE mainstream processor, following up the i5-12600K, and there have been some big improvements on the architecture side.

My test bench specs

These are the systems I used to test AMD and Intel desktop CPU performance in this review:

CPU Cooler: Cougar Poseidon GT 360 AIO
Graphics card: Nvidia GeForce RTX 4090
SSD: Samsung 980 Pro @ 1TB
Power Supply: Corsair AX1000 80-Plus Titanium (1000W)
Case: Praxis Wetbench

Intel motherboard and RAM:
Motherboard: MSI MPG Z690 Carbon Wifi
DDR5 RAM: 32GB Corsair Dominator Platinum @ 5,200MHz & 32GB Kingston Fury Beast @ 5,200MHz

AMD motherboard and RAM:
Motherboard: ASRock X670E Taichi
DDR5 RAM: 32GB Corsair Dominator Platinum @ 5,200MHz & 32GB G.Skill Trident Z5 Neo @ 5,200MHz

While Intel's 13th-gen Raptor Lake chips still use the same 10nm "Intel 7" process as the previous 12th-gen Alder Lake chips, they improve on the previous architecture in a number of key ways.

In addition to more cache memory, clock speeds have been retuned at the high end, so the i5-13600K runs slightly slower at base frequency but boosts slightly higher than the 12600K - though both Intel chips have lower base and boost frequencies than the competing AMD Ryzen 5 7600X.

In terms of core counts, the i5-13600K doubles the efficiency cores of the i5-12600K, for a total of 14 cores and 20 threads to the i5-12600K's 10 cores and 16 threads. This is also substantially more than the Ryzen 5 7600X, which is a straight six-core/12-thread chip with all of its cores being full-power performance cores.
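
If you want to sanity-check that math, here's a minimal sketch (my own illustration, not Intel's spec sheet), assuming hyperthreading on the performance cores only - which is how Intel's hybrid designs work - while AMD's 7600X has SMT on all six cores:

  # Hybrid core/thread arithmetic: P-cores run 2 threads each (hyperthreading),
  # E-cores run 1 thread each.
  def cores_and_threads(p_cores: int, e_cores: int) -> tuple[int, int]:
      cores = p_cores + e_cores
      threads = p_cores * 2 + e_cores
      return cores, threads

  print(cores_and_threads(6, 8))  # i5-13600K -> (14, 20)
  print(cores_and_threads(6, 4))  # i5-12600K -> (10, 16)
  print(cores_and_threads(6, 0))  # Ryzen 5 7600X -> (6, 12); all cores SMT-enabled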

And while the rated 125W TDP for the i5-13600K remains the same as with the 12600K, it pulls substantially more power under load than its predecessor in my tests, so plan your build accordingly.

Finally, like its predecessor, the Core i5-13600K supports PCIe 5.0 along with both DDR4 and DDR5 RAM, so you can either upgrade to new DDR5 RAM or stick with the best RAM of the DDR4 generation, which definitely helps defray the cost of an upgrade.

  • Chipset & features score: 4 / 5

Intel Core i5-13600K: Performance

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)
  • Fantastic all-around performance
  • Decent gaming chip
  • Low performance per watt rating

The Intel Core i5-13600K is the best processor all-around for most people right now, though that does come with a number of caveats.

Generally, the Core i5-13600K outperforms both the Core i5-12600K and Ryzen 5 7600X by a substantial amount, and while the Ryzen 5 7600X holds its own against the i5-13600K, it's a qualified success rather than a straightforward win.

When it comes to synthetic performance, the Intel Core i5-13600K simply overpowers both chips with more cores, faster clocks, and sheer wattage. Overall, the Core i5-13600K performs about 42% better than the Ryzen 5 7600X and about 26% better than the Core i5-12600K.

In creative workloads, the Core i5-13600K is a great option for folks on a budget who want to dabble in some creative work like 3D rendering or photo editing. But with only six performance cores, using the best graphics card possible will be far more determinative in most cases. That said, the Core i5-13600K outperforms the Ryzen 5 7600X by about 21% and the 12600K by about 12%.

In my gaming performance tests, the Ryzen 5 7600X actually scores a technical win here, chalking up an extra 2 fps on average over the 13600K, but this might as well be a wash. The 13600K does manage a very solid improvement over its predecessor, though, getting as much as 34% higher fps in some titles and landing a solid 20% average performance improvement overall.

In the end, the Core i5-13600K outperforms the Ryzen 5 7600X by about 40%, while improving on the Core i5-12600K's performance by about 25%. As far as bottom line results go, this would make this processor a slam dunk, but one thing keeps this chip from true greatness: its power consumption.

While the 13600K has the lowest minimum power draw of the three chips tested with 1.973W (an 18% lower power consumption than the 12600K's minimum of 2.415W), it also maxes out at an astonishing 204.634W, which is about 83% more power to achieve a roughly 40% better performance.

This chip also draws 65% more power than the Core i5-12600K for a roughly 25% better performance. These are hardly signs of efficiency, and it continues the exact wrong trend we saw with Intel Alder Lake. For comparison, the AMD Ryzen 9 7950X has a max power draw of 211.483W, and its 3D V-Cache variant managed an incredibly tight 136.414W in my AMD Ryzen 9 7950X3D review.
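
To put the efficiency complaint in concrete terms, here's a back-of-the-envelope sketch using the figures above. The rival chips' max draws aren't quoted directly, so I've inferred them from the stated percentages - treat them as approximations:

  # Measured max draw for the 13600K; rival draws inferred from the quoted
  # "83% more" and "65% more" power figures (approximations, not measurements).
  max_w_13600k = 204.634
  max_w_7600x = max_w_13600k / 1.83    # ~111.8W, inferred
  max_w_12600k = max_w_13600k / 1.65   # ~124.0W, inferred

  # Relative performance per watt = relative performance / relative power.
  print(f"vs 7600X:  {1.40 / 1.83:.2f}x perf-per-watt")   # ~0.77x, i.e. ~23% worse
  print(f"vs 12600K: {1.25 / 1.65:.2f}x perf-per-watt")   # ~0.76x, i.e. ~24% worse

In other words, at full tilt the 13600K delivers its extra performance at markedly worse efficiency than either comparison chip.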

So yeah, it's not hard to put up the kind of numbers that the Core i5-13600K does when Intel turns the electron firehose to full on its processor. Considering that this is the ideal chip for a budget build, that build will now have to factor in a bigger PSU than it otherwise would, to account for bursts of power demand from a chip "rated" for 125W.

Is this a dealbreaker? Not yet, but if Intel thinks it can keep the top spot just by keeping its foot on the gas while AMD makes real gains in power efficiency within a single generation of processors, it won't end well for Intel in the long run.

  • Performance: 4 / 5

Should you buy the Intel Core i5-13600K?

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)

Buy it if...

Don't buy it if...

Also Consider

If my Intel Core i5-13600K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i5-13600K

  • I spent nearly two weeks testing the Intel Core i5-13600K
  • I ran comparable benchmarks between this chip and rival processors
  • I gamed with this chip for several days

I spent an extensive amount of time testing the Core i5-13600K over the past two weeks, including using the processor in my primary work and gaming machine at home.

In addition to general work tasks and gaming, I used the processor extensively for content creation work in Adobe Photoshop and Adobe Illustrator, as well as 3D modeling in Blender.

I also ran an extensive battery of benchmark tests on this chip and rival CPUs a customer might consider, using as close to identical hardware as possible in order to gather sufficient comparable data to determine how the chips performed in real-life and simulated workloads.

Read more about how we test

First reviewed May 2023

Asus Zenbook 17 Fold OLED: The future is foldable
12:00 am | May 4, 2023

Author: admin | Category: Computers Gadgets | Tags: , | Comments: Off

Asus Zenbook 17 Fold OLED: Two-minute review

‘Foldable laptops’ - what a ridiculous term. All laptops are foldable, surely? They’ve all got a hinge; some are arguably more foldable than the Asus Zenbook 17 Fold OLED, like the 360-degree convertibles found on our best 2-in-1 laptops list. If you try to open this laptop that far, you’ll snap it in two!

But I digress. The Zenbook 17 Fold OLED is a huge technological achievement for Asus - even if it’s one the manufacturer mirrored from Lenovo’s ThinkPad X1 Fold Gen 2. Fortunately, both of these large-scale foldables are a lot better than the original ThinkPad X1 Fold; unfortunately, though, the Zenbook 17 Fold still has some pretty major drawbacks.

Before I get into the meat and potatoes of this review, I’ll provide a quick breakdown of exactly how this weird, awesome laptop works. Fully unfolded, the Zenbook 17 Fold looks and feels like the world’s most luxurious tablet: a huge 17.3-inch touchscreen with a rear kickstand and a separate Bluetooth keyboard. That OLED display is undeniably gorgeous, with excellent maximum brightness and color density.

The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

But of course, colors that pop aren’t all this screen has to offer. Thanks to a sliding rear panel and concealed hinge, you can fold (see what they did with the name? Truly ingenious) the display and snap the keyboard magnetically to the lower half, turning it into a much more compact 12.5-inch ‘conventional’ laptop. Alternatively, you can leave the keyboard off and split the display in two at the hinge, or turn it horizontally for a sort of ‘book’ configuration.

It’s a clever, versatile device, and a great deal of work has clearly gone into making it feel durable. I’ve long been wary of folding displays - though they’re stylish and appealing, I’m dubious about the longevity of the best foldable phones - but this one at least feels very robust. When I first saw this product unveiled at IFA 2022 in Berlin, Asus had set up a big machine to repeatedly fold and unfold the display; it’s reportedly rated for 30,000 cycles, which should be more than enough.

The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

When folded away, you can either clip the incredibly thin keyboard between the two sides of the screen, or fold it shut without the keyboard (though the angle of the hinge when I did this made me a little anxious). However, at almost two kilograms, it’s undeniably pretty thick and heavy when folded up - one of the best ultrabooks, this ain’t.

There are other sacrifices - and advantages - involved in the Zenbook 17 Fold OLED’s novel design, but is it really worthy of being called one of the best laptops purely for its innovation and, well, how boastful you could be if you bought one? I’m not so convinced…

Asus Zenbook 17 Fold OLED review: Price and availability

  • Starts at $3,499 / £3,299 / around AU$5,200
  • Only one configuration available
  • Bluetooth keyboard and carry case included

And here we come to one of the biggest drawbacks of the Asus Zenbook 17 Fold OLED: the insanely high asking price. Sure, this is essentially a unique product barring Lenovo's competing (and similarly expensive) model, but $3,499 / £3,299 / around AU$5,200 is still a sky-high barrier to entry. At least the wireless keyboard and leather carry case don't cost extra.

So no, it won’t be going on our best cheap laptops list - and don’t expect to see any foldable tech on that list anytime soon, frankly. Folding displays of this size are still a fledgling science; I’m sure they’ll become more widely available with time, but we’re definitely a long way off from Westworld-style ultra-foldable tablets.

The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

I wouldn’t even say it’s particularly good value for money, unless you’re in it solely for the wow factor (and don’t get me wrong, this laptop will wow people). For that steep retail price, you’re only getting a 12th-gen Intel Core i7 CPU, with no discrete GPU and only 16GB of RAM.

There is at least 1TB of storage, but you can’t open up the case to upgrade or replace any components yourself without voiding the warranty, since that could damage the sliding hinge mechanism.

  • Price score: 2 / 5

Asus Zenbook 17 Fold OLED review: Specs

There's only one model of the Asus Zenbook 17 Fold OLED available - you're getting the Intel Core i7-1250U processor with Iris Xe integrated graphics, 16GB of LPDDR5 RAM, and a 1TB PCIe 4.0 SSD.

Obviously there are no screen variants here either - Asus wasn't going to mess about with multiple versions of its uber-complex foldable panel. We've got a 5MP front webcam, which is top-mounted in the 12.5-inch layout and side-mounted in the full 17-inch layout.

The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

Asus Zenbook 17 Fold OLED review: Design

  • Bezels are a bit chunky, but that’s understandable
  • Keyboard has good travel considering its thinness
  • IR webcam, but only 2 Thunderbolt 4 ports

I won’t waste more space here waxing lyrical about the Zenbook 17 Fold’s beautiful 4:3 OLED display - so beautiful, clearly, that Asus had to include it in the name of the product. Yes, it’s bright and colorful and generally fantastic. In fact, I’d say it would be incredible for content creators; but there are two significant problems with that.

First, the lack of a dedicated graphics card means that you're relying entirely on the CPU's integrated Iris Xe graphics, which are fine, but won't carry you through intensive workloads like video editing or 3D rendering. Second, and arguably the thing I dislike the most about the Zenbook 17 Fold OLED: there's no stylus support.

No stylus support! I could hardly believe it when I heard this. Surely the huge tablet mode of this Zenbook would make it a perfect fit for digital artists? Apparently not. As Asus informed me, the official line is 'don't use a stylus': while the display does technically support stylus input, there were concerns that using one could damage the foldable panel where it creases in the middle. So if you do happen to wreck the screen with a hard-tipped smartpen, you'll be $3,499 out of pocket.

Speaking of that screen crease - I do have to admit that it’s not too noticeable to the naked eye when the screen is turned on, an impressive feat considering how new this technology is. You can definitely feel a slight ridge when running your finger across the display, but I never found it disruptive while using the laptop in tablet mode.

The bezels surrounding the display are a bit beefier than I'd like for a high-end laptop, but this was a necessary move to fit more hardware behind the folding panel. Notably, this includes a genuinely impressive set of Dolby Atmos speakers with great clarity and high maximum volume - among the best I've heard on any laptop, which surprised me. Also included here is a pretty decent 5MP webcam and an IR camera for facial recognition logins via Windows Hello.

The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

Unfortunately, it doesn’t include much in the way of ports for physical connectivity. All you get is two Thunderbolt 4-enabled USB-C ports and a 3.5mm headphone jack - and you’ll need to use one of those USB-C outlets for charging the laptop too, so be prepared to shell out some extra cash for a Thunderbolt 4 dock. I think even a single USB-A port would’ve been an excellent inclusion here.

Externally, I will say that the Zenbook 17 Fold OLED looks and feels great; the exterior finish is a combination of reflective brushed metal and leatherette padding that wraps around the rear of the hinge and the kickstand, which itself feels very sturdy. There are tiny rubber feet on two edges, providing some grip on flat surfaces regardless of configuration.

Lastly, let’s discuss that snap-on Bluetooth keyboard. It’s got rubber pads on the underside to keep it from sliding around when it’s not magnetically connected to the main chassis of the laptop, and it sits quite firmly atop the lower half of the screen when you’re in 12.5-inch mode.

Despite its incredible thinness (less than 4mm), it actually has very good key travel and the size of the keycaps themselves is good, even if some buttons have been compacted a bit to fit the small form factor. The trackpad is responsive enough, though the click action feels a bit floaty. Overall, I was quite impressed with how good the keyboard felt to use - but since it uses Bluetooth, you’ll basically need to leave that on at all times, which will drain your battery life.

  • Design score: 3.5 / 5
The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

Asus Zenbook 17 Fold OLED review: Performance

  • 12th-gen Intel i7 CPU is strong for everyday tasks…
  • …but falters in gaming and content creation
  • ‘ScreenXpert’ software is useful but fiddly
Benchmarks

Here's how the Asus Zenbook 17 Fold OLED performed in our suite of benchmark tests:

3DMark Night Raid: 12,754; Fire Strike: 3,742; Time Spy: 1,381
GeekBench 5.4: 1,705 (single-core); 7,098 (multi-core)
25GB File Copy: 1,480MBps
Handbrake 1.6: 14m 11s
CrossMark: Overall: 1,449; Productivity: 1,438; Creativity: 1,527; Responsiveness: 1,267
Sid Meier's Civilization VI: Gathering Storm (1920p): 22fps
Web Surfing (Battery Informant): 7hrs 57m
PCMark 10 Battery Life: 9hrs 10m

Well, I’ve certainly seen more impressive benchmark results. Don’t get me wrong; the 12th-gen Intel Core i7 processor powering the Asus Zenbook 17 Fold OLED is perfectly fine - to the point where I’d argue that limiting yourself to the newer 13th-gen Intel CPUs when looking at which laptop to buy is a fool’s errand - but we’ve definitely seen better performance elsewhere.

Synthetic CPU tests in GeekBench and CrossMark turned up perfectly adequate if unspectacular results, proving that at the very least, the Zenbook 17 Fold will be able to handle everyday office tasks. Opening numerous Chrome tabs didn’t cause any noticeable slowdown, and file transfers were speedy thanks to the PCIe 4.0 SSD.

However, it’s badly outclassed by other large laptops, like the 16-inch MacBook Pro (2023) and the similarly-sized Dell XPS 17. The lack of a dedicated GPU causes it plenty of issues in graphical tests; synthetic results using the 3DMark benchmarking suite were underwhelming, and it struggled to play 3D games at all at native resolution - even the generally very lightweight Civilization VI.

The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

I was able to play some 2D games smoothly - Into The Breach and Slay The Spire both ran fine - so if you’re planning to mostly use this device for everyday tasks like editing documents and browsing the internet with a bit of light indie gaming on the side, you should be okay. If you’re looking for something to do high-end video editing or 3D modeling tasks, this simply isn’t it.

I feel like I should also discuss the screen-switching software employed here, dubbed 'ScreenXpert' by Asus. This tool is designed to let you quickly swap between the different modes - whether that's folding the screen into a 'book' configuration or rearranging windows across the two halves of the display. It's useful, but it can be fiddly in practice.

The simple fact is that at this price point, you should be able to expect far superior performance. Needless to say, the Zenbook 17 Fold probably won’t be showing up on our list of the best 17-inch laptops - and if we had a list of the best 12.5-inch models, it wouldn’t be on that one either.

  • Performance score: 3.5 / 5
The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)

Asus Zenbook 17 Fold OLED review: Battery life

  • Adequate but unspectacular battery life
  • Drains fast at maximum brightness with Bluetooth keyboard connected
  • Keyboard battery life is good

I have no strong feelings about the battery life on this laptop. It’s fine; at 50% brightness (which is perfectly sufficient in an averagely-lit space), it lasted for a little over 9 hours in the PCMark 10 battery life test - though that was with the keyboard disconnected and Bluetooth turned off.

Whack the brightness up to full and connect the keyboard, and you’re looking at less than 6 hours - even lower if you’re playing audio out of those booming speakers. At 50% brightness, zero volume, and Bluetooth turned on, it did last for almost 8 hours in our web surfing test, so you should be able to just about squeeze a full workday out of this laptop without needing to charge.

Those tests were conducted with the laptop in full 17-inch mode; I re-ran the PCMark 10 test in 12.5-inch mode hoping for better longevity, but cutting the screen in half only bought me a measly half-hour of extra use.
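
For the curious, the runtime deltas pencil out like this - a quick sketch using the rounded figures above, where the full-brightness number is the 'less than 6 hours' upper bound:

  # Convert the quoted runtimes to minutes and compare scenarios.
  def minutes(hours: int, mins: int = 0) -> int:
      return hours * 60 + mins

  baseline = minutes(9, 10)    # PCMark 10: 50% brightness, keyboard disconnected
  worst_case = minutes(6)      # "less than 6 hours": full brightness + keyboard
  folded_gain = 30             # 12.5-inch mode bought roughly half an hour more

  drop_pct = (1 - worst_case / baseline) * 100
  print(f"Full brightness + keyboard cuts runtime by at least {drop_pct:.0f}%")  # ~35%
  print(f"12.5-inch mode: roughly {(baseline + folded_gain) / 60:.1f} hours")    # ~9.7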

Of course, the wireless keyboard has its own battery, which I found lasted surprisingly well - I only had to charge it twice during my time with the Zenbook 17 Fold OLED. Both the keyboard and the laptop itself charge via the same Thunderbolt 4 adaptor, which is convenient, and the charge time seemed pretty speedy.

  • Battery life: 3.5 / 5

Should you buy the Asus Zenbook 17 Fold OLED?

Buy it if...

You want to show off
Look, this thing is on the cutting edge. Anyone who sees you using it in the office or a cafe is going to be mesmerized; if you’ve got the cash and want to be seen, this is the laptop for you.

You value flexibility
The versatility of the Zenbook 17 Fold OLED is almost unmatched. You want a big tablet? A small laptop? A book with a screen? You’ve got it.

You want to watch stuff
That 17.3-inch bendable OLED panel sure is something; it looks fantastic, and the speakers are great too. Watching videos and movies on this Zenbook in ‘giant tablet’ mode feels like a strange sort of techy decadence.

Don't buy it if...

You’re a digital artist
No official stylus support. Boo! This product would’ve been incredible for creatives if it packed a garaged smartpen. Maybe next time, Asus.

You want super-portability
Sure, it folds up pretty compact and the included leatherette carry case is very nice, but this thing is heavier than I’d like and requires a lot of desk space to use the 17-inch configuration.

You’re not loaded
It’s too expensive. End of story. I know there’s a price to pay to be on the cutting edge of innovation, but this price is a tad too high for what you’re getting here.

Asus Zenbook 17 Fold OLED review: Also consider

If our Asus Zenbook 17 Fold OLED review has you considering other options, here are two more laptops to consider...

How I tested the Asus Zenbook 17 Fold OLED

The Asus Zenbook 17 Fold OLED pictured on a wooden desk.

(Image credit: Future)
  • Used in every possible configuration
  • Played games on it and did my day-to-day work
  • Took it to a friend's house (to show it off!)

I used this bad boy for weeks, doing all sorts of things! It's literally perfect for watching YouTube in bed - I felt strangely opulent sitting propped up by pillows with my huge 17.3-inch tablet, watching Brian David Gilbert's weird-ass cooking videos. I also used it for browsing Readly in 'book' mode, which felt pretty good.

Naturally, I had to use it in 12.5-inch 'laptop' mode too, and I used that for working (as well as writing part of this review). I mostly used the Bluetooth keyboard magnetically snapped onto the lower half, but I was sure to also test it in 'screen-only' mode using the virtual keyboard instead.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.

Read more about how we test

First reviewed May 2023
