The Latitude 9440 2-in-1 from Dell is an outstanding laptop, and it might just be the best business laptop available right now. Everything I would throw at it as a business user, the Latitude 9440 handles with flying colors. The design is beautiful, the performance is quick, and the overall package is excellent.
(Image credit: Collin Probst // Future)
Unboxing and First Impressions
Unboxing the laptop was nothing exciting until I pulled the wrapper off the computer. That is when I first felt the matte finish on the 9440 2-in-1, and let me say, I love it. I am a massive fan of matte black and dark greys, so this finish is a dream for me. While signing in, I noticed the keyboard and the touchpad. The touchpad, first of all, feels enormous; after doing some research, I found out it is, in fact, over 15% larger than the previous model's. Second, the keyboard immediately felt comfortable, which says a lot about a keyboard. It felt natural to type on from the first word I wrote.
(Image credit: Collin Probst // Future)
The last thing I noticed right away was the shortage of ports. If you have fully moved over to USB-C with your devices, or if you plug into one of the best Thunderbolt docks at your desk, you're golden. If not, you'll run into the same problem MacBook Air users have with no legacy ports available, so you'll need to resort to an adapter, dongle, or dock.
(Image credit: Collin Probst // Future)
Design and Build Quality
Specs
*as tested
Dimensions: 12.20 x 8.46 x 0.64in
CPU: 13th-generation Intel Core processors
GPU: Intel Iris Xe Graphics
RAM: Up to 64GB
Display: 14-inch, 16:10
Resolution: 2560 x 1600
Storage: Up to 2TB
Weight: 3.38 lb
The Latitude 9440 2-in-1 from Dell has a screen that measures 14 inches but feels gigantic. This phenomenon is partially because of the high-resolution screen and partially because of the near bezel-less borders.
The touchpad, as mentioned, is significantly larger than on the last generation of Latitude laptops. While not strictly necessary, since this is a 2-in-1 with a full touchscreen, the larger trackpad is greatly appreciated when you need to get things done with the machine in standard laptop mode.
The keyboard above the trackpad is quite comfortable to type on. This keyboard also has the same matte finish the laptop case does while remaining a very easy-to-use keyboard. While writing this review, my fingers don't feel any sense of discomfort or unfamiliarity, which means that the keys are spaced out well.
As mentioned, this laptop is almost entirely made of a matte dark grey material. Around the computer's edges, a band of slightly shiny material helps it pop visually and gives this computer a bit of a fancy look.
(Image credit: Collin Probst // Future)
There are not a ton of ports on this laptop; outside of the three Thunderbolt/USB-C ports, there is only a single headphone jack. I have fully moved over to USB-C and Thunderbolt, so I keep an adapter in my laptop backpack at all times just in case, and my desk setups have Thunderbolt docks. If you don't have a system like that, you should pick up a Thunderbolt dock, unless all your peripherals are USB-C or you have none at all.
The last thing to note about this laptop is that the 16:10 aspect ratio is warmly welcomed. I love having the extra vertical screen real estate, which boosts productivity in business workloads quite a bit.
(Image credit: Collin Probst // Future)
In Use
Using this laptop for the last few weeks has been fantastic. I love this laptop. It hits all my marks in what I would want in a professional business laptop, and it looks good while doing it. Dell's Latitude line has been high-ranking on our lists for quite a while, and with good reason.
The 14-inch screen, as mentioned, feels massive. I can fit plenty of reference documents, websites, and productivity tool windows on it without ever wishing for more space while on the go.
(Image credit: Collin Probst // Future)
Whenever I grab this laptop, I love feeling the matte texture on my fingers. It's soft yet rugged while feeling premium. It's hard to describe in words, but it's incredible. I've already mentioned that the keyboard and touchpad are both excellent. The touchpad has integrated collaboration features which sadly only work for Zoom. However, when I have been able to use them, having soft buttons pop out of a touchpad feels like something out of a movie.
(Image credit: Collin Probst // Future)
One more remarkably impressive part of this laptop is that it can actively be connected to two networks at once and switch between them as needed to keep the strongest and fastest connection. This feature is impressive, especially for power business users who take vital calls and can't risk losing connection. The way the business world is going, dropping a call is as good as losing a sale, contract, or business sometimes. So, being constantly connected to two networks with one as an always-ready, redundant network is incredible.
(Image credit: Collin Probst // Future)
Final Verdict
All in all, this laptop is nearly perfect. If the price were lower, it would be perfect. However, some elements make it worth the cost. Regardless, this is an astounding laptop with great features, high build quality, and one of my favorite finishes in a computer to date. That's why I will happily still give this laptop a near-perfect rating.
Many companies claim to have built the best rugged tablet, only for the devices to feel and look flimsy and fail to hold up over time. That’s not the case with the Durabook R11 Rugged tablet. From the first time we picked up the R11, we could tell that Durabook had put a lot of thought into designing a tablet to withstand the rigors of various unforgiving environments. The tablet is encased in a hard plastic shell surrounded by thick rubber bumpers that are easily replaceable by removing a few screws. Durabook has also designed protective covers for the port slots around the device; the covers close with satisfying, reassuring clicks, so you know they won’t accidentally come loose. The R11 received MIL-STD 810H, MIL-STD 461F, and IP66 certifications, and Durabook claims it can survive drops of up to 4 ft. The tablet also comes with a 3-year accidental damage warranty, so you can have peace of mind that should the worst happen, you can get your device repaired.
The 1920 x 1080, 10-point glossy touchscreen is beautiful to use and protected on all edges of the display, and the corners have additional bumpers for extra protection. The screen is designed to be used with fingers, gloves, and styluses, and modes can be toggled quickly through programmable buttons along the display bezel. We found the screen bright enough to be usable both indoors and outdoors, and there is an optional 1,000-nit DynaVue sunlight-readable display with a capacitive multi-touch screen for when you’ll use the tablet in very bright environments.
We appreciated the modular approach to the R11, which helps keep costs down, though adding accessories and upgrading components starts to push this tablet into full-fledged laptop price territory. The R11 is equipped with WiFi 6E, Bluetooth 5.3, and a Thunderbolt 4 port, making it one of the best-connected tablets we’ve seen. Expansion modules let you equip the device with a rear-facing camera with flash, an RFID (NFC) reader, a smart card reader, GPS, and 4G LTE capabilities to use the device anywhere. Durabook also offers a vast range of cases and mounts to use the device at your desk, mounted in a vehicle, or even in the field on stands. We missed having a hardware keyboard, though you can purchase a detachable one for the tablet.
Inside the R11, you’ll find why Durabook went to such great lengths to protect this tablet. It comes outfitted with Windows 11 running on a 12th Gen Intel Core i7-1255U CPU with vPro, 16 GB of RAM, and an Intel Iris Xe graphics card. We found the R11 capable of handling various office applications with blazing speed. On our PCMark 10 benchmark, the tablet received an overall score of 5325 which puts it squarely between what you’ll find in a well-equipped office laptop and a gaming PC.
(Image credit: Bryce Hyland // Future)
Design
One of the first things you’ll notice when picking up the Durabook R11 Rugged tablet is the solid heft of the 2.65lb form factor. It’s not extremely heavy but provides a sense of security that it’s well protected from drops, dust, water, and day-to-day abuse of on-the-job use. Its chassis is molded, hardened plastic with machine screws holding different body parts together. While we didn’t get to test any of the modular add-on accessories, there are plenty to choose from that will make this tablet a top choice for those needing to scan data, document progress, and connect to the office network whether in a manufacturing plant, warehouse, construction site, or back in the office.
Specs
*Specs as tested
CPU: 12th Gen Intel Core i7-1255U
Graphics: Intel Iris Xe
RAM: 16GB 4800MHz DDR5
Screen: 11.6” FHD (1920 x 1080), user-selectable touch mode for Finger/Water, Glove, or Stylus
Storage: 256GB NVMe PCIe SSD
Dimensions: 11.73in x 7.56in x 0.79in
Weight: 2.65lbs
The sleek black casing with molded rubber bumper provides a secure gripping surface for carrying the tablet around without fear of slipping out of your hands. For those needing to move it around all day, Durabook offers multiple shoulder strap options and a hand strap mounted to the back of the R11. The R11 can also be mounted using any custom cases and mounts Durabook offers to fit your specific use case.
Along the front of the display is an integrated 2 MP front-facing camera with multiple touch buttons that can be defined and programmed for various tasks such as changing the screen touch mode, operating the camera, and other shortcuts. We were impressed with the many ports we found under a locking cover, including micro HDMI, nano SIM, Thunderbolt 4, MicroSD, USB 3.2 Gen 2, and a mic/headset combo jack. Along the left are easy-to-operate power and volume buttons and a Kensington lock slot to secure the device. The back of the tablet has many different attachment points for swappable accessories. Additionally, our demo model featured the optional 8.0 MP auto-focus rear camera with flash. While neither camera takes the kind of photos we’d expect from a smartphone camera, both are up to reading barcodes, taking onsite pictures for documentation purposes, and making the occasional video conference call.
(Image credit: Bryce Hyland // Future)
For power, the rugged tablet comes outfitted with a smaller 3,950mAh battery, which we found underpowered for keeping this tablet running through a full 8-hour work day. However, Durabook offers a longer-life 6,900mAh battery, or you can keep multiple batteries on hand with the convenient charging dock accessory and easily swap them out as needed. Additionally, if you can access mains power, you can use the AC adapter to recharge the battery and power the device.
(Image credit: Bryce Hyland // Future)
Features
At its core, the Durabook R11 Rugged tablet is designed to be assembled on a case-by-case basis specific to your work environment requirements. Depending on the data you need to interact with, you can add reader modules for LF/HF-RFID (NFC), smart cards, and a barcode camera with flash. If upgrading to the DynaVue display, you can also enable a Night vision mode that we could not test. With Intel vPro technology, TPM 2.0 security, WiFi 6E, an intelligent selection of ports, and a multi-touch screen, the R11 is ready to tackle almost any job in any environment while keeping data and the tablet safe and sound. Users can customize programmable touchscreen functions for Finger/Water, Glove, or Stylus and make a selection from a wide range of attachments beyond the data readers listed above (GPS & 4G LTE connectivity, detachable keyboard, cradles, as well as hand and shoulder straps).
(Image credit: Bryce Hyland // Future)
Usability
We found the Durabook R11 easy to get up and running, with a highly responsive touchscreen and the familiarity of Windows 11. It’s sleek, high-powered, and quickly runs most applications we tested, like those found in a typical office setting. While it’s not designed for high-end graphics or modeling work, it does provide a snappy experience everywhere else. With modular accessories for accessing multiple data types via RFID, smart cards, barcodes, and more, the R11 can be customized for various industry needs. Our test model came with the tablet, stylus, and hand strap. The hand strap provides a secure fit and a storage sleeve for the stylus.
Once we discovered multiple buttons along the face of the tablet screen, we took our productivity to the next level by quickly changing settings and launching shortcuts. However, without a physical keyboard, we found our productivity waning when typing in web addresses, making notes, or composing messages. We’d recommend looking into the Durabook detachable keyboard or Bluetooth keyboard solution if you’re doing more than tapping on a screen or reviewing documents.
As we reviewed this tablet, we found the hardware and software choices Durabook made make it a breeze to pick up and start working with right away. With the ability to add features such as data scanning, mounting, and carrying options, the Durabook R11 is sure to find a home in a wide variety of environments.
(Image credit: Bryce Hyland // Future)
Performance
This Windows 11 tablet comes kitted with a 12th Gen Intel Core i7-1255U processor (12MB cache, up to 4.7GHz, 2P+8E cores), Intel Iris Xe graphics, and 16GB of 4800MHz DDR5 RAM. Opening applications feels snappy, and you can be up and running quickly. Speed rules when scanning and working with large quantities of data, and with the ability to upgrade to 32GB of 4800MHz DDR5 RAM and a 1TB NVMe PCIe SSD, this machine boasts specs usually found only in full-fledged business laptops.
When we ran the PCMark 10 benchmarking application for Modern Office, the R11 scored 5325, which put it above your average business laptop and below a decently kitted gaming PC. Looking at the PCMark 10 scores in more detail revealed the R11 received an Essentials score of 10425, a Productivity score of 7283, and a Digital Content Creation score of 5400. We also ran the PCMark 10 Modern Office battery test and found that our demo unit's standard 3950mAh battery, worn to 93% of its rated capacity, clocked in at just under 6.5 hours, well under the 8.5 hours listed on the Durabook page.
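To put that shortfall in perspective, here is a rough back-of-the-envelope check, sketched in Python. It assumes "93% wear capacity" means the pack holds 93% of its rated charge, which is our reading of the figure, not something Durabook specifies:

```python
# Rough check: how much of the battery-life shortfall is explained
# by wear on our demo unit's pack?
# Assumption: the pack holds 93% of its rated 3,950mAh charge.

rated_mah = 3950
wear_factor = 0.93
measured_hours = 6.5   # our PCMark 10 Modern Office result
claimed_hours = 8.5    # Durabook's listed figure

usable_mah = rated_mah * wear_factor
# Scale the measured runtime up to what a healthy pack would manage.
fresh_pack_estimate = measured_hours / wear_factor

print(round(usable_mah))               # 3674 mAh actually available
print(round(fresh_pack_estimate, 1))   # 7.0 hours, still short of 8.5
```

Even correcting for wear, a fresh battery would land around seven hours, which is why we lean toward recommending the 6,900mAh upgrade for anyone who needs a full shift between swaps.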
Durabook has put together a rugged powerhouse that will help keep business operations running smoothly and efficiently. However, we’d recommend upgrading to the larger capacity batteries if you need this to stay powered for more than 6 hours between battery swaps.
Value
The Durabook R11 isn’t designed like a regular tablet for the home or a typical office environment. As such, not everyone will want to invest in a device that would be overkill in ordinary business spaces. However, if you need a device that can withstand dust, dirt, water, and drops onto hard surfaces while keeping operations running smoothly, consider the R11.
With swappable, rechargeable batteries, the tablet can run through a work shift, with a fresh battery swapped in over a break or before a shift change. Rather than building a device with everything already included, Durabook took the modular approach to keep costs down, offering accessories that affix to the tablet for specific use cases. Additionally, IT and systems managers will appreciate the vPro and Windows 11 configuration to keep data safely managed and protected.
In all, the Durabook R11 rugged tablet is a beast made to endure anything thrown at it and looks to be a sound investment for those needing to protect their tablet and their data.
The AMD Radeon RX 7600 is probably the best 1080p graphics card you can buy right now, and in all honesty, it should be the last of its kind.
Team Red has been a bit gun-shy of late with its graphics card offerings, with the last graphics card we saw being the AMD Radeon RX 7900 XT. While that was a great card, it launched almost half a year ago, and we haven't heard much from AMD since.
Meanwhile, its rival has released a steady stream of cards, and at this rate, it's almost through its main GPU stack at this point, so it's interesting that AMD chose to release a very budget-friendly midrange card rather than go down the list of higher-end offerings the way Nvidia has.
In a way, it's a very smart strategy (and one I actually recommended back in February), and with the Radeon RX 7600 going on sale on May 25, 2023, for just $269 (about £215/AU$405), AMD manages to make it to market with its all-important midrange offering at least a full month ahead of Nvidia's competing RTX 4060 while also managing to undercut its rival on price.
In terms of performance, the RX 7600 is a major improvement over the AMD Radeon RX 6600 it replaces, while also generally outperforming the competing Intel Arc A750. It does fall short of the RTX 3060 overall, but not by much, and a lot of that gap comes down to ray tracing performance, which isn't great on either card to begin with, so the advantage looks bigger than it really is in practice.
If there is one knock against the RX 7600, it's the power draw: at 165W TGP, it pulls more than the 8GB Nvidia GeForce RTX 4060 Ti and about 33W more than the RX 6600, so this is definitely the wrong direction for AMD to be going in, power-wise.
AMD also has to step up its game when it comes to FSR. Nvidia's most recent launch, the RTX 4060 Ti, was a fairly disappointing card in its baseline performance, but there was no denying that DLSS 3, especially with Frame Generation, is a huge value-add for Team Green. DLSS 3 is only available in about 50 games, while FSR 2 features in roughly 120, but DLSS 2.0 is available in more than 200 games, so AMD has some catching up to do.
When it finally does, the RX 7600 will be an even better buy for midrange gamers, and while it's a sad state of affairs that $269 is about as "budget" as we can hope to see for a while, it's a substantially better value than just about any card on the market right now.
That might change when the RTX 4060 lands, but given that the baseline performance of the RTX 4060 is expected to be about 20% better than that of the RTX 3060, I expect it will fall in pretty close to where the RX 7600 currently is, only with a higher MSRP and no Founders Edition to keep third-party partners honest on price.
So unless the RTX 4060 pulls a rabbit out of a hat, I still expect the AMD Radeon RX 7600 to hold the edge over its rival on value, which at this price point is the only thing that really matters. As it stands, it is the best cheap graphics card you can buy right now, and I expect that will remain the case for the rest of this generation.
AMD Radeon RX 7600: Price & availability
(Image credit: Future / John Loeffler)
How much is it? MSRP listed at $269 (about £215/AU$405)
When is it out? It goes on sale May 25, 2023
Where can you get it? You can buy it in the US, UK, and Australia
The AMD Radeon RX 7600 goes on sale on May 25, 2023, with an MSRP of $269 (about £215/AU$405), making it the cheapest card of this generation to launch. Not only that, it's a substantial price drop from the Radeon RX 6600, which launched at $329 (about £265/AU$495), so you're getting a much better graphics card for almost 20% less. This is more like it!
Ostensibly, the rival to the RX 7600 is the RTX 4060, but since that card has yet to launch, we can only really compare it to the last-gen midrange offerings from Nvidia and Intel.
The Nvidia RTX 4060 when it launches will sell for $299 (about £240/AU$450), which is 9% cheaper than the RTX 3060's official MSRP of $329. The RX 7600 has a cheaper MSRP than either of those, but I expect that the RTX 3060 especially will see some heavy discounting as a result of both the RTX 4060 and the RX 7600, so the value proposition of the RX 7600 might shift depending on what SKU you're looking at.
The RX 7600 does come in slightly more expensive than the Intel Arc A750, and while you might do a double-take at the mention of Intel, the Arc A750 can give the RX 7600 a run for its money at times, so you definitely can't write it off completely.
AMD Radeon RX 7600: Features and chipset
(Image credit: Future / John Loeffler)
More ray tracing cores and new AI cores
Higher TGP
With the move to RDNA 3, the AMD Radeon RX 7600 starts off on a 6nm TSMC process over the RX 6600's 7nm, which gives the RX 7600 a roughly 20% jump in the number of transistors it has to work with (13.3 billion to 11.1 billion). And even though the actual GPU die on the RX 7600 is about 14% smaller than that of the RX 6600, it manages to pack in four additional compute units for a total of 32 compared to the RX 6600's 28.
This is also a more mature architecture, so the 2,048 stream processors (a roughly 14% increase over the RX 6600), are more performant, and the second-generation ray accelerators are a huge improvement over the first-gen RAs in the RX 6600.
The RX 7600 also has faster clocks than the RX 6600, with a boost clock improvement of about 6%, but the big improvement comes with the memory clock speed, which is 2,250MHz for the RX 7600 and 1,750MHz for the RX 6600. This means a nearly 30% boost to memory speed, so even though the RX 7600 is still rocking the same 8GB GDDR6 VRAM on a 128-bit bus as the RX 6600, it has an 18 Gbps effective memory speed compared to 14 Gbps for the RX 6600.
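Those figures check out if you run the numbers. Here is a quick sketch in Python, assuming GDDR6's standard eight transfers per pin per memory-clock cycle, which is how the quoted clocks map to the effective speeds:

```python
# Sanity-check the quoted memory numbers: GDDR6 moves 8 bits per pin
# per memory-clock cycle, so effective per-pin speed = clock x 8.

def effective_gbps(memory_clock_mhz):
    """Effective per-pin data rate in Gbps."""
    return memory_clock_mhz * 8 / 1000

def bandwidth_gb_s(rate_gbps, bus_width_bits):
    """Total memory bandwidth in GB/s for a given bus width."""
    return rate_gbps * bus_width_bits / 8

rx7600 = effective_gbps(2250)   # 2,250MHz memory clock
rx6600 = effective_gbps(1750)   # 1,750MHz memory clock

print(rx7600, rx6600)                # 18.0 14.0 (Gbps, as quoted)
print(f"{rx7600 / rx6600 - 1:.0%}")  # 29% -- the "nearly 30%" boost
print(bandwidth_gb_s(rx7600, 128))   # 288.0 GB/s on the 128-bit bus
```

That 288GB/s of total bandwidth is what lets the RX 7600 get away with keeping the same 8GB/128-bit memory configuration as its predecessor.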
There is also the addition of 64 AI accelerators for the RX 7600, which the RX 6600 simply didn't have. This means that things like Radeon Super Resolution (RSR) will run better than it did on the RX 6600, and it will enable advanced AI workloads like generative AI content creation.
All this does come at the cost of power, though, as the RX 7600 has a 25% higher TGP than the RX 6600. That isn't good, and given that Nvidia's cards are typically getting better performance with less power gen-on-gen, it's definitely the wrong direction for AMD to be going in. The card is still "reasonable" when it comes to your PSU, however: AMD recommends a 550W PSU for the RX 7600 at a minimum, which still keeps the overall system requirement under 600W.
AMD Radeon RX 7600: design
(Image credit: Future / John Loeffler)
The AMD reference card for the Radeon RX 7600 is a compact dual-fan number that will fit in just about any case. This is a dual-slot card, but it's just over eight inches long and a little over four inches tall, so it's great for mini-tower builds, and with just a single 8-pin power connector, you won't have any issues with cable management here.
In terms of outputs, we get three DisplayPort 2.1 ports and a single HDMI 2.1a port, though no USB-C output. Honestly, having DisplayPort 2.1 is nice but largely unnecessary: with just 8GB of VRAM, there is no universe where this card can output 8K video that doesn't default to a slow sequence of still images, so it's a nice-to-have that you are almost guaranteed to never use. Far be it from me to be a buzzkill, though, so if you want to push this card at 8K, do let me know how that turns out.
As for the lack of USB-C, this really isn't a creative card, so this isn't something that you should worry about unless you have one of the best USB-C monitors and nothing else. Even then, I recommend looking further up the stack (like the AMD Radeon RX 7900 XT), since USB-C monitors are almost universally for creative pros and this card isn't going to cut it for the kind of work you'll need to do with it.
In terms of its actual aesthetics, like the two RDNA 3 cards before it, the RX 7600 eschews any RGB and features a matte black design with some subtle accent touches like the red stripes on the fins of the heat sink which would be visible in a case. Overall, it's a cool-looking card, especially for those not looking to have excessive RGB lighting up everything in their case.
AMD Radeon RX 7600: Performance
(Image credit: Future / John Loeffler)
Best-in-class 1080p rasterization performance
Much improved ray tracing performance
Can manage some decent 1440p performance, especially without ray tracing
Given the missteps Nvidia has been making lately, AMD has a real shot of taking some market share if it can offer compelling performance for gamers. Fortunately for Team Red, the AMD Radeon RX 7600 manages to pull off quite a coup when it comes to gaming performance.
Test system specs
This is the system we used to test the AMD Radeon RX 7600:
For the most part, the RTX 4060 will be the RX 7600's main competition, but with the Nvidia RTX 4060 Ti just released, it's the natural comparison at the moment. Is that entirely fair? No, and the RX 7600 does lose out to the RTX 4060 Ti on just about every measure, but it really doesn't lose that badly.
In rasterized workloads at 1080p, the RX 7600 is only about 12% slower than the RTX 4060 Ti, and only about 13% slower at 1440p. This changes drastically as soon as you start factoring in ray tracing and upscaling, but it's something I definitely wasn't expecting. Against the RTX 3060 Ti, the RX 7600 fares better, obviously, and generally it outperforms the RTX 3060 in rasterization workloads.
In terms of its predecessor, the RX 7600 is the kind of gen-on-gen improvement I was really expecting to see from the RTX 4060 Ti and didn't get. The RX 7600's rasterization performance is great, but its improved ray accelerators really outshine what the RX 6600 is capable of, and really makes ray tracing at this price point accessible to the midrange.
Synthetic Benchmarks
In synthetic benchmarks, the RX 7600 roundly beats its predecessor, as well as the RTX 3060. Against the card it's replacing, the RX 7600 outperforms the RX 6600 by about 19%, while the RX 7600 beats the RTX 3060 by about 18% overall.
Digging into the results a bit further though, we can see some of the biggest gains come in ray-traced workloads like Port Royal, where the RX 7600 saw a 33% improvement over the previous gen.
The only benchmark where the RX 7600 comes up a bit short is in the Speedway benchmark, which is a 1440p, ray tracing benchmark. Here, the RTX 3060 just barely edges out the RX 7600 by just 219 points, which is close enough to be a bit of a wash.
Gaming Benchmarks
As you can see, when it comes to general rasterization performance at 1080p, the RX 7600 is the hands-down winner, only falling to the RTX 3060 in Counter-Strike: Global Offensive, and then only by the barest of margins. Everywhere else, you can expect roughly 15-20% better performance out of the RX 7600 overall.
Things take a bit of a turn when it comes to ray tracing performance, but the results here are a bit deceptive for a couple of reasons. First, Cyberpunk 2077 is Nvidia's major showcase game, and that game is very well optimized for Nvidia cards, so the ray tracing performance for the RTX 3060 is substantially better than for either AMD card. However, take Cyberpunk 2077 out of the mix, and the RX 7600 actually outperforms the RTX 3060 in ray tracing performance.
It's not all good for AMD though, since the minimum fps for the RX 7600 in both Returnal and Cyberpunk 2077 is in the single digits, and it's not just for a brief moment, but fairly regular dips into slideshow territory, especially around volumetric fog with applied lighting effects.
It's a similar story when you apply upscaling to either Cyberpunk 2077 or Returnal, where the RTX 3060's DLSS 2.0 is simply better optimized for the former, and the AMD RX 7600 struggles on the minimum fps on the latter, so even though the average fps on Returnal looks like it's north of 60 fps, you'll dip as low as 6 fps on the Quality FSR preset or 15 fps on the Ultra Performance preset, and trust me, it's noticeable.
Of course, turn ray tracing off and you probably won't have this issue, but that will be a series of settings compromises you will have to decide for yourself. Overall though, the AMD Radeon RX 7600 manages to perform well above where you would expect from this generation at this price point. If you're looking for an outstanding and reasonably cheap 1080p graphics card, you can't go wrong with this one.
Should you buy the AMD Radeon RX 7600?
(Image credit: Future / John Loeffler)
Buy it if…
Don’t buy it if…
Also consider
(Image credit: Future / John Loeffler)
If my AMD Radeon RX 7600 review has you considering other options, here are two other graphics cards to consider.
How we test graphics cards
I spent several days with the RX 7600, running benchmarks, playing games, and generally measuring its performance against competing cards.
I paid special attention to its 1080p performance, since this is the main target audience for this card, while also stretching into 1440p gaming as well.
Having covered and tested many graphics cards in my career, I know how a graphics card should perform at this level and what you should be spending for this level of performance.
The Nvidia GeForce RTX 4060 Ti is without a doubt one of the most anticipated graphics card launches of this generation, and now that it's here, it should be an easy win for Nvidia over archrival AMD. I wish that were the case.
That's not to say that the RTX 4060 Ti doesn't hold up well against AMD's midrange offerings at the moment, it absolutely does, and there's no question that the features this card brings to the table are impressive, especially DLSS 3, which is the first time a properly midrange GPU (under $500/£500/AU$750) is seeing this feature.
It goes without saying that Nvidia is leaning into DLSS 3 as its biggest selling point, and as I'll get into later, it definitely delivers significantly better performance than the RTX 4060 Ti should be capable of given its various specs — even factoring in the expanded cache which widens up the memory bandwidth of the card despite still having just 8GB GDDR6 VRAM to work with.
The decision to go with 8GB VRAM for this card — a 16GB VRAM variant is going to be released in July 2023 for an MSRP of $499 (about £400/AU$750) — is probably the only thing that kept the price on this card under $400. With an MSRP of $399 (about £320/AU$600), the Nvidia Founders Edition RTX 4060 Ti 8GB is the same price as the Nvidia GeForce RTX 3060 Ti it is replacing, and generally, it offers a pretty good value for that money, with some caveats.
In terms of native, non-DLSS performance, there isn't a whole lot of improvement over the previous generation, which is definitely going to disappoint some, if not many. Given the kinds of performance advances we've seen with higher-end cards, we were hoping to see that extend down into the heart of the midrange, but it seems those benefits generally stop at the Nvidia GeForce RTX 4070.
Instead, you have a card that relies very heavily on DLSS to carry its performance over the line, and where it works, it is generally phenomenal, offering a real, playable 1440p gaming experience and even brushing up against some decent 4K performance with the right settings.
This is something AMD has really struggled to match with its FSR, so Nvidia has a real chance to score a major blow against AMD. But as we'll see, the best AMD graphics card in the last generation's midrange, the AMD Radeon RX 6750 XT, actually outperforms the RTX 4060 Ti in non-ray-tracing workloads, including gaming, so this does not bode well for Nvidia once AMD releases its current-gen midrange cards.
This is somewhat exacerbated by the fact that the RTX 4060 Ti's ability to use its new features is fairly limited, and while features like DLSS 3 with Frame Generation are available on the best PC games like Cyberpunk 2077 and Returnal, as of the launch of the RTX 4060 Ti, there are only 50-ish games that support DLSS 3.
This list will surely grow over time, but you certainly won't get this kind of support on games that may be just recent enough to push the RTX 4060 Ti in terms of its performance, while being just old enough that you'll never see a DLSS 3 patch for it.
I can say that if you're coming from an RTX 2060 Super or older, then this card is absolutely going to blow your mind. It's effectively the RTX 3060 Ti's NG+, so if you missed what I consider to be the best graphics card of the last generation, you'll get all that and more with the RTX 4060 Ti. If you're coming from an Nvidia Ampere card though, especially anything above the RTX 3060 Ti, chances are you are going to find this is really a downgrade with some neat features to soften the blow.
Nvidia GeForce RTX 4060 Ti: Price & availability
(Image credit: Future / John Loeffler)
How much is it? MSRP listed at $399 (about £320, AU$600)
When is it out? It is available starting May 24, 2023
Where can you get it? You can buy it in the US, UK, and Australia
The Nvidia GeForce RTX 4060 Ti 8GB is available starting May 24, 2023, with an MSRP of $399 (about £320, AU$600). This is the same launch price as the Nvidia RTX 3060 Ti that this card is replacing, so we're glad to see that Nvidia didn't increase the price of this variant with this generation.
This also puts it on par with the AMD Radeon RX 6750 XT, which initially launched for $549 (about £440/AU$825), but which you can find under $400 right now, even without discounts, at major retailers. AMD hasn't released an RX 7700 XT yet, which would be this card's more natural competition, so comparing the RX 6750 XT and the RTX 4060 Ti isn't really fair, but it's all we have for now until AMD launches its RTX 4060 Ti challenger.
Nvidia GeForce RTX 4060 Ti: Features and chipset
(Image credit: Future / John Loeffler)
Only uses one 8-pin...but still requires a 16-pin converter?
3rd-generation ray tracing and 4th-generation tensor cores
288.0 GB/s memory bandwidth, but 32MB L2 cache boosts effective bandwidth to 554.0 GB/s (according to Nvidia)
The Nvidia RTX 4060 Ti is the first properly midrange Nvidia Lovelace graphics card, and so it is built on TSMC's 5nm process, with about 22.9 billion transistors across 34 streaming multiprocessors (SM), which come with 128 shader cores (CUDA), 4 fourth-generation tensor cores, and 1 third-generation ray tracing core per SM.
The clock speed is a solid 2,310MHz base clock, which is about a 64% improvement over the RTX 3060 Ti's 1,410MHz, with a boost clock of 2,535MHz, or about 52% faster than the RTX 3060 Ti's 1,665MHz.
The biggest difference between the two cards is the memory bus. The Nvidia RTX 4060 Ti uses a 128-bit bus for 8GB GDDR6 VRAM, while the RTX 3060 Ti uses a 256-bit bus for the same amount of VRAM. The RTX 4060 Ti has a faster memory clock (2,250MHz), giving it a slightly faster effective memory speed of 18 Gbps to the RTX 3060 Ti's 15 Gbps, and its expanded 32MB L2 cache gives it a much faster effective memory bandwidth despite the narrower bus.
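For readers who want to see where the raw bandwidth figure in the specs above comes from, here's a quick back-of-the-envelope sketch. The helper function is my own illustration, not anything from Nvidia's tooling: raw bandwidth is just the bus width in bytes multiplied by the effective data rate.

```python
# Back-of-the-envelope check of the raw memory bandwidth figure quoted in
# this review. The function name is illustrative, not from any real tool.
def raw_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical raw memory bandwidth in GB/s: bytes per transfer x data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

# RTX 4060 Ti: 128-bit bus at an effective 18 Gbps
print(raw_bandwidth_gbps(128, 18.0))  # 288.0 GB/s, matching the spec above
```

Note that Nvidia's quoted 554.0 GB/s "effective" figure can't be derived this way; it's Nvidia's own estimate of what the 32MB L2 cache is worth in practice.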
Still, this really does smack of over-engineering. The move to a 128-bit bus doesn't seem necessary, and given what we've seen of other Lovelace cards, like the Nvidia GeForce RTX 4070, I definitely think Nvidia could have stuck with a higher bus width and it wouldn't be catching nearly the grief it is getting over this.
What's more, even though the performance of the RTX 4060 Ti is better than the RTX 3060 Ti's, I really think that had this card kept the same bus width as the RTX 3060 Ti, it would absolutely demolish anything that approached it in the midrange. As it stands, the RTX 4060 Ti is great, but it fails to score the knockout it really needed.
It is worth mentioning though that this card also uses significantly less power than the RTX 3060 Ti. That card had a TGP of 200W, while the RTX 4060 Ti 8GB comes in at 160W, a 20% reduction in power draw. This is great for keeping your build under 600W, and it's a move in the right direction for everyone and deserves to be praised.
Nvidia GeForce RTX 4060 Ti: Design
(Image credit: Future / John Loeffler)
The Nvidia RTX 4060 Ti Founders Edition keeps the same black heatsink with chrome trim as the other Founders Edition cards this generation, and — unfortunately — it also sticks with the 12VHPWR 16-pin power connector. Fortunately, you only need to plug a single 8-pin into it, so it is at least somewhat easier to manage in a case.
Also easier to manage is the size of the card. Using the same dual-fan design as previous Founders Edition cards, the RTX 4060 Ti shrinks things down a bit. While it's still a dual-slot card, it comes in at just under 9.5 inches long and 4.5 inches tall, making it the kind of card that will easily fit in your case.
There's not much flash here, but that's a given with the Founders Edition, so if you're looking for visual bells-and-whistles like RGB or super-cooling features like a triple fan design, you're going to want to look at any of the third-party cards that release alongside this one for those.
Nvidia GeForce RTX 4060 Ti: Performance
(Image credit: Future / John Loeffler)
DLSS 3 is the real draw here
Improved ray tracing performance
Baseline performance not much improved over the RTX 3060 Ti
When it comes to performance, the Nvidia GeForce RTX 4060 Ti really leans on DLSS 3 for most of its major performance gains, and while this can be substantial, some are going to feel somewhat disappointed.
Test system specs
This is the system we used to test the Nvidia GeForce RTX 4060 Ti:
This is largely because even with the introduction of some pretty advanced tech, there aren't a lot of games out right now that can really leverage the best features of this card. While three of the games I use as benchmarks — F1 2022, Cyberpunk 2077, and Returnal — all feature frame generation, these are also three of the latest games out there from major studios that have the time and staffing to implement DLSS 3 with Frame Generation in their games.
There is a DLSS 3 plug-in coming for Unreal Engine, which should definitely expand the number of games that feature the tech, but it's still going to be a while before that trickles down to the gamers who will actually be using this card.
I'll get more into DLSS 3 and Frame Generation in a bit, but a quick glance over the underlying architecture for the RTX 4060 Ti tells something of a story here, as shown by synthetic benchmarks using 3DMark and Passmark 3D.
Synthetic Benchmarks
As you can see, the RTX 4060 Ti beats out the RTX 3060 Ti, but only just barely, getting about 11% better performance than the card it's replacing. This is okay, I guess, but hardly the generational leap that previous Lovelace cards have been making.
For example, the RTX 4070 offers a roughly 21% jump over the RTX 3070 on these same synthetic benchmarks. In fact, this puts the RTX 4060 Ti just ahead of the RX 6750 XT, and ultimately just behind the RTX 3070 in terms of raw performance.
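To be explicit about how I'm framing these generational deltas, the uplift is simply the new score relative to the old. A minimal sketch, using made-up round numbers rather than my actual benchmark data:

```python
def percent_uplift(new_score: float, old_score: float) -> float:
    """Relative improvement of new_score over old_score, in percent."""
    return (new_score / old_score - 1) * 100

# Illustrative round numbers only, not the review's raw results:
print(round(percent_uplift(11_100, 10_000)))  # prints 11: an ~11% gen-on-gen uplift
```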
As a gaming card, the performance outlook is better, but not by a whole lot overall.
Gaming Benchmarks
On games with heavy effects-based visuals like Metro: Exodus and Cyberpunk 2077 where the advanced architecture of the RTX 4060 Ti can be leveraged, it does edge out the competition, sometimes. The RX 6750 XT still manages a slightly better fps on Returnal at 1080p, on average, when not using ray tracing or upscaling tech, for example.
The RTX 4060 Ti also gets crushed in CS:GO at 1080p, relatively speaking, which I chalk up entirely to pushing textures through the smaller memory bus of the RTX 4060 Ti. The 192-bit bus on the RX 6750 XT's 12GB GDDR6 VRAM and the 256-bit bus on the RTX 3060 Ti's 8GB GDDR6 really show up in cases like this.
Things do start to turn in the RTX 4060 Ti's favor once you start fiddling with ray tracing. The third-generation ray tracing cores on the RTX 4060 Ti are definitely much more capable than the RTX 3060 Ti's and especially more than the RX 6750 XT's, which are second-generation and first-generation cores, respectively.
The RTX 4060 Ti is the first midrange card I've tested that is going to give you playable native ray-traced gaming at 1080p consistently on max settings, though it will struggle to get to 60 fps on more demanding titles like Cyberpunk 2077.
But let's be honest: nobody is playing any of these games with native-resolution ray tracing. You're going to be using an upscaler (and if you aren't, you really need to start).
Here, the performance of Nvidia's DLSS really shines over AMD's FSR, even without frame generation. In both Cyberpunk 2077 and Returnal, the RTX 4060 Ti can get you over 120 fps on average when using the DLSS Ultra Performance preset, and if you want things to look their best, you can easily get well north of 60 fps on average with every setting maxed out, even ray tracing.
Now, one of the things the wider memory bus on the RTX 3060 Ti gave that card was faster throughput when gaming at 1440p. Not every game is going to run great at 1440p, but for a lot of them, you're going to be able to get a very playable frame rate.
The RTX 4060 Ti improves over the RTX 3060 Ti here, but not nearly as much as it should, and on games like F1 2022 and CS:GO where that memory bandwidth difference is going to show, well, it shows up here at 1440p, too.
Of course, once you turn on ray tracing, most games are going to turn into a slide show, but unsurprisingly, the RTX 4060 Ti manages to score a win here on every ray-traced game I tested.
That said, you are really pushing it here on these settings, and you're better off using upscaling if you're going to go for 1440p, especially with settings turned up.
The biggest win for the RTX 4060 Ti here is with Cyberpunk 2077, where it manages 67% better performance at max quality settings than the RX 6750 XT, but maddeningly, it's only about 13% better than the RTX 3060 Ti on the quality preset. On ultra performance, the RTX 4060 Ti is about 52% better than the RX 6750 XT, but again, only 13% better than the RTX 3060 Ti.
When it comes to Returnal, the RX 6750 XT is essentially tied with the RTX 4060 Ti on the quality preset for FSR 2.1 and DLSS, respectively. Bump this up to ultra performance, and the RTX 4060 Ti does better, beating out the RX 6750 XT by about 22% and the RTX 3060 Ti by about 17%.
I imagine the RTX 4060 Ti will perform more or less the same across most games that still rely on DLSS 2.0, which number more than 200. For those games that really leverage DLSS 3 with Frame Generation though, it really is another story entirely.
With Frame Generation, you can get about a 40-60% performance improvement in games that support it. That isn't nothing: it can even get Cyberpunk 2077 to a playable frame rate at 4K on ultra performance. The RTX 3060 Ti and RX 6750 XT really don't have any answer to this, so they are going to lag behind considerably in any games that have DLSS 3 with Frame Generation.
Does Frame Generation increase latency on some titles, along with other issues? Sure. Will it matter to gamers who get to play Cyberpunk 2077, Returnal, and other titles that play like they're running on an RTX 3080 Ti? Probably not.
Will any of this matter to anyone who doesn't play those games? Obviously not. And that is ultimately the issue with this card. For what it does well, it has no peer at this price, but if you already have an RTX 3060 Ti, then there is really very little reason to upgrade to this card. Hell, if you have an RX 6750 XT, you might feel like you're better off just waiting to see what AMD has in store for the RX 7700 XT, and I would not blame you in the slightest.
This isn't a whiff by Team Green by any means, but there's no getting around the fact that the performance of the Nvidia GeForce RTX 4060 Ti absolutely leaves a massive opening for AMD to exploit in the coming months with the RX 7700 XT, or even the RX 7650 XT.
Should you buy the Nvidia GeForce RTX 4060 Ti?
(Image credit: Future / John Loeffler)
Buy it if...
Don’t buy it if…
Also consider
How I tested the Nvidia GeForce RTX 4060 Ti
I spent several days with the RTX 4060 Ti running benchmarks, playing games, and generally measuring its performance against competing cards.
I paid special attention to its DLSS 3 Frame Generation technology, since this is one of the card's biggest selling points, and played several games at length with the tech turned on.
Having covered and tested many graphics cards in my career, I know how a graphics card should perform at this level.
The Intel Arc A750 is probably the one graphics card I've most wanted to get my hands on this year, and now that I've put it through a fairly rigorous testing regime, I can honestly say I am very impressed with Intel's first effort at a discrete GPU. At the same time, it's also not an easy card to recommend right now, which is a tragedy.
First, to the good, namely the great price and stylish look of the Intel Limited Edition reference card. The Intel Arc A750 Limited Edition card has an MSRP of just $249.99 (about £200 / AU$375), and the limited number of third-party cards out there are retailing at roughly the same price.
The Arc A750 I tested also looks spectacular compared to the reference cards from Nvidia and AMD, thanks to its matte black look, subtle lighting, and silver trim along the edge of the card. It will look great in a case, especially for those who don't need their PCs to look like a carnival.
When it comes to performance, I was most surprised by how the Arc A750 handled modern AAA games like Cyberpunk 2077 and Returnal, both of which put a lot of demands on a graphics card in order to maintain a stable frame rate. The Arc A750 handled them much better than the RTX 3050 it is ostensibly competing against. It even outperformed the RTX 3060 in many cases, putting it just under halfway between the RTX 3060 and the RTX 3060 Ti, two of the best graphics cards ever made.
The Arc A750 manages to pull this off while costing substantially less, which is definitely a huge point in its column.
(Image credit: Future / John Loeffler)
Test system specs
This is the system we used to test the Intel Arc A750:
The thing about the Arc A750 is that the things it does well, it does really well, but in the areas where it flounders, like older DirectX 9 and DirectX 10 workloads, it flounders pretty badly.
It's a tale of two halves, really. Nothing exposes the issues with the Arc A750 more than its synthetic performance scores, which on average trounce the RTX 3060, 23,924 to 20,216. In that average, though, is its PassMark 3D score, a good measure of the card's ability to render content that wasn't put out within just the last couple of years. Here, the Arc A750 scored a dismal 9,766 to the RTX 3060's 20,786, a deficit of more than 11,000 points.
The story is similar when gaming, where the Arc A750 generally outperforms its rival cards, even in ray tracing, in which Intel is the newcomer behind mature leader Nvidia and feisty, determined AMD. In fact, when gaming with ray tracing at 1080p, the Intel Arc A750 comes in second behind Nvidia's RTX 3060 8GB, 37fps on average to the 3060's 44fps.
Bump that up to 1440p, however, and the Intel Arc A750 actually does better than the RTX 3060 8GB - 33fps on average to the 3060's 29fps average. When running Intel XeSS and Nvidia DLSS, the Arc A750 averages about 56fps on max settings with full ray tracing at 1080p, while the RX 6600 can only muster 46fps on average.
Both figures are much lower than the RTX 3060's 77fps, thanks to DLSS, but getting roughly 60fps with full ray tracing and max settings at 1080p is a hell of an accomplishment for the first generation of Intel discrete graphics. The Arc A750 can even keep pace with the AMD Radeon RX 6650 XT in ray tracing performance with upscaling at 1440p, getting 42fps on average.
If only this performance were consistent across every game, there would be no question that the Intel Arc A750 is the best cheap graphics card on the market. But it is exactly that inconsistency that drags this card down. Some games, like Tiny Tina's Wonderlands, won't even run on the Arc A750, and it really, really should. How many games out there are like Tiny Tina's? It's impossible to say, which is the heartbreaking thing about this card.
I really can't recommend people drop $250 on a graphics card that might not play their favorite games. That is simply not a problem that AMD or Nvidia have. Their performance might be rough for a few days or weeks after a game launches, but the game plays. The same can't be said of the A750, and only you, the buyer, can decide if that is worth the risk.
In the end, the Intel Arc A750 is a journeyman blacksmith's work: showing enormous potential but not of enough quality to merit selling in the shop. Those pieces are how craftspeople learn to become great, and I can very clearly see the greatness that future Arc cards can achieve as Intel continues to work on lingering issues and partners with more game developers.
It's just not there yet. As Intel's drivers improve, a lot of these issues might fade away, and the Intel Arc A750 will grow into the formidable card it seems like it should be. If you're comfortable dropping this kind of cash and taking that chance, you will still find this card does a lot of things great and can serve as a bridge to Intel's next generation of cards, Arc Battlemage, due out in 2024.
Intel Arc A750 Price & availability
(Image credit: Future / John Loeffler)
How much does it cost? MSRP of $249.99 (about £200 / AU$375)
When can you get it? It is available now
Where can you get it? It is available in the US, UK, and Australia, but stock may be an issue
The Intel Arc A750 is available now, starting at $249.99 (about £200 / AU$375). There are a limited number of third-party partners who also make the A750, though these tend to sell at or very close to Intel's MSRP from what I've seen.
This puts the Arc A750 on the same level price-wise as the Nvidia RTX 3050, but it definitely offers better performance, making it a better value so long as you're ok with the varying compatibility of the Arc A750 with some PC games out there.
Value: 4 / 5
Intel Arc A750 Specs
(Image credit: Future / John Loeffler)
Should you buy the Intel Arc A750?
Buy it if...
You're looking for a cheap GPU
At $249.99, this is one of the best cheap GPUs you're going to find.
You want a stylish looking card
This card is very cool looking in a way that Nvidia and AMD reference cards simply aren't.
You want strong ray tracing and upscaling
Not only do Intel's AI cores make XeSS upscaling a serious contender, the Arc A750's ray tracing performance is quite strong.
Don't buy it if...
You are concerned about compatibility
While only one game I tested wouldn't work, that's one game too many for many gamers out there.
You're concerned about power consumption
At 225W TGP, this card soaks up way more power than a card in this class reasonably should.
Intel Arc A750: Also consider
If my Intel Arc A750 review has you considering other options, here are two more cards to consider...
How I tested the Intel Arc A750
(Image credit: Future / John Loeffler)
I spent several days with the Intel Arc A750
I used the A750 in my personal PC playing games and doing creative work
I ran our standard battery of tests on the Arc A750
I spent several days with the Intel Arc A750 to test its gaming and creative performance, including at 1080p and 1440p. In addition to gaming, I ran our standard suite of GPU tests using the same system setup I use for all our graphics card tests.
Besides my extensive computer science education and practical experience, I have been a hardware reviewer for a few years now, and a PC gamer for even longer, so I know how well graphics cards are supposed to perform with a given set of specs.
The ProXMem Kerberos TUF DDR5 RAM kits might lack variety, but they sure make up for it in terms of performance and price, earning top marks from me pretty much across the board.
While it's too early to declare this the best RAM kit I've tested this year, it's damned close. And while the Kerberos TUF DDR5 module lacks the almost monolithic refinement of the Corsair Dominator Platinum RGB DDR, it's still an attractive addition to any PC case out there while giving you more than enough memory runway for serious, high-intensity gaming and pro-am content creation.
(Image credit: Future / John Loeffler)
In terms of performance, this is where the ProXMem Kerberos TUF DDR5 really shines. Normally I wouldn't compare two RAM kits running at different speeds, since you can get most RAM kits at comparable speeds.
However, I've made an exception in this case purely based on the value proposition of the ProXMem Kerberos TUF DDR5. At the price you'd pay for this 32GB RAM kit ($149.99 as reviewed, about £120/AU$225), you could get either the Dominator Platinum RGB DDR5 kit mentioned above, or you could get the Fury Beast DDR5 32GB kit, though both of those kits are substantially slower than the Kerberos TUF RGB.
(Image credit: Future / John Loeffler)
A Note on Testing
Some motherboards aren't compatible with some modules under dual-channel configurations, while others will limit the speed of the DDR5 RAM when run in pairs, so needless to say it's hard to give quantifiable data to demonstrate the Kerberos TUF DDR5's performance in a way that makes it comparable across different systems.
For this reason, we only benchmark a single DDR5 module to get comparable performance figures. This does mean that adding a second module will offer substantially better performance in real-world usage. We also only compare modules to other modules running at the same speed and memory profile (XMP/EXPO), unless we are comparing kits by price, which will be noted accordingly.
This is owing to the Kerberos TUF DDR5 being XMP overclocked and pushing 1.410V, which is right around as much voltage as you'll want to push without risking damage to the RAM, something comparable kits at this speed will all encounter as well.
For this, however, you're able to get nearly 50% better PassMark memory performance than either the Dominator Platinum RGB or Fury Beast DDR5. You also get around 30% better read performance, about 26% better write performance, and roughly 23% better copy performance than either of the similarly priced RAM from Corsair and Kingston. You also get a roughly 15% lower latency as well.
All this comes at the cost of a higher total power used, but the Kerberos TUF DDR5 doesn't use any more power than the Kingston Fury Beast DDR5 and it solidly outperforms it while costing less and looking better to boot.
The one thing those kits will have over the Kerberos TUF DDR5, however, is much wider compatibility with different manufacturers' motherboards. ProXMem modules are supported by a number of motherboards, but not nearly to the extent that Corsair's Dominator Platinum RGB or ADATA's XPG lines are. So, if you're looking at this RAM, do check to make sure it will run in your motherboard (you might need a BIOS update).
While the advantages of the ProXMem Kerberos TUF DDR5 will diminish when running against RAM kits of comparable speed, they will all cost substantially more to close the performance gap. And while not every motherboard is going to be able to run this kit as of this writing, those that can will absolutely let you get the most out of this kit. So if you are looking for a high performance RAM kit while being friggin' smart with your damn money, then there's nothing else to say other than to buy this RAM.
ProXMem Kerberos TUF DDR5: Price & availability
(Image credit: Future / John Loeffler)
How much does it cost? Starting at $119.99 (about £100 / AU$180)
When is it available? Available now
Where can you get it? Available in the US. Not available in the UK or Australia
Starting at $119.99 (about £100 / AU$180) for a 32GB DDR5 kit clocked at 5,600MHz, the Kerberos TUF DDR5 is almost as well-priced as the Corsair Vengeance DDR5 RAM kit that I recently flagged as the best premium-value RAM on the market right now. The only downside for my overseas friends is that this RAM appears to only be available in the US at the moment, but hope springs eternal. Maybe one day.
As reviewed, the ProXMem Kerberos TUF DDR5 32GB (2 x 16GB) with a memory speed of 6,800MHz will set you back $149.99 (about £120 / AU$225), which is the same price as a Corsair Dominator Platinum RGB DDR5 32GB kit running at just 5,200MHz, and slightly less than the Kingston Fury Beast DDR5 32GB kit at 5,200MHz, which would cost you $159.99 (about £125/AU$235) at MSRP pricing.
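As an aside on what these speed figures mean in practice: DDR5 speeds are quoted in megatransfers per second, and each 64-bit module moves 8 bytes per transfer, so theoretical per-module bandwidth works out as below. The helper is my own, for illustration only:

```python
def module_bandwidth_gbs(megatransfers_per_s: int, bytes_per_transfer: int = 8) -> float:
    """Theoretical peak bandwidth of one 64-bit DDR5 module in GB/s."""
    return megatransfers_per_s * bytes_per_transfer / 1000

print(module_bandwidth_gbs(6_800))  # the 6,800 MT/s Kerberos kit: 54.4 GB/s per module
print(module_bandwidth_gbs(5_200))  # a 5,200 MT/s kit: 41.6 GB/s per module
```

That headline gap is why I've made the price-based exception here: the Kerberos kit's extra speed is substantial, not a rounding error.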
ProXMem Kerberos TUF DDR5: Specs
Should you buy ProXMem Kerberos DDR5 RAM?
Buy it if...
You want high-performance DDR5 RAM
At 6,800MHz, this RAM is ridiculously fast, making it a great kit for content creation and gaming.
You want RAM that will look great in your case
Between the TUF alliance branding, aluminum casing, and well-done RGB, this is some seriously good-looking RAM.
Don't buy it if...
You want a single module
Sometimes, you just need that one stick of RAM. If that's the case, you're out of luck, this RAM only comes in kits of two.
You want larger module sizes
Unfortunately, 16GB RAM modules are all you're going to get here.
ProXMem Kerberos TUF DDR5: Also consider
If my ProXMem Kerberos DDR5 review has you considering different RAM kits, here are two that might better suit your needs.
How I tested ProXMem Kerberos DDR5 RAM
I spent a few days testing a ProXMem Kerberos DDR5 32GB kit in my home PC
In addition to general computing, gaming and creative use, I used professional third-party benchmark tools as well
In addition to general testing, I measured performance with PassMark and AIDA64, specifically
I used the ProXMem Kerberos DDR5 as my main gaming and content creation RAM for several days in my home PC.
In addition to gaming, I edited photos, videos, and other illustrations in Adobe Photoshop, Premiere, and Illustrator. I also tested the RAM using Lumion 12.5 to test its creative chops on CAD-like software.
In addition to years of computer science education and training, I have been a hardware reviewer for a number of years now, so I know how memory is supposed to perform at this level.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained: regardless of when a device was released, if you can still buy it, it's on our radar.
The Corsair Dominator Platinum RGB DDR5 RAM kit has been an integral part of my component testing process for more than a year, so I cannot think of any better endorsement than that.
It is simply the best RAM for the job, whether it's for testing the best processors or best graphics cards. And while I've been lucky enough for Corsair to give TechRadar a number of RAM kits to use over the years, even if it didn't, I would still have us go out and buy this RAM ourselves for use on our testing bench.
(Image credit: Future / John Loeffler)
Whether it's about compatibility or performance, Corsair's Dominator series of RAM kits have always been ideal for just about any midrange to premium build, and that is still very much the case. Starting at $144.99 / £134.99 (about AU$220) for a 32GB kit (2 x 16GB) running at 5,200MHz, you are paying something of a premium for this RAM, even more than you normally would for a DDR5 kit.
There are cheaper kits out there if you're willing to skip some of the extras you find here, like RGB lighting and heat dissipation. That includes the Corsair Vengeance DDR5 kits, which you can get for as low as $109.99 for a 32GB kit running at 4,800MHz.
(Image credit: Future / John Loeffler)
A Note on Testing
Some motherboards aren't compatible with some modules under dual-channel configurations, while others will limit the speed of the DDR5 RAM when run in pairs, so needless to say it's hard to give quantifiable data to demonstrate the Dominator Platinum RGB DDR5's performance in a way that makes it comparable across different systems.
For this reason, we only benchmark a single DDR5 module to get comparable performance figures. This does mean that adding a second module will offer substantially better performance in real-world usage. We also only compare modules to other modules running at the same speed and memory profile (XMP/EXPO).
Now, one thing to note about double data rate (DDR) memory is that it works best in pairs (which is why this RAM is almost always sold in kits of two or four), but every motherboard, processor, and system configuration is going to have a huge impact on what kind of performance you are going to get from your RAM kit, even beyond the speed of the RAM itself.
In this regard, Corsair's Dominator Platinum RGB DDR5 kits are about as widely supported as you're going to get, and they have always run at their top speed no matter which motherboard I've used in testing.
In terms of performance, the Dominator Platinum RGB DDR5 runs neck-and-neck with the best DDR5 RAM kits out there, like the Kingston Fury Beast kit, often beating it out all while using lower total power in the process.
As you increase the speed of the module you pick up, the performance will only improve from there. But as you can see, the performance of the Dominator Platinum RGB is a noticeable step up from the lower-tier Vengeance DDR5 and is more or less even with the Kingston Fury Beast DDR5, which has a slightly higher MSRP.
In all, the Corsair Dominator Platinum RGB DDR5 kit offers a phenomenal balance of price, performance, and aesthetics to make it the baseline standard for what a DDR5 module should offer. It continues Corsair's legacy of high-quality PC components.
How much does it cost? Starting at $144.99 / £134.99 / about AU$220
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
The Dominator Platinum RGB kit we're looking at here is the 5,200MHz Intel XMP model, though you can get 32GB kits as fast as 7,800 mega transfers a second (MT/s) with Intel XMP 3.0 for $224.99 (about £180/AU$340).
The fastest AMD EXPO kit you can get is somewhat slower at 6,000 MT/s, with a 64GB (2 x 32GB) kit costing you $269.99 (about £220/AU$400) and a 32GB kit (2 x 16GB) costing you $174.99 (about £140/AU$265).
This makes it about 36% more expensive to start than the slightly lower-tier Corsair Vengeance DDR5 modules at the same capacity and speed. However, it is marginally cheaper than Kingston's competing Fury Beast DDR5 modules, which have an MSRP of $159.99 (about £130/AU$240) for a 32GB (2 x 16GB) kit of 5,200 MT/s DDR5 with Intel XMP.
Corsair Dominator Platinum RGB DDR5: Specs
Should you buy the Corsair Dominator Platinum RGB DDR5?
(Image credit: Future / John Loeffler)
Buy it if...
You want high performance DDR5
The Corsair Dominator Platinum RGB DDR5 is about as fast and high-performance as you're going to find on the consumer market.
You want great looking RGB modules
The clean lines, color options, and RGB customization options for the Dominator Platinum RGB make it the best-looking RAM you can get.
Don't buy it if...
You're on a budget
This is one of Corsair's most expensive RAM kits, and the cheaper Vengeance DDR5 kits deliver nearly the same phenomenal performance for much less.
You want just a single stick of RAM
Sometimes, you don't need a full 2-stick kit, but in the case of the Dominator Platinum RGB DDR5, you can only get it in pairs.
Corsair Dominator Platinum RGB DDR5: Also consider
If my Corsair Dominator Platinum DDR5 RGB review has you considering other options, here are two more DDR5 RAM models to consider...
How I tested the Corsair Dominator Platinum RGB DDR5
I've spent several days dedicated to testing
I also used it as my standard configuration for component testing
I used benchmarking tools like AIDA64 and Passmark for precise performance data
In addition to using this RAM in all of my other component testing, I spent a few days testing the performance of the modules themselves using third-party tools like AIDA64 and PassMark.
While this is high-performance RAM, I paid special attention to the aesthetic appeal of this RAM specifically since it is really meant to be a showpiece in a build on top of performing at the highest level.
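Suites like AIDA64 and PassMark report memory bandwidth and latency directly, but the underlying idea is easy to illustrate. The sketch below is a hypothetical, much cruder stand-in (not how those tools measure internally): it times a large sequential buffer copy in Python and converts bytes moved per second into a rough GB/s figure.

```python
import time

def memory_bandwidth_gbps(size_mb=256, trials=5):
    """Very rough sequential-copy memory bandwidth estimate, in GB/s.

    Illustrative only: real benchmarks like AIDA64 use tuned native
    code, so Python numbers will come in far below a kit's rated speed.
    """
    src = bytearray(size_mb * 1024 * 1024)  # large buffer to copy
    best = 0.0
    for _ in range(trials):
        start = time.perf_counter()
        dst = bytes(src)  # one read pass over src, one write pass to dst
        elapsed = time.perf_counter() - start
        bytes_moved = 2 * len(src)  # read + write
        best = max(best, bytes_moved / elapsed / 1e9)
    return best

if __name__ == "__main__":
    print(f"~{memory_bandwidth_gbps():.1f} GB/s sequential copy")
```

The point is the method, not the number: divide bytes moved by elapsed time, and take the best of several trials to reduce scheduling noise.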
I've been building PCs for many years now, so I'm very familiar with Corsair's lineup of PC components. In addition, my computer science background and years of hardware coverage have given me particular insight into how well computer components should perform.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained. Regardless of when a device was released, if you can still buy it, it's on our radar.
• Original review date: June 2023
• Pixel Fold has been surprisingly left behind
• Launch price: $1,799 / £1,749
• Lowest price on Amazon (US only): $1,399
Update: April 2024. Google has made many updates to the Pixel phones since the Pixel Fold was launched, but, shockingly, some of the most important new Google features have not come to Google's foldable, even though it's more powerful than other phones that have been updated. Google's new Circle to Search feature, as well as new Gemini AI features, have not been added to the Google Pixel Fold, and Google has given us no timeline for when, or even if, the Pixel Fold will get these updates. It's a truly disappointing development in the short life of this pricey phone.
Two-minute preview
The Google Pixel Fold arrives a little late to the foldable party but, based on my time with the device, it's a smartphone/tablet combo that mostly delights, and which is sure to earn a place among our ranking of the best foldable phones.
From its construction, including its precision hinge, to its high-resolution screens, the Pixel Fold is a well-thought-out Android phone that's equally at home as a small-screen, but thick, 5.8-inch phone or, unfolded, as a 7.6-inch mini tablet.
The large bezel around the main screen might give pause to some, but it quickly fades into the background, thanks to a responsive, colorful, and multitasking-friendly screen. Even the unavoidable crease down the middle is somewhat less prominent than those on competing foldable phones. And when you fold the Pixel Fold, the two sides meet with hardly any visible gap between them.
The collection of cameras on board do not disappoint. They can capture lovely landscapes, portraits, macro-like photos (there isn't a dedicated macro mode), astrophotography, and striking long exposures that use image segmentation to blur motion while keeping other aspects of the scene in focus.
I'm particularly pleased that Google put a 5x optical zoom on this phone. Sure, that's half of what you get on the Samsung Galaxy S23 Ultra, but it does beat its closest foldable rival, the Samsung Galaxy Z Fold 4.
The Google Pixel Fold in my hand, opened to the 7.6-inch display (Image credit: Future / Lance Ulanoff)
Google has equipped the Pixel Fold with its Tensor G2 chip (the same one that's in its Pixel 7 line), a slightly aging piece of silicon that doesn't beat the competition, but which proved more than powerful enough for every task I threw at it. The Pixel Fold is as at home with web browsing as it is with high-intensity gaming. Plus, the screens' variable refresh rates keep everything looking smooth. A small nitpick might be, well, the lack of nits. The Pixel Fold's main screen is noticeably less bright than the Galaxy Z Fold 4's (the latter boasts more nits), and while I didn't have any issues on cloudy days, it might struggle a bit in direct sunlight.
Naturally, Android 13 (with five years of promised security updates) is perfectly at home on the Pixel Fold, but so are all the Google apps that Google has optimized for the new platform. Mail, Photos, and more work like a charm on the big screen, and there's real joy in being able to drag and drop a photo from another app into an email.
Google arguably stumbles a bit when it comes to the pricing. $1,799 / £1,749 is a lot to pay for a single device, especially as other newcomers, like the smaller but quite impressive Motorola Razr Plus, come in at under $1,000 (Google hasn't announced any plans to release the phone in Australia, but we'll let you know if and when we get official confirmation either way). My take, though, is that you're essentially getting two premium devices in one here, and Google is asking you to pay for that.
Overall, I truly enjoyed my time with Google's first folding device. It's not a tentative or compromised first attempt at the form factor: the Google Pixel Fold makes a clean and emphatic landing in the foldable space.
Google unveiled the Pixel Fold during its May 10 Google I/O 2023 developer conference keynote, at which it also unveiled its mid-range Google Pixel 7a phone, the Google Pixel Tablet and charging speaker dock, and a ton of new AI technology.
You can preorder the Google Pixel Fold now, with shipping set to commence on June 27, although exactly when you'll be able to get your hands on the phone depends on where you are. The Fold comes in two colors: Porcelain (off-white) and Obsidian (black). My review unit is Obsidian, and I think I prefer it over the white.
The Pixel Fold in my hand, folded to show the 5.8-inch external display (Image credit: Future / Lance Ulanoff)
If you haven't already, you should disabuse yourself of the notion that when you buy a foldable you're buying one device, and so should pay for one device. The Google Pixel Fold is, like the Samsung Galaxy Z Fold 4, two full-blown devices in one and, as such, it's very nearly worth the $1,799 / £1,749 price tag.
How do I figure this? There are two screens on Google's first foldable, one 5.8 inches and the other 7.6 inches, and each one is large enough to operate as a standalone communication, information, gaming, and entertainment platform.
There are more cameras on the Pixel Fold than on the average handset: three on the back, another one on the external screen, and then one more right above the main display.
If you purchased, say, an iPhone 14 Pro ($999 / £1,099 / AU$1,749) and an iPad mini ($499 / £479 / AU$749), that would cost you about $1,500, or the UK and Australian equivalents. And naturally, you're paying a premium for more cameras, and that exquisite flexible and hard-to-manufacture foldable display.
My point is, before you dismiss the Pixel Fold for its hefty price tag, I suggest you consider what you're actually getting for your money, and what this impressive Android 13 smartphone and tablet can do.
Still, at this price, the Pixel Fold is very much a considered purchase, and I fully understand that – especially if you're thinking about the 512GB and nearly $2,000 ($1,919 / £1,869) model – the cost will be a considerable issue.
The good news is that there are already Google Pixel Fold trade-in deals that essentially cut the price of the phone in half. Basically, there should be almost no reason to pay full list price for what is a very impressive device.
Value score: 4/5
Google Pixel Fold design
The right form factor for a phone-to-mini-tablet foldable
Feels solid, if a bit heavy
Folds completely flat
Whisper-quiet operation
Big bezel will distress some
Google's decision to wait out Samsung through four iterations of its foldable devices (and almost five, with the Galaxy Z Fold 5 set to be announced in the next few weeks at the time of writing) turns out to have been a smart move. The Pixel Fold is in many ways what I want all foldables to be.
When folded, its 139.7mm tall by 79.5mm wide by 12.1mm thick frame is like a very thick 5.8-inch smartphone. Unlike the tall and narrow Samsung Galaxy Z Fold 4, the dimensions of which, when folded, stretch the definition of a traditional smartphone display, the Pixel Fold and its front screen could almost pass for a standard smartphone; that is, as long as you overlook the flat hinge side, which does not match the curved corners on the opposite side.
Plus, if you don't count the rather prominent camera bump (really a band that runs almost the width of the back of the phone), the Pixel Fold is, at 5.8mm unfolded, slightly thinner than the 6.3mm Galaxy Z Fold 4.
Google Pixel Fold stainless steel hinge (Image credit: Future / Lance Ulanoff)
Even by foldable standards, though, the Pixel Fold is a bit heavy. It weighs 283 grams – that's 20 grams more than the Galaxy Z Fold 4 and, unsurprisingly, 40 grams heavier than Apple's current biggest phone, the iPhone 14 Pro Max.
Again, if you don't appreciate that multi-purpose devices like this are naturally going to be bigger and heavier than standard smartphones, you're barking up the wrong, er, device tree.
This is a premium phone, with high-end materials like a polished aluminum frame, and Corning Gorilla Glass Victus on both the front screen and the back. The hinge is stainless steel and the entire body is IPX8-rated, which means it's ready to survive everything from a storm to an accidental drop in the bath (I didn't submerge the phone but did run it under some water – it survived).
The Google Pixel Fold's stainless steel hinge and flat-folding operation (Image credit: Future / Lance Ulanoff)
The hinge operation, by the way, is excellent. It's smooth, whisper-quiet (quieter even than the Z Fold 4, which makes a little crinkling sound when you open and close it), and can open to a full 180 degrees or virtually anywhere in between (to support Tabletop and Tent operation).
I opened and closed the phone a lot during my testing time, and came away with the distinct impression of long-term durability.
Aside from the rather wide and tall camera bump, there aren't many distinctive features on the outside of the Pixel Fold. On the back, below that bump, is a polished version of Google's distinctive 'G'. The hinge has no markings at all. Opposite the hinge, on the right edge of the phone when it's unfolded, are the phone's two buttons. The power/sleep fingerprint reader (which is effective) is towards the top, and below it is the volume rocker. This is the opposite configuration to the Samsung Galaxy Z Fold 4's, and it took some getting used to – I kept pressing the power button when I meant to adjust the volume, although I'm sure that if I spent enough time with the Pixel Fold, hitting the right button would become second nature.
There are microphone and speaker grilles along the top and bottom edges of the phone. Along the bottom is the USB-C charging port (the foldable ships with a cable and even a USB-A-to-USB-C adapter, but no charging adapter – it feels like something that should be included at this price). There's also a physical SIM slot, though the Pixel Fold does support dual SIM and eSIM, too.
Google Pixel Fold back showing cover screen and the main camera array. (Image credit: Future / Lance Ulanoff)
A few things stand out when I unfold the Google Pixel Fold. One is that, unless you give it an extra press down on each side, the phone does not automatically unfold completely flat, although this isn't a big deal, as it's very easy to nudge it to an essentially flat plane. I remain somewhat surprised by the size of the bezel surrounding the Fold's flexible main screen. In contrast to the bezel on the Galaxy Z Fold 4 it's huge; however, once you start using this display, it quickly fades into the background.
There is a reason for the big bezel: it houses the main screen's 8MP camera. On the Galaxy Z Fold 4, Samsung chose to put a punch hole in the screen, and maybe that was the right call for a slightly larger folding screen – I'm not sure.
As I mentioned earlier, the power button doubles as an effective fingerprint reader, and there's another biometric security option: you can register your face and unlock the phone with the cover screen's camera. Oddly, though, you can't unfold the Pixel Fold and use the main screen's camera to unlock with your face; it's a small but annoying omission on Google's part.
Design score: 4.5/5
Google Pixel Fold displays
5.8-inch external screen with a normal aspect ratio
Lovely, large flexible display that's a good fit for all activities
One of the best things about Pixel Fold's two screens is that there is zero trade-off between using just the outer cover screen or the expansive main display.
I love that Google went with a full-width 5.8-inch cover display. That's considerably shorter than the Galaxy Z Fold 4's 6.2-inch external display, but it's also almost a half-inch wider – and I can say without reservation that I prefer the Pixel Fold's wider external screen. Not only is it easier to navigate, but apps like Instagram and TikTok look a lot better on it. The difference in size is better illustrated when you look at the resolutions – where the Galaxy Z Fold 4's cover display is 2316 x 904 pixels, the Pixel Fold's OLED is 2092 x 1080.
It's a pleasingly bright screen both indoors and out, with a promised 1,200 nits of brightness in typical use (the peak brightness is 1,550 nits), and smooth in operation thanks to an adaptive refresh rate (60Hz to 120Hz). I also like that there's an always-on display option (you have to dig into the settings to find it as it's not set up by default).
Overall, the cover screen is the display you'll most often use when on the go. It's the perfect viewfinder for the main camera array on the back, and the size is, depending on your hand, basically palm-friendly.
The Pixel Fold's Main Screen being used as the main camera viewfinder (Image credit: Future / Lance Ulanoff)
Of course, there's a reason you're carrying around all that weight and girth: the large main screen. Unfolded, this is a 7.6-inch tablet-like display covered in ultra-thin flexible glass and a layer of protective plastic. At 2208 x 1840, it has just a touch more resolution than the Galaxy Z Fold 4's main screen.
I grew to love this screen. Apps like Google Maps, Netflix, and YouTube, and games like Asphalt 9: Legends and Call of Duty Mobile look fantastic on it. If you happen to start playing Call of Duty on the big screen, then close the Pixel Fold and try to continue on the cover screen, you may notice that the image is distorted. I was able to fix this by closing the game and restarting on the cover screen – it seems like a bug that Google could fix with a software update.
In a side-by-side comparison, I did find that the Galaxy Z Fold 4's main screen is a little brighter. It's worth noting that the Pixel Fold's main display does not even match the brightness of the cover display; it's 1,000 nits as standard, with a peak brightness of 1,450 nits. Still, this is something you'd only notice if you had the two phones and screens side-by-side (as I did).
Like the cover display, the main screen supports an adaptive refresh rate of up to 120Hz. It has the same 1,000,000:1 contrast ratio, supports 16 million colors, and offers HDR support (though not HDR10+). It also supports the always-on display.
This being a foldable display, there is a crease that you can both see and feel, but it disappears when you're using apps, playing games, and watching videos. I did notice that this crease is ever so slightly less prominent than the seam on the Samsung Galaxy Z Fold 4's main screen.
The Pixel Fold's main screen in tabletop mode (Image credit: Future / Lance Ulanoff)
Google makes good use of the cover display, and of the device's folding capabilities. If I fold the phone to roughly 45 degrees and set it up like a tent, I can watch Netflix as a full-screen experience on the cover display. If I unfold the Pixel Fold, the show or movie is automatically switched to the main screen.
I did notice that YouTube is not entirely optimized for the Pixel Fold – when I tried to play a YouTube video in Tent mode, it insisted on playing upside down.
Tent mode is a nice way to watch a Netflix movie. YouTube, for now, only plays upside down. (Image credit: Future / Lance Ulanoff)
The main screen also has a couple of nifty mid-fold tricks up its sleeve. I can bend it 90 degrees and set the Pixel Fold up in Tabletop mode. With it, I can watch movies, take a selfie, capture perfectly still time-lapse videos, or, as I did on more than one occasion, conduct hands-free Google Meet video meetings. Try doing that with your regular phone and no tripod.
You can also bend the phone a bit further so the main cameras are pointed at the sky and capture tripod-free night photography.
A big screen also means that I have space for not just one, but two apps. The Pixel Fold is a good multitasker that makes running two apps easy. All I have to do is open one app, like Chrome, then swipe up from the bottom to access the app dock, hold down on a second app like the Camera, and then drag it to the left or right side of the screen. You can resize the split of the two screens but, unfortunately, cannot run a third app. Still, it is useful to be able to have a map open at the same time as your camera viewfinder, especially if you're hiking and want to capture great shots while not getting lost.
Watching Netflix on the Google Pixel Fold. (Image credit: Future / Lance Ulanoff)
Display score: 4.5/5
Google Pixel Fold cameras
Overall excellent cameras
Backed by powerful Google tools
Long exposure mode is a delight
Google Pixel Fold main camera array (Image credit: Future / Lance Ulanoff)
Google has been widely praised for the cameras on its Pixel phones, and I think the Pixel Fold also earns those accolades.
Its cameras not only take excellent photos across a wide range of styles, they're complemented by some of the most powerful on-board image-processing magic in the business. I haven't had this much fun using a smartphone's cameras in quite a while.
It's not just the camera app, or the editing I can do post-shot; the entire suite of camera hardware is strong. And while the Pixel Fold doesn't beat the Samsung Galaxy Z Fold 4 in every aspect, I don't think anyone will feel cheated by any single lens.
Here’s the full list of cameras:
48MP f/1.7 wide (rear)
10.8MP ultra-wide f/2.2, 121-degree field of view (rear)
10.8MP telephoto 5x optical f/3.05 (rear)
9.5MP f/2.2 (cover)
8MP f/2.0 (above main screen)
The Pixel Fold camera app (Image credit: Future / Lance Ulanoff)
By and large, this array matches up pretty well with what's on the Samsung Galaxy Z Fold 4. The biggest difference is probably the Pixel Fold's main display camera, which has double the megapixels of the Z Fold 4's.
What I really appreciate though is the 5x optical zoom (you get just 3x on the Z Fold 4). I love a good optical zoom. Yes, both devices offer their own form of digitally- and AI-enhanced zoom. The Pixel Fold's Super Res Zoom (up to 20x) is sort of impressive, but as with most of these digital implementations, the images kind of fall apart if you look too closely. Still, I love having an optical image stabilized (OIS) and electronic image stabilized (EIS) 5x zoom in my pocket.
As you can see from my photo gallery further down the page, the Pixel Fold not only takes sharp and bright images, it also maintains excellent color fidelity. These images all look impressively like the real-world subject; nothing is oversaturated beyond nature's creation. The cameras let you capture subjects from a distance, and also allow you to get up close and personal, courtesy of the Fold's approximation of macro photography. To be clear, I can't really get closer than, say, six or seven inches, but the effect is like macro, with a blurred background and a tight, sharp focus on the nearby object (see my yellow flowers).
There are a number of cool onboard tricks that can improve your not-so-awesome photos. Photo Unblur can sharpen photos blurred by your wobbly hands (although the camera is fast enough that I had to work to make a blurry photo for my tests). Magic Eraser is here, and it let me easily select and remove a bunch of commuters from one of my photos, as you can see below. The process of selection and removal is not instantaneous – it's like the Pixel Fold wants to show you how hard it's working.
Image 1 of 2
Magic Eraser on the Google Pixel Fold automatically selects subjects to remove. (Image credit: Future / Lance Ulanoff)
Image 2 of 2
It does not get rid of every trace, though (Image credit: Future / Lance Ulanoff)
My other favorite feature in the Camera app is Long Exposure. This is not night photography. Instead, it's a much shorter-term exposure that captures some movement while leaving the rest of the photo sharp. When I took a photo of a flowing brook using this setting (the on-screen instructions ask you to hold still for a second), it kept the surrounding rocks in focus while blurring the flowing water. It did the same thing with my fountain shot: the water is blurred, but the fountain and surrounding detail are sharp. I tried it in the train station, and it turned rushing commuters into streaks while, in the background, a man who stood still was clear as day. Again, the process of creating these effects takes a moment, and I wonder if a newer Tensor chip (the G2 is almost a year old, after all), might make quicker work of these operations.
Image 1 of 5
Google Pixel Fold Long Exposure in the train station (Image credit: Future / Lance Ulanoff)
Image 2 of 5
Google Pixel Fold Long Exposure on the street (Image credit: Future / Lance Ulanoff)
Image 3 of 5
Google Pixel Fold Long Exposure on the street (Image credit: Future / Lance Ulanoff)
Image 4 of 5
Google Pixel Fold Long Exposure with a flowing brook (Image credit: Future / Lance Ulanoff)
Image 5 of 5
Google Pixel Fold Long Exposure at a fountain (Image credit: Future / Lance Ulanoff)
You can shoot a selfie, even in portrait mode, with the 8MP inside camera or the 9.5MP one on the cover screen, but Google also makes it possible to shoot selfies with the Pixel Fold's best camera.
First, you unfold the device and then open the Camera app. Below the 'switch camera' icon is an option that lets you switch camera display screens. Once you do that, the cover screen becomes the camera viewfinder and, because the Pixel Fold is open, you're staring at the rear camera array. It's not the smoothest process, and it's basically impossible to hold the device this way with just one hand and take the shot, unless you add one more step and set up the gesture-activated timer mode.
To do so, I had to set the timer for three seconds, and then hold up one hand until a yellow box appeared on screen around it, which initiated the timer. I could then lower my hand, and the Pixel Fold would take a perfect selfie.
Complicated? Sure. Useful? Absolutely.
Virtually all flagship phones offer some form of astrophotography, and the Pixel Fold is no different; however, the double act of Night Sight astrophotography and Tabletop mode is something special. I was able to set up the phone with the screen folded but not fully closed, so the main camera was pointed at the night sky, and then fiddle with the on-screen settings to get a perfectly still starscape, without the need to hold the phone and try to stand still for six seconds, or use a tripod.
The shot below was taken with the 5x optical zoom and a six-second exposure.
Zoom into this photo if you can to see what seems like a million stars (Image credit: Future / Lance Ulanoff)
Camera score: 4.5/5
Camera samples
Image 1 of 18
(Image credit: Future / Lance Ulanoff)
Image 2 of 18
(Image credit: Future / Lance Ulanoff)
Image 3 of 18
(Image credit: Future / Lance Ulanoff)
Image 4 of 18
(Image credit: Future / Lance Ulanoff)
Image 5 of 18
(Image credit: Future / Lance Ulanoff)
Image 6 of 18
(Image credit: Future / Lance Ulanoff)
Image 7 of 18
(Image credit: Future / Lance Ulanoff)
Image 8 of 18
(Image credit: Future / Lance Ulanoff)
Image 9 of 18
(Image credit: Future / Lance Ulanoff)
Image 10 of 18
(Image credit: Future / Lance Ulanoff)
Image 11 of 18
(Image credit: Future / Lance Ulanoff)
Image 12 of 18
(Image credit: Future / Lance Ulanoff)
Image 13 of 18
(Image credit: Future / Lance Ulanoff)
Image 14 of 18
(Image credit: Future / Lance Ulanoff)
Image 15 of 18
(Image credit: Future / Lance Ulanoff)
Image 16 of 18
(Image credit: Future / Lance Ulanoff)
Image 17 of 18
(Image credit: Future / Lance Ulanoff)
Image 18 of 18
(Image credit: Future / Lance Ulanoff)
Google Pixel Fold performance and specs
Packs Google's aging Tensor G2 chip
Perhaps a step behind the latest Qualcomm Snapdragon
12GB of RAM, starts at 256GB of storage
Inside the Google Pixel Fold is the zippy Google Tensor G2, the same chip that powers the Google Pixel 7. This is a capable and powerful mobile CPU, although with a Tensor G3 expected in a few months (maybe in the Pixel 8) we have to wonder why Google's first foldable didn't get what's set to be Google's most cutting-edge silicon.
In general, though, there's almost no evidence that the chip is slowing anything down. Every game, app, and web operation I performed was smooth and instantaneous. Photo-editing operations and tricks like Long Exposure took a beat to render, though. Perhaps that's down to the G2, or maybe that's how long the likes of Magic Eraser and Long Exposure would take on any mobile platform.
The Google Pixel Fold has no trouble running action games (Image credit: Future / Lance Ulanoff)
Google pairs the Tensor G2 with a healthy 12GB RAM and its Titan M2 security coprocessor.
Benchmark scores put the Pixel Fold slightly behind the Galaxy Z Fold 4 and its Qualcomm Snapdragon 8+ Gen 1, while gaming benchmarks, specifically the ones that look at frames per second, put it somewhat behind Qualcomm's latest chips. However, in my gameplay experience across Asphalt 9: Legends and Call of Duty Mobile, I didn't notice a difference. There was no stuttering or tearing, and everything looked great and was highly responsive, so much so that I was MVP during my first round of Call of Duty.
This is also a 5G phone, though without a test SIM I wasn't able to test its cellular operations. It also supports Wi-Fi 6E, which means I had fast and reliable connections at home and in the office.
As for audio performance, there are stereo speakers that can go pretty loud – and immersive, thanks to spatial audio support – without any distortion. The three microphones, meanwhile, are so sensitive that when I barely whispered "Hey, Google…" the phone heard me and awaited my instructions.
Performance score: 4.5/5
Google Pixel Fold software
Android 13
Google knows how to fold
Seamless multitasking
Image 1 of 3
The Google Pixel Fold can multitask, with apps interacting (Image credit: Future / Lance Ulanoff)
Image 2 of 3
Running the camera alongside another app is no problem (Image credit: Future / Lance Ulanoff)
Image 3 of 3
Tabletop mode made video conferencing easy (Image credit: Future / Lance Ulanoff)
What matters here, though, is not the speeds and feeds of this phone but, for me at least, how Google's first foldable uses Android 13, and the Fold's small outside and big inside screens, to maximum effect.
Many of Google's core apps, like Maps, Gmail, Photos, Home, and Drive, have been redesigned for the folding-screen environment (as have some third-party ones like Netflix). Mail, for instance, converts from a single-column experience on the cover screen to a dual column on the main screen that puts your mail list on the left and opens each email in a pane on the right. It's all smart and, honestly, what you would expect.
Multitasking is a strong suit here. As I mentioned, it's easy to drag and drop one app to open alongside another on the main screen, although I do wish I could add a third app on top of those two.
When you have two apps open side-by-side you can drag and drop between them. I opened Gmail and Google Photos, and to add a photo to an email I was composing I simply tapped and held my finger on the image until a little thumbnail appeared, then dragged it over to the compose screen on the left. Nothing could be easier.
The best way to describe my overall experience with the Google Pixel Fold software environment is that it was a pleasant surprise. Everything looks so good, and works so well together.
Software score: 4.5/5
Google Pixel Fold battery life
4,727mAh
Lasted 15 hours
Supply your own charging adapter
Extreme Battery Saver mode can extend the life of a charge, but you do lose the use of some apps and notifications (Image credit: Future / Lance Ulanoff)
I did what I could to stress-test the Pixel Fold's ample 4,727mAh battery, pushing screen brightness to max, not letting the screen sleep before 30 minutes had elapsed, and playing action games, watching videos, browsing the web and holding multiple, lengthy video conference calls (colleagues said I sounded good, but looked a little less sharp than I normally do through my MacBook Air (M2) FaceTime camera).
After wirelessly charging the Pixel Fold on my Qi charging base, I grabbed the phone at 7am and used it almost continuously until 10pm when it ran out of juice. I did not, when it prompted me at 10% battery life, let it switch to Extreme Battery Saver mode because that would have paused my apps.
Battery score: 4.5/5
Google Pixel Fold score card
Should I buy the Google Pixel Fold?
Google Pixel Fold (Image credit: Future / Lance Ulanoff)
Buy it if...
Don't buy it if...
Google Pixel Fold review: also consider
Samsung Galaxy Z Fold 4 The Galaxy Z Fold 4 is a do-everything device that presents few compromises, and it's great for photography, multitasking, and watching Netflix. However, the high price might put off some potential buyers.
Motorola Razr Plus The Motorola Razr Plus / Razr 40 Ultra is a major evolutionary step for smartphones, going beyond what any previous flip or foldable phone has offered.
I embarked on an entertaining walking tour through New York's Central Park with a test device, during which I took lots of photos, and carried out an additional five days of testing with my Google-provided Pixel Fold test unit.
I carried the Fold with me every day, and used it as often as possible, including on the train, where I tethered it to my iPhone 14 Pro. I shot photos in a variety of environments and situations, and edited the photos with available tools on the device.
While I spent a lot of time using productivity and information apps on the Pixel Fold, I have to admit that I spent an almost equal amount of time playing games and watching videos. It's just such a fun device to use – there's nothing like having a tablet hidden in your pocket.
We ran Geekbench 6 and other benchmarks on the phone at Future Labs, and I combined that information with my anecdotal performance results.