With Sony’s PlayStation 5 supporting a handful of SSD types, PC and console gamers alike have more choices when it comes to expandable storage, and the Patriot Viper VP4300 comes with a lot to recommend it.
The Viper VP4300 SSD utilizes PCIe Gen4 x4 NVMe technology and includes a DDR4 DRAM cache. It offers two heat shield options, aluminum and graphene, both available on the 1TB and 2TB variants. Patriot promises sequential read speeds of up to 7,400MB/s and sequential write speeds of up to 6,800MB/s, and this is borne out in my testing.
The 2TB SKU we got in for review has a US MSRP of $189.99 (about £155/AU$270), which isn't cheap, but few, if any, of the best SSD models that offer this kind of performance will be any cheaper right now. The 1TB SKU comes in much cheaper at $119.99 (about £100/AU$168), so if you're on something of a budget, you do have some options here.
Plus, there’s so much to appreciate with the Viper VP4300 SSD that it's easily the best M.2 SSD for gamers who might want to use it in their PC or PS5, making it a worthwhile investment.
Whatever gaming machine you're buying it for, it'll work: the graphene heat shield will help keep things cool inside your PS5, while the aluminum heat shield will do the same in your PC.
When it comes to PC gaming, the drive’s performance is respectable, though there were some weak spots, like its lower PassMark Disk benchmark score. Its CrystalDiskMark 8 scores were excellent and in line with the promised speeds and expectations for a drive like this.
This means it’s speedy when it comes to tasks like installations or copying, saving, and transferring files, and my day-to-day experience with it indicates that some, though not all, of the anomalous scores we got during benchmarking were indeed outliers.
However, PC gamers should know that there are definitely faster SSD choices out there, especially if you have a PCIe 5.0-capable system.
VR games, for example, are notorious for long load times on PC, so the Viper VP4300's lower observed read speeds could mean longer waits in these cases. Even less visually demanding games like SuperHot VR and Cooking Simulator VR took nearly a full minute to get from SteamVR launching to the main menu screen.
More traditional non-VR games were affected by the lower read speeds as well. Alan Wake 2 and Cyberpunk 2077 took a bit longer than usual to load from startup to main menu, but weren’t annoyingly slow. Even the initial load from the main menu to the most recent checkpoint took a little more time.
On the other hand, the Viper VP4300 may be great for gamers who are also creatives since export times to the drive in Adobe Premiere Pro were very zippy.
One huge positive in the Viper VP4300's column is its 2000TBW endurance rating, in addition to its standard five-year warranty. This means that, theoretically, PC gamers who blow through their 2TB of storage can buy a bigger replacement drive and move the Viper VP4300 into their PS5. The two heat shield options add to those longer-lasting capabilities by keeping temperatures in check.
Benchmarks
Here's how the Patriot Viper VP4300 performed in our benchmark tests:
CrystalDiskMark sequential: 7,389MB/s read / 6,799MB/s write
CrystalDiskMark random Q32: 4,459MB/s read / 3,805MB/s write
Second 25GB file copy: 16 seconds
25GB file transfer rate: 1,677MB/s
PCMark 10 SSD overall: 2,660
PCMark 10 SSD memory bandwidth: 323.93MB/s
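For context, that file transfer rate is simply the copy size divided by the elapsed time. The sketch below (the function name is my own, purely illustrative) shows the arithmetic; the small gap between the computed 1,600MB/s and the reported 1,677MB/s suggests the measured time was slightly under the rounded 16 seconds.

```python
# Rough sketch: deriving an average transfer rate from a timed file copy.
def transfer_rate_mb_s(size_gb: float, seconds: float, binary: bool = True) -> float:
    """Average throughput in MB/s for a file of size_gb copied in `seconds`."""
    megabytes = size_gb * (1024 if binary else 1000)
    return megabytes / seconds

print(transfer_rate_mb_s(25, 16))  # 1600.0
```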
Our review unit came with both the aluminum and graphene heat shields, though they definitely aren't hot-swappable. During testing, the Viper VP4300 got as hot as 57 degrees C when gaming, and the aluminum heat shield poked out a bit.
The graphene heat shield looks a bit better and leaves a smaller profile, which is especially useful for devices like laptops or the PS5. More so than gaming performance, it’s clear that the Patriot Viper VP4300's real niche is its endurance.
While its read speeds don't top the charts, the Viper VP4300’s respectable performance, especially in write-intensive tasks, and compatibility with PS5 make it a versatile option that any gamer should consider. Additionally, its robust 2000TBW endurance and five-year warranty underscore its longevity, making the Viper VP4300 a valuable investment for gamers and creatives seeking reliable, high-performance storage.
Should you buy the Patriot Viper VP4300?
Buy the Patriot Viper VP4300 if...
You want an SSD compatible with Sony’s PS5
PC gamers and PlayStation 5 owners in need of additional storage will find a viable option here.

You require an SSD that’ll last a while
With a 2000TBW endurance rating and a five-year warranty, this SSD is going to last a long time.
Don't buy it if...
You want the absolute best in gaming performance
The lower read benchmark results mean loading performance may not be up to snuff compared to rival SSDs available around the same price.

You need an SSD that’s affordable
The 2TB version of the Patriot Viper VP4300 is $189.99, which many may find expensive compared to rivals that offer similar or better performance.
Patriot Viper VP4300: Also consider
If my Patriot Viper VP4300 review has you looking for other options, here are two more SSDs to consider...
First reviewed January 2024
Asus ROG Harpe Ace Aim Lab Edition: Two-minute review
The Asus ROG Harpe Ace Aim Lab Edition doesn’t come cheap, but it absolutely nails all the important features for a premium wireless gaming mouse. Its ambidextrous symmetrical form factor is streamlined and comfortable, making it ideal for longer play sessions or competitive settings. The responsive micro switches deliver clean, satisfying clicks that easily keep pace with even the fastest moments of first-person shooter titles like Counter-Strike 2 or Call of Duty: Modern Warfare III.
It’s a very versatile mouse too, with support for USB-C wired play in addition to wireless connectivity via Bluetooth and the proprietary 2.4 GHz Asus ROG Omni Receiver. Although the report rate of up to 1,000 Hz isn’t the highest on the market, it's more than enough for competitive use and I didn’t experience any noticeable latency using any of the three connection types during my testing.
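For context on why 1,000 Hz is plenty for competitive use: the polling (report) rate translates directly into a worst-case report interval. The helper below is my own illustrative sketch of that relationship.

```python
# Illustrative: a mouse polling rate in Hz maps to one report every 1000/Hz ms.
def report_interval_ms(polling_hz: int) -> float:
    """Worst-case delay (in ms) before the next position/click report."""
    return 1000.0 / polling_hz

print(report_interval_ms(1000))  # 1.0 ms between reports
print(report_interval_ms(8000))  # 0.125 ms
```

Even at 1,000 Hz, the added delay is at most a millisecond, well below what most players can perceive.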
Even so, the inclusion of a dongle extender in the package is a great added bonus, as it allows you to clip the ROG Omni Receiver directly to your mousepad to further reduce the chance of latency affecting your aim. This is especially important given the clear focus on esports, where many pros tend to veer towards wired gaming mice for their reliability.
There are two color options to choose from: black and white. In addition to a ROG logo on the palm rest, both sport two baby blue side buttons that inject a little color to help give an otherwise utilitarian design some element of personality. The RGB scroll wheel, while nothing ground-breaking, adds a further element of customizability thanks to the option to illuminate it in a variety of basic colors and modes. It even alerts you when the battery is running low, which should help prevent you from ever being caught out by a depleted mouse in the middle of a match.
There’s some optional grip tape for the mouse buttons and sides too, though you’re unlikely to really need it given the grippy, premium-feeling plastic used across the mouse. The side also sports grooves (alongside a very subtle Aim Lab logo decal) which prevents the mouse from ever feeling slippery in the hands.
The Asus ROG Harpe Ace Aim Lab Edition certainly looks premium, but it also packs some seriously high-end specs under the hood. The first area where this mouse really raises the bar is its sensitivity, which is a staggeringly high 36,000 DPI. It goes without saying, but this will be more than enough to satisfy even the twitchiest competitive gamer and should comfortably make this model worth considering if sensitivity is your number one priority.
That’s not all it has to offer, though, as the ultra-light 1.90oz (54g) weight is very impressive too. This is lighter than even the Logitech G Pro X Superlight 2 Lightspeed, which comes in at 2.12oz (60g), and makes for a mouse that is not only very easy to transport but can glide smoothly across most surfaces with little friction or fatigue. There are still lighter mice around, like the wired 1.66oz (46g) Asus TUF M4 Air, but it's quite remarkable to see a wireless option this light.
There is also the matter of the Asus ROG Harpe Ace Aim Lab Edition’s namesake: its compatibility with Aimlabs. For those not in the know, Aimlabs is a freemium aim trainer program that is a popular choice for training among competitive FPS players. Aimlabs sees you undergo a series of short exercises, namely clicking on various targets in blank environments, in order to evaluate your overall performance. The Asus ROG Harpe Ace Aim Lab Edition is able to automatically find your ideal mouse settings based on these tests, even saving them to a special profile for easy access.
Although most esports competitors will likely already know their own preferred settings, this is still a brilliant addition that could prove genuinely game-changing for those not quite at that professional level. It is a shame, though, that the features offered by the compatible ROG Armoury Crate software aren’t so strong. All the basics like the option to change your DPI, map buttons, or calibrate your mouse are at least here, but the lack of premium additions like the ability to download profiles from the internet leaves it lagging behind the offerings from leading esports brands like Zowie.
The only other real issue is the placement of the DPI button which is, unfortunately, on the bottom of the mouse. While this might not be an issue for some, I am someone who enjoys creating specific profiles for different games and alternating between them quite frequently so having more limited access to the switch is a little annoying.
Asus ROG Harpe Ace Aim Lab Edition: Price & Availability
How much does it cost? $139.99 / £139.99 / AU$189
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
Coming in at $139.99 / £139.99 / AU$189, the Asus ROG Harpe Ace Aim Lab Edition is very much a high-end product. Even so, it is less expensive than current esports favorites like the Logitech G Pro X Superlight 2 Lightspeed, which costs $159 / £149 / AU$299, and it features better specs in most important areas, including DPI and weight.
The real question here is whether it’s worth actually getting a wireless gaming mouse for esports in the first place. The wired Razer DeathAdder V3 features a higher report rate of 8,000 Hz, a 2.08oz (59g) weight, and a very sensitive 30,000 DPI, all for just $69.99 / £69.99 / around AU$99. Considering how many esports pros use wired mice, it might be worth weighing up just how much of a premium you’re willing to pay to cut out the cord. If you are shopping exclusively for wireless options, however, this is a very reasonable price for what you are getting.
Asus ROG Harpe Ace Aim Lab Edition: Specs
Should you buy the Asus ROG Harpe Ace Aim Lab Edition?
Buy it if...
Don’t buy it if…
Asus ROG Harpe Ace Aim Lab Edition: Also consider
How I tested the Asus ROG Harpe Ace Aim Lab Edition
Tested for over a month
Used for both gaming and productivity
Tested with leading esports titles
I spent over a month using the Asus ROG Harpe Ace Aim Lab Edition every day. This included standard productivity tasks, plenty of internet browsing, and of course playing lots of different games. I was careful to test the mouse with fast-paced esports FPS titles, including Counter-Strike 2, Valorant, and Call of Duty: Modern Warfare III.
That said, I also used the mouse for plenty of other games including Anno 1800, The Sims 4, and The Caligula Effect 2, where I found that it performed well in a range of genres. I predominantly used the mouse with its wireless receiver, but was sure to test the other connectivity modes too. I experimented with both the ROG Armoury Crate software and Aimlabs to try out all of the available features.
As a hardware writer over at TechRadar's sister site TRGaming, I have plenty of experience going hands-on with all kinds of gaming peripherals every day. I’m also a pretty avid FPS player and have played lots of games using a variety of different mice over the years.
The Dell XPS line has been the gold standard among laptops for some time now. Though some models stumble a bit, at least in relation to the heights of the best versions of the XPS, these laptops typically ooze quality and elegance.
The Dell XPS 17 9730 reviewed here certainly does that. Of course, you miss out on the portability that makes the smaller versions such perennial members of our best Ultrabooks guide; this is on the heavier, bulkier side.
However, if you don’t need something that you can easily throw in a backpack for on-the-go work, the Dell XPS 17 is among the best laptops for its performance – including the fact that it can handle editing work and gaming – and elegant design. It also comes with a gorgeous screen, especially if you upgrade to the UHD+ resolution that our review unit sports.
If you’re looking for a larger laptop with more screen real estate with plenty of power, you can’t go wrong with the XPS 17 9730. Just be prepared to pay for it.
Dell XPS 17 9730: Price & availability
How much does it cost? Starting at $1,599 / £2,698.99 / AU$4,398.90
Where is it available? The RTX 4050 model is only available in the US; more expensive models are available worldwide
Though the Dell XPS 17 isn’t technically an Ultrabook, it comes from Ultrabook stock. After all, the Dell XPS 13 is the standard bearer for the category. It’s no wonder then that the Dell XPS 17 comes with the kind of premium price tag that these types of laptops come with.
Of course, part of that is the fact that even the base configuration, which goes for $1,599, comes with some powerful specs including an Nvidia GeForce RTX 4050. Still, the price of entry is nowhere near budget. Also, it seems that the model with an RTX 4050 is only available in the US - for UK and Australian readers, the base model starts with the more powerful - yet also more expensive - RTX 4060 GPU.
In Australia, the base model comes with a 13th gen i7 processor, RTX 4060 GPU, 16GB of DDR5 RAM and 512GB SSD storage for AU$4,298.80.
In the UK, the base model is more expensive, and while it mostly matches the specs of the Australian Dell XPS 17's base model, it starts with 32GB of DDR5 RAM and 1TB of SSD storage for £2,698.99.
You’ll have to spend even more if you want the review unit with its 32GB of RAM, slightly more powerful RTX 4060, and UHD+ screen. Specifically, you’ll have to spend $1,949 / £3,099 (about AU$2,990).
Now, the best 17-inch laptops usually aren't cheap. You can save a little money on an LG Gram if you don’t need all that power and want something a little more lightweight and portable. But, you’re still spending around $1,400 on one.
If you’re okay with a more gaming aesthetic and want some power to go along with that large screen, there are some other budget-ish options, such as the Acer Nitro 17. Its starting price of $1,249.99 (around £980 / AU$1,860) offers a bit of a saving, though you won’t end up with quite as elegant of a computer.
Price score: 4 / 5
Dell XPS 17 9730: Specs
The base configuration of the Dell XPS 17 is already pretty powerful with a 13th-Gen Intel Core i7, an Nvidia GeForce RTX 4050, 16GB of RAM and a 512GB SSD (in the US, at least). But, there’s plenty of customization to add even more power and/or storage. You can also choose a more powerful i9 CPU, up to an RTX 4080 GPU, and up to 64GB of RAM. You can even upgrade to an 8TB SSD (technically two 4TB SSDs).
Beyond the internal specs, you can also choose between two different panels: a more basic non-touch 1920 x 1200 screen, or the 3840 x 2400 touchscreen reviewed here.
Specs score: 5 / 5
Dell XPS 17 9730: Design
It’s very large
Powerful components inside
Limited amount of ports, but they’re versatile
As you would expect with a 17-inch laptop, the Dell XPS 17 is large. And, unlike some models such as the LG Gram, it owns it instead of trying to balance that with portability. It weighs over five pounds to start, which, while technically portable, is not the kind of weight you want to carry around all day if you’re hopping from coffee shop to coffee shop.
The Dell XPS 17 is not for that person. Instead, it offers the kind of components that typically can’t fit in those smaller models. So, it comes with a dedicated GPU (you can choose between the Nvidia GeForce RTX 4050, 4060, 4070, and 4080) and you can also max out specs that you otherwise couldn’t with a smaller (and thinner) laptop, as you can upgrade to up to 64GB of RAM and 8TB of storage.
Since this is a Dell XPS laptop, it’s also a gorgeous computer, with a machined aluminum shell in platinum silver and a black carbon fiber covering around the keyboard. It’s like a BMW version of a laptop.
Since it’s on the bigger side, that means it comes with a large 17-inch screen. While I’ll go into further detail on the display below, I’ll just mention for now that it has 'Infinity edge' bezels, so they’re tiny, and the upgraded UHD+ version reviewed here also has touch capabilities. On that note, the panel feels very high quality when using the touch functionality.
Port-wise, the Dell XPS 17 skews a bit more Ultrabook-ish with just four Thunderbolt ports and an SD card reader. However, all the Thunderbolt ports have power delivery and DisplayPort capabilities, so you can use an adapter to plug into an external display if that monitor doesn’t have USB-C inputs.
Since I’m used to using smaller laptops, the large keyboard and trackpad are a bit of an adjustment. However, they’re also of high quality and don’t create any issues other than being different from what I’m used to. Probably the biggest adjustment is that the keyboard is set further back than I would like. But, again, that’s just personal preference.
Design score: 4.5 / 5
Dell XPS 17 9730: Performance
Powerful performance
Great color accuracy and coverage
Webcam just 720p
Dell XPS 17 9730: Benchmarks
Here's how the Dell XPS 17 9730 performed in our suite of benchmark tests:
As someone who gets their hands on a lot of gaming computers, I’m always surprised when a machine that isn’t really intended for gaming can keep up. So, when booting up the Dell XPS 17, I can honestly say that I was surprised.
Whether you’re a bit shy about your extracurriculars or need a laptop that has the horsepower to handle editing work (within reason), the Dell XPS 17 is more than capable. The review unit tested here is quite powerful with a 13th-Gen Intel Core i7, 32GB of RAM, and an Nvidia GeForce RTX 4060. Of course, you can scale down a little bit to 16GB of RAM and a 4050 GPU. But, you can also go up to 64GB of RAM and a 4080, not to mention an Intel Core i9 CPU.
Frankly, it might be more power than you might need, depending on what you’re considering this for. But, more power is better than not enough, especially when you have a high resolution screen to power. The UHD+ (4K in 16:10 ratio) panel here is sharp, bright, and vibrant and has a Delta-E of 0.24. Color coverage is 188.8% sRGB and 133.7% DCI-P3 as well so you don’t have to worry about accuracy or color gamuts if you want to do some photo or video editing. At the very least, watching the latest streaming series is a pleasure.
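As a rough sketch of just how sharp that UHD+ panel is, pixel density can be derived from the resolution and the diagonal size (the function name here is my own, purely illustrative):

```python
import math

# Illustrative sketch: pixel density (PPI) from resolution and diagonal size.
def ppi(horizontal_px: int, vertical_px: int, diagonal_inches: float) -> float:
    """Pixels per inch along the screen diagonal."""
    return math.hypot(horizontal_px, vertical_px) / diagonal_inches

# The XPS 17's UHD+ panel: 3840 x 2400 on a 17-inch diagonal.
print(round(ppi(3840, 2400, 17)))  # 266
```

At roughly 266 pixels per inch, individual pixels are effectively invisible at normal laptop viewing distances.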
The sound quality is pretty good for a laptop, though don’t believe Dell’s claims that you can mix on this (for any budding musician that’s considering this – get some good speakers).
Interestingly, the only issue I have performance-wise is the fact that Dell only included a 720p webcam.
Performance score: 4.5 / 5
Dell XPS 17 9730: Battery life
Battery life is average
Charges quickly
Considering the fact that the Dell XPS 17 is quite a powerhouse, it’s no surprise that it doesn’t quite perform as well in the battery department as the newest MacBooks or Ultrabooks. In our battery life benchmark, for instance, it lasted just over nine hours. That’s not bad at all, all things considered. Just be aware that there’s a bit of a trade-off for powering the high-res display and GPU.
A little more disconcerting is the fact that it does seem to lose some charge over time when the lid is closed. While I won’t hold that against the XPS 17, it’s something to keep in mind. Since it receives power via Thunderbolt, however, it doesn’t take long to charge back up.
Battery life score: 4 / 5
Should you buy the Dell XPS 17 9730?
Buy it if...
You want power This laptop has some serious power behind it. Whether you want a work laptop that can do some gaming when you’re done or you have to also do some editing work, the Dell XPS 17 is more than capable.
You want a premium looking and feeling laptop True to the Dell XPS name, this laptop exudes elegance. Everything about it looks and feels like a lot of thought went into its design.
Don't buy it if...
You’re trying to save money While its price tag makes sense for its size and power, this is not a cheap computer. If you’re on a budget, there are plenty of other options out there.
You need portability As good as the Dell XPS 17 is, it’s not a portable computer. It’s heavy and a bit bulky, so you should look elsewhere if you need something to constantly take on the go.
Dell XPS 17 9730: Also consider
If our Dell XPS 17 9730 review has you considering other options, here are two laptops to consider...
How I tested the Dell XPS 17 9730
Tested for a couple weeks
Used for regular work as well as gaming
Used regularly unplugged
I used the Dell XPS 17 9730 for a couple weeks for work as well as for play. In particular, I wrote this review on it. I was able to play some demanding games like Battlefield 2042 on it, though with some adjustment to the settings, and spent some time streaming on it as well.
After spending time with the Dell XPS 17 9730, I was impressed by the fact that its power is more on par with a gaming computer than with its Ultrabook competition.
I’ve spent the last few years reviewing tech gear for gaming and otherwise, where I’ve gotten a feel for what to look for and how to put a piece of kit through its paces to see whether it’s worth the recommendation.
Google has replaced its Google Assistant with a new AI-based tool, Gemini, at least for those of us in the US daring enough to download the new app. I tried Gemini on my Pixel 8 Pro, testing it side-by-side against the older Google Assistant on my OnePlus 12. The experience is changing very quickly, and features that didn’t work yesterday may suddenly work tomorrow. Overall, Gemini is trying to be something very different from Assistant, without removing the features I’ve grown to rely upon.
Google Gemini hands-on: Design
It was obvious from the first time I opened Gemini that it’s trying a different approach. While Google Assistant asks “Hi, how can I help?,” Gemini posits “What would you like to do today?” Assistant waits for me to start speaking. Gemini listens, and also has a prompt below the question telling me to type, talk, or share a photo.
When I started using Gemini a week ago, there were many things it couldn’t do that Assistant could handle. Gemini couldn’t control my smart home equipment. It wouldn’t set a reminder. Gemini needed me to press a button after I gave it a command. There were many bugs at first, but in only a week the software has greatly improved. It can control my lights and thermostat, for instance, and its response is now automatic.
If you want more than just a basic Assistant, you can open up the full Gemini app. Up top, Gemini offers suggestions for things to try, with interesting options that change frequently.
Beneath sits a list of your three most recent queries. Gemini keeps track of everything you ask, and since it’s an AI it will also summarize the session and give it a title. You can look at your entire query history and delete an entry, or give it a more appropriate heading. You can also pin your best chat sessions.
Gemini’s hidden strength is its ability to talk to other Google apps. It can replace Google Assistant because it uses Assistant as one of its many tools, along with Maps, Search, and others. You can save a chat session directly to Google Docs, or export it directly to a Gmail message.
If you don’t want to use Gemini as your Assistant replacement whenever you press the Power button or yell “Hey Google,” you can choose Assistant instead in Settings.
Google Gemini hands-on: The Gemini differences
While Google Assistant is just that, an assistant to do things on your phone, Google Gemini is trying to be smarter, more like a human helper with ideas than a cold machine.
For instance, among the suggested activities, Gemini suggests I “Brainstorm team bonding activities for a work retreat,” and offers “Ideas to surprise a friend on their birthday.” When I tap on the birthday ideas option, it adds “concert-loving” friend, which is clever because I can easily replace that with “table-top game loving” or whatever my friends are actually into.
For image generation, the suggestions from Google show the granularity of detail that Gemini can handle. To create a space hedgehog, Google started with a 36-word prompt with verbs, descriptions, and things to avoid in the final image.
Gemini is smart enough to continue a conversation after a prompt. I asked for suggestions for plans in a specific town nearby and it offered four suggestions. I said I liked the fourth option and asked it to expand and it complied, offering more options that were similar. I had no problem referring to previous prompts in a single chat, even if I’d veered off-topic a bit.
So, is Google Gemini the new Google Assistant, or is it an app that runs Assistant on my behalf? Assistant doesn’t have a full screen app, it’s always a pop-up window. Google Gemini starts as a pop-up, and you can open the app to dive into more detail.
Assistant can’t interact with photos like Gemini, though this is still a buggy feature. It would often refuse to help with a photo task, telling me it couldn’t work with images, or it wasn’t yet ready to handle photos of people. Sometimes these photos didn’t include humans, so I’m not sure what caused the error.
On some occasions, Gemini would tell me that it could not interpret an image, and then it would offer me detailed information. I asked about a bird in a photo I’d taken and it told me it couldn’t review the image, then offered me links for info about the Great Cormorant. I expect these bugs will be ironed out soon, but I’m still unclear what Gemini will be able to do with images I upload.
Google Gemini hands-on: Performance
Google Gemini is slow. When I tried the same tasks side-by-side with Google Assistant on my OnePlus 12, Assistant always finished first. That could be the faster Snapdragon 8 Gen 3 processor in the OnePlus 12, but I suspect there are bottlenecks slowing down Gemini. After all, Gemini isn’t replacing Assistant, it’s using Assistant, so that creates an extra step.
That said, there aren’t many tasks for which I need Gemini to respond with great haste. If I’m asking for weekend plans, I can wait an extra ten seconds for a good answer. If I’m turning off all the lights in my house, the longer pause is annoying.
The Gemini results can be impressive, and Gemini can expand or adapt its answers. In fact, it always suggests ways it could expand to be more helpful. If I ask for a destination, it might offer a few ideas that are bad and one that’s great, and when I identify the choice I like, it can find similar options. Of course, that’s what a machine does best: matching patterns.
I tried using Gemini to plot a novel about a robbery and it was surprisingly fun. Its suggestions were cliché, but it did a great job offering pathways to expand. After I gave it an initial plot synopsis, it offered to flesh out storylines, create plot twists, and even devise motivation for different character actions.
I’ll keep using and testing Gemini, and it has a lot of room for performance improvement, but the experience is still fun and satisfying, and the results are often worth the wait. The suggestions are not uniformly good, but they are occasionally great.
Google Gemini hands-on: First impressions
What is Gemini for? Approaching a new AI tool, it’s hard to know how to use it. It isn’t really a replacement for Assistant so much as a gatekeeper of all of Google’s apps that provide answers, especially Maps and Search.
Following Google’s suggestions in the app helps open doors. Google suggests using Gemini to help make plans, and that’s what I did most often. I made date night plans, weekend plans, and I’ll be using Gemini to help with a road trip soon. Gemini offered ideas that pointed me in the right direction, even if I didn’t use the options listed.
Google has also created a great tool for brainstorming. Gemini offered its most interesting results at the end, when it suggested ways that I could ask it to expand. There were no one-step conversations. Every query ended with a call to action to go further. I liked that; it was very helpful.
What did I not like? I asked Gemini for a recipe for moist and fluffy muffins and it gave me a recipe but no attribution. An author can’t copyright a recipe, but Gemini didn’t invent muffins, or techniques to make them fluffy. It felt like something was being stolen.
I also didn’t like the faux humanity injected into every response. No matter what I suggested, I got a compliment from Gemini. Sometimes these were subtle words of encouragement, other times it was fawning and embarrassing.
Look, Gemini, I know that you’re a fake computer personality. It doesn’t make me feel good when you tell me I’m very creative and interesting. It’s less believable than when my Mom told me I was the most handsome … you get the idea.
I use Google Assistant often for the basic – timers, weather, and smart home control. Gemini can do all of that, so I won’t stop using Gemini. I’ll also try Gemini for help expanding on ideas and plans. I’m very curious to see how it grows its capabilities with all of the other Google Apps it can control.
Apple has spent almost a decade developing the Vision Pro, and it shows. Everything about it is spectacular, from the exquisite design to brilliant visuals that blend the real with the fantastical, to the versatility that puts other mixed-reality headsets to shame.
The fact that, even after all that work, there are still limitations is frustrating. Sure, I want a sub-1lb/500g headset that somehow integrates the battery; and, of course, I want it to cost $500. The state of the art, even Apple’s bleeding-edge form of it, isn’t there yet. None of that, however, makes me think less of the Vision Pro. It's a stunning achievement in industrial design and technology that translates the inscrutable worlds of AR, VR, and mixed reality into an experience that pretty much anyone can understand and enjoy.
Using your gaze and gestures (finger taps, long pinch, pull) to control a computer is the intuitive technology control you didn’t know you were missing – the millimeter precision is more like what you’d expect from a seasoned OS, not the brand new Vision Pro platform, visionOS, Apple introduced nine months ago. Apple got this right on the first try, and it could become as second-nature as tapping, pinching, and swiping on an iPhone or iPad is today.
As a new computing platform, the Vision Pro is rich with features and possibilities. The fact that it does so many things so well, and that they work and make sense, is a testament to Apple’s efforts. I’ve been marveling at the attention to detail, and at how a bleeding-edge, V1 product can feel so finished and complete. Apple has created a headset that I’m itching to wear almost every day, and if I did nothing but work in it the Vision Pro would transform my life. I’ve long dreamed of having a 150-inch or larger workspace, but I couldn’t imagine how it would be practical or, more importantly, viewable. With the Vision Pro, I get an almost unlimited desktop that makes me want to never return to the confined space of my laptop.
I’ve rarely tested a technology that has moved me in the way the Vision Pro does. Spatial videos are so achingly real that they instantly trigger an emotion that a flat image might not. Being up close with otherworldly or prehistoric creatures that seem to almost see me is at once jarring and thrilling. To pull this off, you need more than great apps, software, developers, and artists; you need a cohesive system that brings it all to life. The Vision Pro does it time and again, with 23 million pixels of imagery, spatial audio that travels the distance from band-bound speakers directly to your ears, and eye-tracking that knows your gaze better than you do.
There are frustrations and missteps, too.
I struggled to find the best fit, and while I can now wear the Vision Pro for hours, my face reminds me afterwards that it’s not built for this. On occasion I still can’t find a fit that doesn’t cause me some niggling discomfort (although the more immersed I get, the less I feel anything).
I don’t mind the external battery, but it doesn’t feel quite up to the task when you want to watch a 3D movie and the power seems to drain at double speed. Thank goodness the battery can be plugged in for continued use.
While I think the pass-through technology that marries the real and computer-generated worlds is among the best I’ve seen, Apple’s attempts to keep you connected to the people in front of you and, say, on FaceTime calls need work. Personas are just this side of creepy, and EyeSight, which shows a video of your eyes to those around you on the exterior screen, looks a bit scary. Then there's the price, which is overwhelming, and will be an instant turnoff for many. I wonder, though, if they might feel differently after their first experience – I’d argue that they will decide they want a Vision Pro, and the only question will be how they can afford it.
Apple Vision Pro: Price and availability
Expensive
Price does not include lens inserts
Not yet available outside the US
Apple announced its Vision Pro headset on June 5, 2023, at WWDC 2023. It's available now in the US, and costs $3,499 for the 256GB model. Preorders opened on January 19, and the headset began shipping on February 2. Availability and pricing for other markets are yet to be confirmed, but Apple says that will follow in 2025.
Value score: 4
Apple Vision Pro: What's in the box
So what do you get for your $3,499 (Apple sent me the 1TB version, which costs $3,899 – you can also opt for a $3,699 512GB headset)? Essentially, in the box is everything you need to put on and start using the Vision Pro. In order of importance:
There’s the Vision Pro system
A battery with attached cable
USB-C charge cable and 30W adapter
The Solo Knit Band
A Dual Loop Band
Two Light Seal cushions
A fabric cover
A polishing cloth
The only thing that's not included, and which you might need, as I did, is a pair of Zeiss prescription lens inserts. These will run you $99 for reading-glass lenses, and $149 for full prescription lenses, which is what I need. The Vision Pro might be unusable for those with particular sight issues – Apple can let you know upfront if that's likely to be the case.
It all arrives in a large white box that has all the hallmarks of containing a high-end Apple product.
You can buy an optional carrying case for $199, and, considering that you just dropped almost $3,500, I think it’s money well spent, although some might argue that Apple should include the case with the expensive headset. Apple sent me the case; it's compact, and has storage space for everything I mentioned above, and I think the hard-shell, soft-surface body will do wonders to protect your expensive new toy.
Apple Vision Pro: Specs
The headgear is, in some ways, typical goggle size: it's roughly six inches wide, almost four inches deep from the edge of the Light Seal to the front of the glass, and almost four inches tall.
Perhaps the most important spec of all, though, is the weight. Depending on which bands you use, the Vision Pro clocks in at 1.3 to 1.4lbs, or 600 to 650g. The external battery, which I kept either in my pocket, on the table, or on the couch next to me (later I got a nice $49.95 Belkin Case, so I could attach it to my belt), weighs just over three-quarters of a pound, or around 350g. Thank goodness Apple opted to not integrate the battery in its first mixed-reality headset.
Apple Vision Pro: Performance
Powerful, proven M2 chip
R1 appears to take the visual load
Never a lag
Could do with more base storage
Apple’s Vision Pro works as well as it does not only because of the design and remarkably intuitive interface. As always, it’s what’s inside that really counts.
At its heart is the M2 chip, a second-generation piece of Apple silicon, featuring an 8-core CPU and a 10-core GPU that powers, for instance, the most recent MacBook Air (MacBook Pros now have M3 chips). In my experience, this is a powerful chip that’s well qualified to handle the demands of virtual and augmented reality.
However, the M2 has the support of a new piece of Apple silicon, the R1 chip, which appears to be primarily in charge of managing those 4K screens, and maintaining a 12-millisecond image update so that what you see is clear and butter-smooth.
The Vision Pro’s dual 4K micro-OLED displays with 23 million pixels are also industry-leading. My experience with the headset is that I get crystal-clear imagery wherever I look, and at whatever size I make the screens.
Apple has placed the stereo audio pod headphones on the stiff white rubber stems that feed into the Vision Pro frame, but they provide excellent spatial audio that’s arguably as good as anything you might get with in-ear buds. If you place an app on your virtual left, the audio will come from that space. At one point, I took a screen that was playing video and slowly dragged it from one side of my space to the other, and the audio tracked perfectly from my left side to the middle and then to the right. There’s also a six-microphone array that does a good job of picking up “Siri” commands. I love summoning Siri because, in the Vision Pro, the digital assistant is embodied as a small floating glass orb.
There are numerous cameras arrayed on the outside and inside of the Vision Pro. The two on the front handle the stereoscopic imagery that you can see and capture in photos and videos that you’ll play back later, and sometimes cry over.
The Vision Pro supports Wi-Fi 6 (though oddly not Wi-Fi 7, which in a few years will be ubiquitous) and Bluetooth 5.3, the latter of which allowed me to easily connect the Vision Pro to an Xbox controller, Magic Keyboard, and Magic Trackpad.
The Vision Pro also displays a virtual keyboard that you can poke at in space. This can be resized and positioned to your liking, but the lack of physical and tactile feedback makes it difficult, at best, to use. What’s worse is that because you may not be looking at the keyboard when you type, the gaze control system can’t really help you with accuracy. To be fair, I’m a terrible typist and do often look at the keyboard when typing. Ultimately, if I want to type within visionOS while wearing the headset, I find that using the Magic Keyboard Apple sent me is best (they also sent a Magic Mouse that can work across both visionOS and my connected MacBook Pro’s virtual display).
Apple provided me with a 1TB Vision Pro test unit, which retails for $3,899. The base model, which costs $3,499, starts with just 256GB of storage. Apple pitches the Vision Pro as the first “spatial computer,” which makes me wonder why it didn’t start with at least a half-terabyte of storage as standard.
Performance score: 4.5
Apple Vision Pro: Design
Exquisite materials and build
Good looks hide impressive specs
Not that light
External battery
I know people joke about the Vision Pro looking like a pair of hyped-up snow goggles. Maybe so, but I’d argue there’s not one false note in this design. Apple, as it often does, chose the best materials, with a particular focus on weight: a mix of an aluminum body, magnesium, carbon fiber, and woven fabric. It’s virtually all curved, with a gleaming glass front that protects the screen marrying almost seamlessly with the body.
There’s a lot of technology packed into the Vision Pro, but Apple has made some effort to hide it. There are cameras behind the glass, and a set of them along the bottom edge that watch your face and hands. The AudioPod speakers that deliver near-perfect spatial audio are hidden behind stiff, white rubber.
Along one side of the top edge is a familiar piece of technology: the Digital Crown. When I first saw this, I wondered why Apple would pull an idea from Apple Watch and slap it on its newest device. But Apple considers the Vision Pro to be a wearable, so there’s some logical continuity here. Plus, the Digital Crown turns out to be one incredibly useful and important piece of Vision Pro technology.
The 'Top Button' on the top left side, which is mostly used for Spatial Photos and Video, gets far less use.
There are some large vents on the top of the frame, and a grille on the bottom, and these seem to work together to draw air in and flow it up and away from your face. The Vision Pro does get warm, but never so much so that it’s uncomfortable.
The Vision Pro and the Light Seal both curve to meet your face, but the crucial bit that makes this design work is the foamy Light Seal cushion. This rests on and follows the contours of your face, and it’s about as comfortable as such a thing should be, though on occasion during my testing time I've wondered if it should be either thicker or denser.
Attaching this spatial computer to your head requires a band, or a pair of bands. The default Solo Knit Band is wide, and can be adjusted via a dial to make it tighter or looser. It’s never offered enough support for me, though.
It’s worth pausing here to remind you that the Vision Pro weighs 1.3lbs. That’s not particularly heavy in itself, but a typical set of eyeglasses might weigh 0.08lbs; imagine strapping a pound of ground beef to your head instead. The Vision Pro experience is far more enjoyable than that, but it needs support. In contrast to the Solo Knit Band, the Dual Loop Band supports the headset on the back of your head and across the top of it, and I bet most people will prefer it over the more attractive Solo Knit version.
Design score: 4.5
Apple Vision Pro: Setup
I would not call any part of the Vision Pro setup process complicated or off-putting. There are several steps to go through, but mostly these are to customize the experience. As for assembling it, there’s the matter of which band you choose for comfort, and fitting the right Light Seal (the thinner one is standard, the thicker one is for glasses-wearers who have those Zeiss inserts). You also need to attach the battery cable, which snaps into the right side (if the headset’s glass is facing you) and locks on with a twist – there’s no chance of it popping off. The cable is long enough for you to drop the somewhat hefty battery into your pocket.
The bands snap on and off metal lugs using small orange pull tabs; I usually put the lens cover on, so that I can tip the Vision Pro on its face while I swap bands. Aside from the colorful screen behind the glass, which is just black when the device is off, that orange is the only bit of color pop on the headset.
If you ordered Zeiss prescription lenses, as I did, you’ll have to put them in before using the headset. They, like most other pieces on the Vision Pro (for instance the Light Seal and its cushion) attach magnetically. They’re also labeled, so you won’t get confused about which lens goes on which side. The box comes with a QR code that you’ll use to register the lenses with the headset.
There’s also a QR code moment when you want to sync your Vision Pro with your phone and iCloud account. None of this takes long, or is even remotely confusing.
The rest of the setup occurs mostly with the headset on. Apple fans will appreciate the startup, which includes a floating Apple logo and then the iconic 'Hello' spelled out in 3D scripted letters floating in your real-world space. It’s just a hint of what’s to come.
One of the keys to the Vision Pro’s technical excellence is its ability to track your gaze (along with the dual 4K micro-OLED screens inside the headset are cameras pointed directly at your eyes). So, the setup begins with pressing the crown to get the pupillary distance – that is, the distance between your pupils – right. The screens mechanically shift position to match your pupils.
Next, to ensure that the Vision Pro fully understands where you’re looking, you go through a series of vision calibration tests, during which a circle of dots appears and you look at each one, and pinch your index finger and thumb together. You do this three times in three different environmental light settings. I’d already tried this five times over the last seven months of demos, so I'd gotten quite good at it.
The system also needs to calibrate for your hands. This process consists of holding them up in front of the Vision Pro. You’ll see a faint glow around the outside edges of your digits, and then you flip your hands over and do it again.
While the system may occasionally ask you to reset the pupillary distance by holding down the Digital Crown, you won’t have to perform any other part of the setup again. For Guest Mode, if you want to let someone else try the headset, they’ll go through the vision and hand calibration, but the results will not be kept by the Vision Pro after their session ends.
As I've said, I found this setup simple and quite effective: from that point on, the system has seemed to know me every time I've donned the Vision Pro, and it has worked as well as the time before. The only step I need to follow now to get started is logging in with a PIN. You can also log on with an iris scan (Optic ID) but, while I successfully registered my iris, I could never unlock the Vision Pro with my eyes. I've asked Apple about this issue and will update this review with its reply.
Apple Vision Pro: Software and Interface
Excellent intuitive OS
A true 'think-do' interface
Spatial computing turns the world into an unlimited digital space
You start on the home screen, or Home View, which can be summoned with one press of the Digital Crown. This screen will look somewhat familiar to most Apple fans: there's a grid of preinstalled app icons, many of which match what you’ll find on your iPad or iPhone (Notes, Safari, Photos, Podcasts, Calendar, Mail, Files, etc), horizontally arrayed and floating in your real-world environment. You can use the pinch-and-wave gesture to move through multiple screens of apps. This is where the apps you install will live – you can download many more from the App Store, which now has more than 600 visionOS apps. There’s also access to a 'Compatible Apps' folder, which collects all the iOS and iPadOS apps I installed on the Vision Pro. Apple claims that roughly a million iOS and iPadOS apps already work with the Vision Pro, even though they were not designed for visionOS. It's not hard, though, to find ones, like most Google apps, that don't.
Compatibility is a mixed bag. Sometimes it works perfectly, other times less so. NBA2K got stuck on setup screens and wouldn’t let me access the game. In the case of Paramount Plus, which is not officially designed to work on the headset, it played, but I couldn't expand the live video to a full window. Apps that have not been custom-built for visionOS were the ones most likely to crash.
The Vision Pro defaults to a full pass-through mode, although that’s a misnomer; you’re never looking directly at your surroundings. Instead, the cameras on the front deliver a clear video feed of your surroundings to your eyes. It’s the best way to marry virtual information with reality, and it's very effective.
The Vision Pro is constantly keeping track of your gaze, and your hands and fingers, so as long as your hands are in view of the cameras you can control anything you see simply by looking and pinching. The headset will not register gestures made behind your head, or if you drop your hands down to your side; I usually have my hands in my lap, or raise them in front of my chest to pinch, drag, and zoom. To telegraph my intentions, I simply look at something and then pinch to, for instance, open an app.
Once an app is open, you can move it about the room by grabbing a thin white bar at the bottom of the app’s screen. This isn’t hard to do; you just look at it, pinch to grab, and move it. You can also use this gesture to pull the app window closer or move it further away. On the bottom right corner is a curved bar that you can use to resize from, say, a big-screen TV size to a wall-size app. Inside an app you can look at, for instance, a photo in the Photo app, and use both hands with a pinch gesture to zoom in and out.
While working and playing in apps, you can have them superimposed onto your real world, or select one of Apple’s Environments to change your surroundings. There’s Yosemite, Mount Hood, Joshua Tree, Haleakalā in Hawaii, and even the Moon, and each one is a 360-degree, 'live,' immersive experience, complete with spatial sounds. Once you've chosen an Environment, you can twist the Digital Crown to turn the immersion up or down (this is also how you raise and lower the volume – you just look at the Volume icon instead of the Environments one). Up means the real world increasingly fades, and you’re surrounded 360 degrees by, for instance, the dusty surface of the Moon. Your hands don’t disappear, but rather than resting on your knees they appear to float above the Moon’s surface.
While the Vision Pro can drop you into virtual reality, it’s smart enough to keep you connected to the outside world. When my wife started to speak to me during one immersive session, I could see her gradually appearing in front of me as we talked. On her side, she sees a video feed of my eyes on the front of the headset. Apple calls this EyeSight, and it’s a hit-or-miss affair.
EyeSight shows a video recreation of your eyes based on what the camera sees inside the headset. The color is blue and purplish, and it can look, well, weird. My wife never liked it, but I noticed that she got used to talking to me while I was wearing the headset.
Software and interface score: 5
Apple Vision Pro: The Experience
There's nothing quite like using a Vision Pro
It can be hard to convey the experience to people on the outside
Your work life will change, too
If you can get your fit right (this can take some trial and error) there's nothing quite like the experience of using the Vision Pro. Even when I wasn’t wearing the headset, I found myself thinking about wearing and using it.
It's an able virtual-reality and mixed-reality machine, and I have much more to share there, but its ability to integrate my real-world work environment has been transformative for me.
I’ve spent many hours now working in the Vision Pro. To do so, you use the Control Center, which you access by glancing up until you see a green arrow and pinching to open, to launch the Mac Virtual display. In my case this showed me my MacBook Pro as an option, so I selected this and my desktop appeared before me as a huge 55-inch display. I could expand that to, say, 150 inches (or more), and then pull it close to me – this is the endless desktop I’ve always dreamed of, and none of this would work if every window, app, and bit of text weren’t crystal clear.
Just because I had my work desktop in front of me didn’t mean I lost out on the rest of the Vision Pro experience. On my right, I had Messages open (when a new message arrives, a green message icon floats in front of my face), and, on my left, I had my Photos. Sometimes, I would AirDrop an image (often a Vision Pro screen-grab that I'd captured by saying “Siri, grab a screenshot”) from the Vision Pro to my desktop, and then edit it in Photoshop on my giant virtual display.
You can place apps anywhere you want around you, and not just in one space. Vision Pro understands your space, even different rooms in your home.
One day I wanted to try JigSpace, an app that lets you manipulate and pull apart giant 3D objects like a race car. I wanted ample space to work, so I walked over to my den. The Vision Pro’s pass-through capabilities are good enough that I was never worried about navigating around my home, though I did need to sometimes look down to see what was near my feet. I would not recommend wearing your Vision Pro while walking in the street, where you need to pay attention to curbs and other obstacles.
Once in the den, I opened JigSpace and started pulling apart a jet engine – whatever part I looked at, I could grab and drop somewhere around the room. Then I went back to work.
Hours later I was writing about JigSpace, but wanted to double-check something in the interface. I couldn’t find the app, so on a hunch, I got up and walked back into the den – and there was the app, where I’d left it hours before. What we have here with spatial computing and the Vision Pro is, maybe for the first time, the concept of space-aware computing. I could, perhaps, leave apps all over my house and the Vision Pro would remember where they were. To bring all open apps back into my local view, I just needed to hold down on the Digital Crown.
Experience score: 5
Spatial Computing is the only computing where if you forgot where you put an app, it might be in another room. #VisionPro pic.twitter.com/yzG84JRWmf (February 1, 2024)
Apple Vision Pro: Entertainment
An immersive entertainment experience
Spatial audio support is strong with or without AirPods Pro
Games designed for Vision Pro are often inspired
Compatible games don't always work as expected
I've tried watching movies on other virtual reality headsets, and it just never clicked. The Meta Quest Pro is an excellent device, but I still find it too uncomfortable to wear for more than 20 minutes at a time. There is, however, something special about watching a movie in the Apple Vision Pro, especially if you add in the AirPods Pro 2 (which include support for spatial audio and, thanks to their new H2 chip – also inside the Vision Pro – add support for lossless audio with ultra-low latency).
What makes it work so well isn’t only the near-perfect 3D fidelity (Disney Plus has a particularly excellent library of 3D films, including the trippy Doctor Strange in the Multiverse of Madness), but also the way in which you can immerse yourself in the theater experience with various Environments. The Disney Plus app provides some of my favorites: there’s an Avengers space, a Monsters Inc. Factory, and the Disney Theater, which is probably my top pick overall.
When the lights come down, and you’re staring at a giant 70mm or even IMAX-class virtual screen inside a darkened theater, you can almost smell the popcorn. There’s virtually no light leakage to break the illusion, and I had no trouble losing myself in the experience of watching a two-hour movie. One thing I did notice, though, is that 3D movies can chew through battery life. If you plan on watching, say, the three-hour and 26-minute Killers of the Flower Moon (an Apple production, by the way) you’ll want to plug in the battery using the included cable and charge adapter. They’ll provide all the pass-through power you need.
Marketing for the Vision Pro often shows people sitting down while using it, but, while I've spent most of my time seated, I have played games while standing. Synth Riders is a Vision Pro-ready game with elements of Beat Saber: your hands are orbs, and music-beat-based orbs fly at you that you must bounce back or ride with your hands. I played this while standing, and between waving my arms and ducking glass trapezoids flying at my head, it was quite a workout.
Entertainment score: 4.5
Apple Vision Pro: Spatial Photography
Vision Pro is a strong spatial photography machine
An excellent spatial imagery partner for your iPhone 15 Pro
The spatial photography and videography playback effect is often moving
I love how Apple always manages to cook up new terms for existing technology that somehow manage to capture the imagination of regular people.
Stereophotography is well over a century old, and many boomers and Gen Xers first experienced it in the 1970s with View-Master toys. The effect was okay, though not remotely immersive. With the Vision Pro, Apple has introduced the concept of spatial photography, a 21st-century upgrade of 3D photography and videography that puts 3D image capture in the hands of, or rather on the head of, everyone. Apple even extended the concept by building spatial video capture into the iPhone 15 Pro and iPhone 15 Pro Max.
In some of my earlier Vision Pro demos, I played back spatial video that I'd captured on my iPhone 15 Pro Max on the Vision Pro. This was no View-Master experience: I could view the videos in a floating window, which put them at a certain remove, or use a pinch-and-expand gesture with both my hands to almost enter the video. The edges fade away, so the spatial video looks like it’s floating in a cloud.
The Vision Pro can capture both still and video spatial imagery. To capture, you use the dedicated button on the top-left side of the visor. A single press brings up the option to shoot photos or videos, and after you select one you press the button again to capture a single photo or start recording spatial video (you press it again to end the video capture). I did this with photos of my hand, and then a photoshoot with my son. Every time I view spatial imagery I have an immediate and automatic emotional reaction; when I replayed the spatial content it was like my son was standing before me, complete with his pained expression.
I must hand it to Apple, it doesn’t just invent new terms, it reinvents the experience.
Spatial photography score: 5
Apple Vision Pro: Communication and Personas
Personas can look strange, but they're more useful than you think
Communicating through iMessage and FaceTime is trouble-free
During setup, the Vision Pro will guide you to create a Persona, a 3D rendering of your head that you can use in FaceTime and other video calls. To build mine, I followed the instructions, removed the headset, and then pointed the display at my face. It captured me looking up, down, left, and right, as well as making a few facial expressions. All of this information enabled the spatial cameras to create a 3D map of my face.
When I put the Vision Pro back on, I could see my new Persona, which automatically started mimicking my facial expressions (the cameras inside and the ones pointed down at my face and hands capture my live expression). I added semi-translucent glasses to my Persona, and I was done and ready for a FaceTime call with my wife. She hated it. Even though I think my Persona is one of the better ones out there, I can’t deny the uncanny valley look of it.
Later, I conducted a call with a colleague who was also testing the Vision Pro. We both remarked on the limitations of our Personas, but throughout our 20-minute conversation those concerns faded away, and I forgot that we weren’t looking at either our real faces or our real hands. I’m still convinced that Apple can do better here, but then that’s why Personas are still in beta. By the time you finally decide to buy a Vision Pro (or some version of it), I expect Personas to be much more realistic and palatable.
Communication and Personas score: 4
Apple Vision Pro: Final thoughts
The Apple Vision Pro is expensive, but I’m not sure I can argue that it’s too expensive for what it does. Someone asked me if I would buy it. I now know that if I could afford it, the answer would be an enthusiastic yes.
There has never been a wearable quite like the Vision Pro, let alone a mixed-reality headset like it. It’s a true 'think-do' platform. It’s powerful, but also inviting. It’s fun to use, but also completely ready for work. It might make you look like a bug, but there’s also beauty in its design.
I wish it were lighter, but still, I can forget I’m wearing it and give myself over to the experience of work, play, or entertainment in a dark, virtual theater.
I love working in the Vision Pro, but I'm aware that if you spend hours working in it, it’s unlikely that you’ll want to keep the headset on at the end of the day. Over time, Vision Pro enthusiasts will likely achieve a balance between work and play, though I’m convinced that the tug of this one-of-a-kind technology will remain strong.
The Apple Vision Pro instantly goes to the top of our list of the best virtual reality headsets. It may not be a best-seller yet, but those who have one will talk about it endlessly, and they may even let you try it. I suggest you take that opportunity if offered, or at least get yourself to an Apple Store for a demo.
There has never been anything quite like the Vision Pro. It's my favorite mixed-reality headset ever, and I’m certain that it has reinvigorated the AR/VR market while also creating something completely new. Spatial computing is a thing. Better get used to it.
Should you buy the Apple Vision Pro?
Buy it if...
Don’t buy it if…
Also consider
How we test
For my Apple Vision Pro review, I spent almost a week, and as many hours each day as I could, wearing and using the mixed-reality headset.
I watched movies, played games, communicated with friends and co-workers, streamed live TV, moved apps around my home, and did a lot of work on my giant MacBook Pro virtual display.
Let’s get the easy part out of the way: I’m a fan of the Acer Swift X14. The short version is that Acer managed to put a powerful CPU and GPU, not to mention a gorgeous OLED screen, into a slim Ultrabook package.
While it doesn’t feel as premium as a MacBook Pro 14, it truly belongs among the best Ultrabooks right now. There are certainly some trade-offs: the price is nowhere near that of the best cheap laptops, and the battery life suffers a little since it has to power an Nvidia graphics card. I find the trackpad annoying to use as well. And, for a device legitimately vying for attention among the best laptops out there, it surprisingly skips out on a Windows Hello-capable webcam.
That said, the pros vastly outweigh the cons, especially if you don’t want to lug around a gaming computer and prefer the experience of using an Ultrabook, but still want gaming-class power, whether that’s for booting up Cyberpunk 2077 or for some photo and video editing. When it comes to competing with the Dell XPSes of the world, the Acer Swift X14 may be one of the most surprising laptops I’ve used.
Acer Swift X14: Price and availability
How much does it cost? Starting at $1,099 (about £870 / AU$1,670)
When is it available? Available now
Where can you get it? Available in the US, UK, and Australia
The Acer Swift X14 is not the most affordable laptop out there. Its most basic configuration is available for $1,099 / AU$2,699 (about £870), which is certainly affordable – this model comes with a 13th-Gen Intel Core i5, a last-generation Nvidia GeForce RTX 3050, a 512GB SSD, and a lower 2560 x 1600 resolution – but it seems to only be available in the US and Australia.
For everyone else, or those wanting a current-gen 4000-series GPU, you’re looking at $1,499 / £1,429 (about AU$2,277). That gets you a faster 13th-Gen Intel Core i7, an Nvidia GeForce RTX 4050, 16GB of RAM, a 1TB SSD, and a 2880 x 1800 screen. And, if you’re in the UK, there’s a slight variation: you can pay £170 more for 32GB of RAM.
The Acer Swift X14 isn’t the only light and thin laptop to come with a powerful GPU. The Samsung Galaxy Book3 Ultra covers a lot of the same ground. In fact, our Galaxy Book3 Ultra review unit, which is the base model, has the same specs as the Acer Swift X14 including the screen (well, it’s AMOLED vs OLED), but goes for a much pricier $1,799.99 / £2,649 / around AU$4,875. Of course, you can pay even more – $2,399.99 / £3,049 / around AU$5,610 – for a configuration with a 13th-Gen Intel Core i9 and Nvidia GeForce RTX 4070.
That said, many Ultrabooks come with that premium price tag without the kind of hardware to keep up with a gaming laptop. For instance, as great as its performance is, the Lenovo Yoga 9i Gen 8 only has Intel Iris Xe graphics and goes for $1,399.99 / £1,440. At least it has that same OLED screen with HDR.
Value: 4 / 5
Acer Swift X14: Specs
There are basically two configurations of the Acer Swift X14. The more affordable one isn’t available in the UK, and comes with a 13th-Gen Intel Core i5, a last-generation Nvidia GeForce RTX 3050, and 512GB SSD.
The more expensive configuration that we've reviewed here upgrades the CPU to an i7, the GPU to a 4050, and the SSD to 1TB of storage. And, in the UK, you can spend a little more for 32GB of RAM instead of 16GB.
Beyond the internal components, there aren’t any additional variations – no alternate colorways – except for the screen: if you go with the cheaper model, you get a slightly lower resolution (2560 x 1600).
Acer Swift X14: Design
Gorgeous display with HDR and accurate colors
Trackpad has issues with dragging and dropping
Webcam doesn’t support Windows Hello, but fingerprint reader does
The Acer Swift X14, like most Ultrabooks, comes in an elegant if discreet shade of gray called 'Steel Gray'. It doesn’t quite set itself apart from the pack visually, but it certainly looks good and is light and diminutive enough for easy on-the-go computing.
The display is probably the most impressive outward-facing feature on this laptop as the 14.5-inch OLED screen comes with a sharp 2.8K (2880 x 1800) resolution that runs natively at 120Hz for smoother results. Plus, it comes with Vesa Certified Display HDR True Black 500 to really make the colors pop.
The colors are definitely impressive. Not only is the panel incredibly accurate, measured at a Delta E of 0.09, but it has fantastic color coverage, making this laptop more than good enough for video and photo editing. Specifically, it covers 195% of sRGB and 138.1% of DCI-P3.
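For context, Delta E measures how far a displayed color lands from its reference in CIELAB space, where values under about 1.0 are generally considered imperceptible. Here's a minimal sketch of the classic CIE76 formula with made-up Lab values; this is illustrative only, not necessarily the exact formula or data our colorimeter software uses:

```python
# Delta E (CIE76): Euclidean distance between two colors in CIELAB space.
# The Lab values below are hypothetical, chosen only to illustrate the math.
from math import sqrt

def delta_e_cie76(lab1, lab2):
    """Distance between two (L, a, b) colors; lower means more accurate."""
    return sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

target   = (50.0, 20.0, -10.0)    # hypothetical reference color
measured = (50.05, 20.03, -10.06) # what the display actually showed

print(f"Delta E: {delta_e_cie76(target, measured):.2f}")
```

A reading of 0.09 averaged across test patches, as measured here, means the panel's colors are essentially indistinguishable from the reference.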
The keyboard is good enough, though I wouldn’t consider it to be the most comfortable I’ve ever used. The trackpad, however, gave me some issues. It’s nice to the touch and moving the cursor around is smooth, but the trackpad didn’t seem to want to cooperate when dragging and dropping unless I had my fingers positioned very accurately.
At least the port selection is robust enough for an Ultrabook with two USB-C ports (that are also Thunderbolt 4 / USB 4), two USB-A, an HDMI, and a microSD reader, along with the requisite headphone jack. Those worried about security will be happy to note that there’s a Kensington lock as well.
Beyond that, there’s a 1080p webcam that unfortunately doesn’t support Windows Hello Facial Recognition. However, there’s a fingerprint reader in the power button for that purpose.
One aspect of the Acer Swift X14 that’s a bit hard to pin down, but worth mentioning, is its AI tech. The laptop supports Windows Copilot, AI enhancements for the webcam and mic, and AI acceleration for a number of apps.
Design: 4 / 5
Acer Swift X14: Performance
Fast render scores
Powerful gaming performance
Good thermal performance
Acer Swift X14: Benchmarks
Here's how the Acer Swift X14 performed in our suite of benchmark tests:
Whether that AI acceleration puts the Acer Swift X14 over the top or it’s just a matter of powerful components, the performance of this laptop is a dream. With its 13th-Gen Intel Core i7, Nvidia GeForce RTX 4050, and 16GB of RAM, it’s no wonder that it can easily handle day-to-day work without breaking a sweat.
But, it can handle much more intensive workloads as well. Taking a quick look at the benchmarks, its 3DMark scores are much higher than the Lenovo Yoga 9i Gen 8 mentioned before and its Handbrake score, which measures how quickly a computer can render video, is two minutes faster. This is surely due to that powerful GPU.
I was also able to game capably on this laptop, running titles like Starfield and Gotham Knights on fairly high settings, certainly on par with settings I’ve used on gaming laptops equipped with the Nvidia GeForce RTX 4050.
Since the screen is an OLED panel with HDR as well as that great color coverage and accuracy, images pop and look rich and vibrant.
Really, every aspect of the Acer Swift X14’s performance is to be lauded in my opinion. Even its thermal performance is good, with it only really heating up underneath a bit when pushed.
The webcam is clear and sharp with auto framing, and comes with a feature that can make it look like you’re making eye contact with whomever you’re on a video call with (rather than looking down at the screen).
The audio quality is, as it is with most laptops, passable. It lacks some low-end and can be just a little hollow sounding, but it’s not bad and about what I would expect from a laptop like this.
Performance: 5 / 5
Acer Swift X14: Battery life
Good battery life considering hardware
Fast charging on hand
Since the Acer Swift X14 has to power some robust components, it’s no wonder that it doesn’t have the battery life of a lot of other Ultrabooks.
Make no mistake, a benchmark score of 7:26:37 in the Battery Informant test (though at 60Hz) is pretty good compared to gaming laptops with similar internals, which would be considered to have amazing battery life if they reached the same score. But don’t expect the 15 hours you would get with a MacBook, and if you run this laptop hard, expect that battery to drain pretty quickly.
It charges up quickly when plugged in. However, it doesn’t seem to hold onto its charge as well as it should when the lid is closed, though this is something most Windows laptops don’t do as well as they should.
Battery life: 4 / 5
Should I buy the Acer Swift X14?
Buy it if…
Don't buy it if...
Also consider
If our Acer Swift X14 review has you considering other options, here are two laptops to consider...
How I tested the Acer Swift X14
Tested for a couple weeks
Used for regular work as well as gaming
Used regularly unplugged
I used the Acer Swift X14 for a couple weeks as a work computer. I did a decent amount of writing here, including this review. I also used it to do some gaming to see if it really had what it takes (clearly, it does). I took a look at all the features, not to mention used it regularly to see how it does unplugged.
After spending time with the Acer Swift X14, I was impressed by the fact that its power is more on par with a gaming computer than with its Ultrabook competition.
I’ve spent the last few years reviewing tech gear for gaming and otherwise, where I’ve gotten a feel for what to look for and how to put a piece of kit through its paces to see whether it’s worth the recommendation.
The Nvidia GeForce RTX 4080 wasn't my favorite graphics card - by a long shot - when it was released in late 2022, so in a lot of ways the Nvidia GeForce RTX 4080 Super that's just gone on sale has a pretty easy bar to clear, and it does so by delivering better performance than its non-Super sibling at a much lower price.
To be clear, you are mostly getting a price cut on the RTX 4080 with this card, with some extra performance akin to some well-tuned overclocking. In a different universe, this card could have been released in 2022 and pretty much buried the rival AMD Radeon RX 7900 XTX out of the gate, rather than giving Team Red a chance to own the sub-$1,000 premium tier essentially all on its own.
However, now that the Nvidia RTX 4080 Super is available for a US MSRP of $999.99 (about £800/AU$1,400), the market as it stands has just had a GPU whale-bellyflop its way into it, and there's only so much you can relitigate the past in a review like this.
On its merits, the Nvidia GeForce RTX 4080 Super is essentially the best graphics card you can buy right now, other than the Nvidia GeForce RTX 4090. It provides a roughly 1.4% performance gain (geometric mean) over the Nvidia RTX 4080, with most of that coming in synthetic benchmarks.
On the other side, it manages to outperform the RX 7900 XTX by a geometric average of about 7% in both synthetic and gaming performance, across raster and ray-traced workloads, with graphics upscaling and at native resolutions.
It even comes within 18% overall of the RTX 4090, which is a hell of a thing considering it costs about 40% less at MSRP. In short, if you're an enthusiast looking for the best 4K graphics card but you're not a creative professional who needs the kind of power that an RTX 4090 brings to the table, then the RTX 4080 Super is the perfect card for you.
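The percentage figures above are geometric means of per-benchmark ratios rather than simple averages, since a geomean keeps one outlier benchmark from skewing the overall number. A minimal sketch of the method, using made-up scores (the numbers below are purely illustrative, not our actual test data):

```python
# Geometric-mean comparison of two cards across a benchmark suite.
# Scores are hypothetical; only the method mirrors the review's approach.
from math import prod

def geomean_ratio(card_a, card_b):
    """Geometric mean of the per-benchmark ratios of card_a over card_b."""
    ratios = [a / b for a, b in zip(card_a, card_b)]
    return prod(ratios) ** (1 / len(ratios))

rtx_4080_super = [100, 150, 210]  # invented benchmark scores
rx_7900_xtx    = [95, 140, 195]

advantage = (geomean_ratio(rtx_4080_super, rx_7900_xtx) - 1) * 100
print(f"RTX 4080 Super leads by {advantage:.1f}% (geometric mean)")
```

The same calculation, run in the other direction against the RTX 4090's results, is where the "within 18% overall" figure comes from.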
If you're lucky enough to pick up a Founders Edition of the RTX 4080 Super, you'll get the same gorgeous black finish that we saw on the Nvidia GeForce RTX 4070 Super, and considering how quickly these cards are probably going to sell out, these might be something of a showpiece for those wanting to flex.
That said, if you're the type of gamer who lives and breathes RGB, then one of Nvidia's AIB partners will likely offer a better card for your build. Expect them to run higher than MSRP, however, which unfortunately negates the key advantage of this card.
There's no doubt that this card is a seriously premium purchase no matter which way you go though, so someone looking for the best cheap graphics card or the best 1440p graphics card to satisfy a midrange price point will still only be window shopping the Nvidia GeForce RTX 4080 Super, but at least with the Super, it might not be so out of your reach as it once was.
Nvidia GeForce RTX 4080 Super: Specs & features
While you are getting more from the Nvidia GeForce RTX 4080 Super in terms of specs (more compute units, faster clock speeds, and faster memory), the TGP of the card is still 320W, so even though you have more of everything, everything ends up just a bit underpowered as a result, leading to a card that is slightly faster in terms of performance, but not by much.
You would need about 336W to power all of the added hardware in the RTX 4080 Super to the same extent that the base RTX 4080's hardware was, so if you have an overclocking target, you know where you need to be. Thankfully, Nvidia does let you overclock its cards, so you do have some headroom if you want to squeeze the most out of those extra shaders.
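That 336W figure falls out of simple linear scaling: take the base card's 320W TGP and scale it by the increase in shader count. The CUDA core counts below are Nvidia's published specs for the two cards (assumed here, not stated in this review), and linear power scaling is of course only a rough estimate:

```python
# Rough estimate of the power needed to feed the RTX 4080 Super's extra
# shaders at the base RTX 4080's per-core budget. Core counts are Nvidia's
# published figures; linear scaling is a simplifying assumption.
RTX_4080_TGP    = 320    # watts
RTX_4080_CORES  = 9728   # CUDA cores, base RTX 4080
RTX_4080S_CORES = 10240  # CUDA cores, RTX 4080 Super

scaled_tgp = RTX_4080_TGP * RTX_4080S_CORES / RTX_4080_CORES
print(f"Equivalent power budget: ~{scaled_tgp:.0f}W")
```

That lands right around the 336W overclocking target mentioned above.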
How much is it? US MSRP listed at $999.99 (about £800, AU$1,400)
When is it out? It was released on January 31, 2024
Where can you get it? You can buy it in the US, UK, and Australia
The Nvidia GeForce RTX 4080 Super is available in the US, UK, and Australia (among other markets) from January 31, 2024, with a US MSRP of $999.99 (about £800 / AU$1,400).
This puts it directly against the AMD Radeon RX 7900 XTX in terms of price, and effectively takes the lead as the best value among the top-tier premium cards. It's also about 17% cheaper than the RTX 4080 it effectively replaces.
Considering that the original launch price of the RTX 4080 essentially knocked 1.5 stars off its final score from me, that $200 price difference really is a big deal at this stage, since it puts enough room between this card and the $1,600 price point of the RTX 4090 to make the RTX 4080 Super worth buying.
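For reference, the "about 17% cheaper" figure is straightforward arithmetic against the RTX 4080's original $1,199.99 launch MSRP, a widely reported figure assumed here rather than stated in this review:

```python
# Price cut from the RTX 4080's launch MSRP to the RTX 4080 Super's MSRP.
# The $1,199.99 launch price is assumed from Nvidia's original announcement.
old_msrp = 1199.99  # RTX 4080 launch MSRP (USD)
new_msrp = 999.99   # RTX 4080 Super MSRP (USD)

cut_pct = (old_msrp - new_msrp) / old_msrp * 100
print(f"Price cut: {cut_pct:.1f}%")  # roughly 16.7%, i.e. "about 17%"
```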
Nvidia GeForce RTX 4080 Super: Performance
In terms of performance, the Nvidia GeForce RTX 4080 Super offers the best bang for your buck of any of the top-tier premium cards, rivaling the performance of even the RTX 4090.
Of course, this was true of the base RTX 4080, but where the RTX 4080 Super really delivers is bringing this level of performance below the $1,000 price point that separates the enthusiast segment from the creative professional class who have company money to throw around.
On synthetic performance, the RTX 4080 Super is second only to the RTX 4090 overall. Where it falters against the RX 7900 XTX is mostly in 3DMark Fire Strike and Time Spy workloads (regular as well as Extreme and Ultra versions), but as expected, whenever ray tracing is involved, the RTX 4080 Super shines.
The RTX 4080 Super also outperforms the RX 7900 XTX when it comes to raw compute performance, so machine learning workloads will run much better on the RTX 4080 Super than the RX 7900 XTX.
In terms of creative performance, the RTX 4080 Super walks away the winner against the RX 7900 XTX, even if you don't factor in the fact that Blender Benchmark 4.0.0 workloads wouldn't even run on the RX 7900 XTX (though the RX 7900 XT was able to run them, just not nearly as well).
The RTX 4090 is still the card you'll want for creative workloads, 100%. But if money is a bit tighter than it used to be now that interest rates are non-zero and money means something again to VCs, the RTX 4080 Super is a good compromise.
The gaming performance of the Nvidia RTX 4080 Super is likewise second only to the Nvidia RTX 4090, with strong native 4K ray tracing performance before you even factor in DLSS, which — thanks to Frame Generation and Nvidia Reflex — drives frame rates up even further on games that support it.
While barely a blip higher than the RTX 4080 non-Super in gaming performance (about 0.4% better overall), it is about 7% better than the RX 7900 XTX, which retails for the same price currently.
With an average (geomean) frame rate of about 80 fps at 4K compared to the RTX 4090's 87 fps average, the RTX 4080 Super all but eliminates any real need to splurge on the RTX 4090 if you're just looking to game.
The RTX 4090 might be the best graphics card for creatives or machine learning scientists out there, but for gamers, you are much better off with the Nvidia GeForce RTX 4080 Super, since that extra $600 can buy you a whole lot of other components like the best SSD or best processor for gaming, something even the PC gaming enthusiasts out there can appreciate.
Should you buy the Nvidia GeForce RTX 4080 Super?
Buy the Nvidia GeForce RTX 4080 Super if…
Don’t buy it if…
Also consider
How I tested the Nvidia GeForce RTX 4080 Super
I spent about a week and a half with the Nvidia GeForce RTX 4080 Super all together, with much of that time spent benchmarking the card on our test bench.
I did use the card as my primary workstation GPU for content creation, and in addition to my standard battery of benchmark tests, I also used the card for high-end 4K gaming on titles like Cyberpunk 2077: Phantom Liberty.
I've been reviewing computer hardware for years now, and having extensively tested and retested every major graphics card release of the last two generations, with nearly two dozen graphics card reviews completed in the last 18 months, I know how a card like this is supposed to perform given its price and target audience, so I'm well-equipped to give you the buying advice you need to help you make the right decision with your money.
Well, not everything can be fixed by giving it an extra 8GB VRAM.
The AMD Radeon RX 7600 XT is Team Red's latest play at the lower midrange graphics card market, and for what it is, it's a decent enough graphics card, offering great 1080p performance with some very solid 1440p performance to boot.
It offers substantially better ray tracing performance than the AMD Radeon RX 7600, making it one of the best 1080p graphics cards on the market right now, but the card unfortunately doesn't live in a vacuum.
With a US MSRP of $329.99 (about £265/AU$420) and with the Nvidia GeForce RTX 4060 offering better performance overall for about $30 less, the RX 7600 XT becomes a lot harder to recommend, which is frustrating since this market segment is much more price sensitive than the midrange and premium tiers are.
There's no reference card model from AMD, so as with the AMD Radeon RX 7700 XT, you're limited to whatever AIB partners like XFX and ASRock produce. This means that the designs of the cards are going to vary considerably, as will their prices and clock speeds.
Essentially, what you're getting with the RX 7600 XT is an overclocked RX 7600 with an extra 8GB of VRAM. Its specs are otherwise pretty much identical, which goes a long way toward explaining the level of performance you're getting with this card.
In terms of synthetic performance, the RX 7600 XT comes in about 2% better than the RX 7600, and about 2% worse than the RTX 4060, with the expected difference between stronger rasterization performance for the RX 7600 XT and better ray tracing for the RTX 4060 holding true here.
For gaming, the RX 7600 XT is pretty great for a 1080p card, as was the RX 7600 before it, and the extra VRAM does help it get some extra 1440p performance, especially when it comes to minimum frame rates which help determine a game's stability. Its ray tracing performance is also generally improved over the RX 7600 as well.
Compared to the RTX 4060, however, it comes in about 5% behind overall, which is where we were really hoping to see the RX 7600 XT differentiate itself thanks to the extra VRAM.
In 1440p gaming performance, the extra VRAM does give the RX 7600 XT about 10% better performance on average compared to the RX 7600, and it ekes out a win against the RTX 4060 in non-ray-traced gaming performance at 1440p, but it gets about 25% lower ray-traced 1440p performance than the RTX 4060.
Given the price difference here, the RTX 4060 is simply the better value for both 1080p and 1440p gaming, despite having 8GB less VRAM.
In the end, the memory bandwidth, constrained by a 128-bit memory bus, is simply too limited for the larger frame buffer to make a substantial enough impact. As a result, the AMD Radeon RX 7600 XT becomes harder to recommend than it should be.
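The bandwidth ceiling described above follows directly from the bus width and the memory's data rate. A quick sketch, assuming AMD's published 18 Gbps GDDR6 spec for the RX 7600 XT (a figure from AMD's spec sheet, not stated in this review):

```python
# Peak memory bandwidth from bus width and effective per-pin data rate.
# 18 Gbps GDDR6 is AMD's published spec for the RX 7600 XT (assumed here).
bus_width_bits = 128  # memory bus width
speed_gbps     = 18   # effective data rate per pin, Gbps

bandwidth_gb_s = bus_width_bits / 8 * speed_gbps
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 288 GB/s
```

However large the frame buffer, every byte in it still has to move through that 288 GB/s pipe, which is why doubling the VRAM alone can't transform the card's performance.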
AMD Radeon RX 7600 XT: Price & availability
How much is it? US MSRP listed at $329.99 (about £265/AU$420)
When is it out? It is available January 24, 2024
Where can you get it? You can buy it in the US, UK, and Australia
You can get the AMD Radeon RX 7600 XT starting on January 24, 2024 with an AMD MSRP of $329.99 in the US, which converts to about £265 in the UK and AU$420 in Australia.
Since there's no reference model for this card, you'll also likely find this GPU selling for higher once all the added bells and whistles like RGB lighting and such are factored in.
This is easily the biggest disadvantage of this card, considering that the Nvidia RTX 4060 is $30/£24/AU$36 cheaper, and the RX 7600 is even cheaper than that at $269.99 (about £215/AU$380). Still, for some who want to make sure that their gaming PC is set up for the future with the extra VRAM, the extra investment might be worth it.
Should you buy the AMD Radeon RX 7600 XT?
Buy the AMD Radeon RX 7600 XT if…
Don’t buy it if…
Also consider
How I tested the AMD Radeon RX 7600 XT
I spent about a week with the AMD Radeon RX 7600 XT, putting it through a full battery of benchmarks and general performance testing, including playing some of the best PC games on the market on the highest settings available at 1080p.
I've been reviewing PC hardware for several years now, and I have extensively tested and retested every major graphics card release of the past two generations of graphics cards, so I am intimately familiar with their varying degrees of performance. As such, I know how well a card needs to perform for a given price, relative to the market, and given my lifelong passion for PC gaming, I also know what the intended audience for a graphics card is going to demand, and whether the cards I review meet these exacting standards.
Nvidia GeForce RTX 4070 Ti Super: Two-minute review
The Nvidia GeForce RTX 4070 Ti Super is a difficult card to rate, despite it being without a doubt one of the best graphics card releases of this generation.
For a US MSRP of $799.99, you're essentially getting a pared-down Nvidia GeForce RTX 4080, including 16GB GDDR6X VRAM and a wider 256-bit memory bus so you can actually play the best PC games at 4K (with tweaks), all with noticeably less power draw. On the flipside, you're also getting a card that is only marginally better than the Nvidia GeForce RTX 4070 Ti, despite its seriously upgraded specs.
As far as the design goes, unfortunately, there is no Founders Edition for the Nvidia GeForce RTX 4070 Ti Super, so you don't get the sleek-as-hell all-black metallic finish of the Nvidia GeForce RTX 4070 Super. Instead, you'll have about a half-dozen or so board partners like Asus, MSI, and others making these GPUs, and which card you get will determine a lot, from overclock settings to cooling and RGB lighting effects.
The Asus TUF Gaming model I tested is identical to the Asus TUF Gaming RTX 4070 Ti I previously reviewed, and it's a massive, chonky card for what it's worth. Given the power requirements and the need to dissipate a lot of heat, you can fully expect that whatever RTX 4070 Ti Super card you pick up, it's going to be a triple-slot monster.
In terms of performance, there's no getting around the fact that this is essentially the Nvidia RTX 4080 we should have gotten a year ago (it's built off the RTX 4080's AD103 GPU, rather than the RTX 4070 Ti's AD104), and for that, it is one of the best 4K graphics card models to hit the market this year. Its gaming performance is superb pretty much across the board, and the only area where it struggles to stay playable is the same one where every graphics card other than the Nvidia GeForce RTX 4090 struggles: native ray tracing at 4K.
Even there, however, this card manages to average about 32fps (though its average minimum/1% fps dips below the 24fps that registers as fluid motion, so yeah, it can sometimes be a bit of a slideshow).
Unfortunately, the AMD Radeon RX 7900 XT is also right there further complicating the picture for this card. Outside of creative workloads that rely on CUDA (like Blender or V-Ray), the RX 7900 XT goes toe-to-toe with the RTX 4070 Ti Super in terms of gaming performance, though the RTX 4070 Ti Super will generally handle ray tracing better.
Surprisingly though, AMD's FSR seems to be at the point where it is battling Nvidia DLSS to a draw by and large, with the only real difference being whether you have ray tracing turned up to its highest setting or not. Given the recent price cut for the RX 7900 XT down to $749.99 in the US, anyone looking at the RTX 4070 Ti Super will have to ask themselves some hard questions.
This is especially true given the big looming release set to drop at the end of January: the Nvidia GeForce RTX 4080 Super. Though the RTX 4080 Super is 25% more expensive at MSRP, given the high cost of entry for the RTX 4070 Ti Super, it suffers from a problem similar to the one the original RTX 4080 had: it's too close in price to a clearly better card, so ultimately, you'll almost certainly be better off buying the RTX 4080 Super in a week's time. Which is a shame, because the Nvidia GeForce RTX 4070 Ti Super is a fantastic card that really should have hit the scene a whole lot sooner than it did.
Nvidia GeForce RTX 4070 Ti Super: Price & availability
How much is it? MSRP listed at $799.99 (about £640, AU$1,120)
When is it out? It was released on January 24, 2024
Where can you get it? You can buy it in the US, UK, and Australia
The Nvidia GeForce RTX 4070 Ti Super goes on sale January 24, 2024 for a listed MSRP of $799.99 in the US, which is about £640 in the UK and AU$1,120 in Australia.
This is the same MSRP as the Nvidia RTX 4070 Ti it replaces, which is definitely a positive given the generally terrible pricing of Nvidia's best graphics cards this generation.
As stated above, though, AMD isn't resting on its laurels, and it's dropped the price of its competing RX 7900 XT graphics card to counter this release. And, given their relative levels of performance, it's a smart move, as it makes the RX 7900 XT a better value as a result, at least enough to be competitive in the absence of any RX 7050 XT-series releases thus far.
Nvidia GeForce RTX 4070 Ti Super: Specs & features
16GB VRAM
Wider memory bus
Slightly faster clock speed
Nvidia GeForce RTX 4070 Ti Super: Performance
Marginally better than RTX 4070 Ti
Loses to RX 7900 XT in gaming performance
Extra VRAM makes 4K gaming much smoother
Where it counts, the RTX 4070 Ti Super is a fantastic graphics card for work and play, though it's definitely more of a refresh of the RTX 4070 Ti, which is unfortunate since it really should have shown better performance given its specs.
In terms of synthetic performance, the RTX 4070 Ti Super averages about 4.5% better performance than the RTX 4070 Ti and about 13% slower performance than the RTX 4080, more or less matching the RX 7900 XT. On that last point, much like other AMD vs Nvidia comparisons, AMD comes out on top in pure rasterization, with Nvidia winning out in ray tracing workloads.
For creative performance, the RTX 4070 Ti Super greatly outperforms the RX 7900 XT, but falls well short of where the RTX 4080 lands. If you're looking for the best creative graphics card, then, the RTX 4080 Super is definitely going to be one to look out for considering it's only going to retail for $200 more and should be much more powerful.
In gaming performance, no card really comes close to the RTX 4080, but the RTX 4070 Ti Super more or less ties the RTX 4070 Ti and the RX 7900 XT.
In 1440p gaming performance, the additional 4GB VRAM in the RX 7900 XT starts to become a factor to the RTX 4070 Ti's detriment, but overall, the RTX 4070 Ti Super comes up about 3.5% behind the RX 7900 XT, and roughly tied with the RTX 4070 Ti.
At 4K, the additional 4GB of VRAM in the RTX 4070 Ti Super versus the RTX 4070 Ti starts to really have an impact, giving the RTX 4070 Ti Super about 6.5% better performance on average and a nearly 10% higher fps floor than the RTX 4070 Ti.
The RTX 4070 Ti Super also runs about even with the RX 7900 XT here.
Wrapping things up, there are a couple of other points I definitely want to hit on, the biggest being that what holds this card back in a lot of ways is its TGP. With the same TGP as the RTX 4070 Ti, you get the sense that this card leaves some performance on the table given its specs. On the plus side, though, it does manage to squeeze extra performance from the same amount of power, which is a good thing. If you want to try to overclock this card to tap into its full potential, have at it.
It's not going to run particularly hot (unless you overclock the hell out of it) and its gaming performance is exceptional, even into 4K, where you can expect to average about 70 fps without ray tracing, or you can turn ray tracing on and flip DLSS to balanced or performance for the same frame rates or better.
Its ultimate value proposition isn't as good as the RX 7900 XT's, but it's better than the RTX 4070 Ti's, and since AIBs are likely to switch over to the Super rather than keep putting out non-Super RTX 4070 Tis, this card is effectively giving you something extra for no additional cost.
The ultimate value tell though will be how the Nvidia RTX 4080 Super performs, and it's unfortunately the case that the success of this card depends very much on how well Nvidia's last major graphics card of this generation is received.
Should you buy the Nvidia GeForce RTX 4070 Ti Super?
Buy the Nvidia GeForce RTX 4070 Ti Super if…
Don’t buy it if…
Also consider
How I tested the Nvidia GeForce RTX 4070 Ti Super
Test system specs
This is the system we used to test the Nvidia RTX 4070 Ti Super:
I spent about a week working with the Nvidia RTX 4070 Ti Super, running our standard battery of tests on it and similar cards in its price category.
I ran it through a number of real world use cases where it will be used, primarily gaming and content creation.
I've been reviewing computer hardware, including graphics cards, for years now, and I am intimately familiar with the kind of performance you should expect from a graphics card at this price point. I bring that knowledge to bear on my graphics card reviews and make sure that every graphics card I compare to the card under review is retested using the most up-to-date drivers to get the best relevant data for comparison, even if (as in this case) it means I only test the most relevant competing cards to provide the reader with the most important comparative data when they are considering making their next graphics card purchase.
First reviewed in January 2024
I spent about a week working with the Nvidia RTX 4070 Super, including using it as my main work PC graphics card for content creation work. Due to time constraints, I ran our standard battery of tests on it and its two main competitor cards (you can see my RTX 4070 review for its relative performance versus many more cards, and then factor in roughly 12%-15% better performance for the RTX 4070 Super).
I've been reviewing computer hardware, including graphics cards, for years now, and I am intimately familiar with the kind of performance you should expect from a graphics card at this price point. I bring that knowledge to bear on my graphics card reviews and make sure that every graphics card I compare to the card under review is retested using the most up-to-date drivers to get the best relevant data for comparison, even if (as in this case) it means I only test the most relevant competing cards to provide the reader with the most important comparative data when they are considering making their next graphics card purchase.
XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold: Two-minute review
The XPG Core Reactor II 1200W ATX 3.0 80 Plus Gold PSU is the company's latest compelling mid-range component for builders, following on from the success of its more premium, Platinum-certified Cybercore II series of power supplies.
The Core Reactor II series, which spans 650W units up to this flagship 1200W PSU, showcases XPG's ability to strike a crucial balance between performance, quality, and cost. It's designed for users who want reliable performance without splurging, but who still need higher-tier wattage to power the best gaming PCs you can build.
Those high-end motherboards, processors, and graphics cards don't come cheap in terms of power draw, so the Core Reactor II 1200W, an 80 Plus Gold certified unit, stands out for its practical design and consistent performance. It represents XPG's commitment to affordable quality and aims to meet the diverse needs of mid-range computing environments.
In terms of packaging, the Core Reactor II 1200W PSU comes in a robust, visually appealing box, complete with essential accessories like mounting screws, an AC power cable, and decorative stickers. The PSU itself is a blend of aesthetics and functionality, featuring a sleek matte black finish with embossed geometric patterns and a geometric fan cutout. Its 160mm length slightly exceeds conventional ATX size, but it is short enough to ensure compatibility with the best PC cases with ATX compliance.
The front of the PSU is minimalist, housing only the on/off switch and AC receptacle, while the rear is thoughtfully designed for easy and accurate cable connections. The unit's modular cable system includes an array of uniformly black cables, with most being neatly sleeved.
Performance-wise, the Core Reactor II 1200W PSU aligns with its 80 Plus Gold certification, demonstrating commendable efficiency and thermal management. The fan operates optimally, maintaining reasonable internal temperatures, even at significant power output and under various testing conditions.
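For context, the 80 Plus Gold standard (for 115V internal non-redundant units) requires at least 87% efficiency at 20% and 100% load and 90% at 50% load. A quick sketch of what those minimums mean at the wall for a 1200W unit:

```python
# Estimate wall draw and waste heat for a 1200W PSU meeting the
# 80 Plus Gold minimums at 115V: 87% @ 20% load, 90% @ 50%, 87% @ 100%.

GOLD_MIN_EFFICIENCY = {0.20: 0.87, 0.50: 0.90, 1.00: 0.87}
CAPACITY_W = 1200

for load_frac, eff in GOLD_MIN_EFFICIENCY.items():
    dc_out = CAPACITY_W * load_frac   # power delivered to components
    wall_draw = dc_out / eff          # AC power pulled from the outlet
    waste_heat = wall_draw - dc_out   # dissipated by the PSU as heat
    print(f"{load_frac:.0%} load: {dc_out:.0f}W out, "
          f"~{wall_draw:.0f}W at the wall, ~{waste_heat:.0f}W of heat")
```

Certified units, including this one, have to meet or beat those floors, so treat these as worst-case figures rather than measurements of the Core Reactor II itself.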
After running the XPG Core Reactor II in my main workstation at the office under some pretty heavy loads, the fan stayed mostly quiet and the temperature stayed well below its rated operating temperature without issue. This PSU is rated for operation at an ambient temperature of 50°C, a testament to its robustness and reliability, especially in demanding environments.
Electrical performance is a highlight, with the primary 12V rail showing impressive regulation and effective voltage filtering. The PSU also passes tests for primary protections like Over Current, Over Voltage, Over Power, and Short Circuit, ensuring reliable performance.
In terms of internals, the Core Reactor II combines a robust build quality with a unique design, incorporating high-grade 105°C Japanese capacitors for enhanced reliability and durability. The PSU excels in power quality, achieving good energy conversion efficiency and maintaining steady efficiency across most load ranges. Its thermal management is effective, with the fan adjusting speed according to the load, ensuring efficient cooling while keeping noise levels minimal.
That said, if you're pushing this unit hard, such as with overclocking or loading up on the add-in cards, it can get a bit loud when load nears 100%, though never so much to be bothersome.
Ultimately, this is a PSU that anyone building in this wattage range should add to their component shortlists.
XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold: Price & availability
How much does it cost? $204.99 (about £165 / AU$290)
When is it available? Available now
Where can you get it? Available in the US. UK and Australia availability is spotty
The XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold is currently priced at $204.99 in the US and backed by a 10-year warranty, making the Core Reactor II 1200W PSU a good value for its performance. As far as midrange PSUs go, it's an appealing option for those seeking a balance between cost, efficiency, and reliability.
While it doesn't have a Platinum rating, its performance is more than enough for most builders who need to run high-powered components like the best processors and best graphics cards for gaming or content creation, without worrying about the unit running hot under sustained heavy workloads.
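If you're weighing whether 1200W is enough headroom for a build like that, a back-of-the-envelope check is straightforward; the component wattages below are illustrative assumptions, not measurements:

```python
# Rough PSU headroom check for a hypothetical high-end build.
# All component figures are illustrative assumptions, not measurements.

build = {
    "CPU (peak)": 250,
    "GPU (peak)": 450,
    "Motherboard/RAM/SSDs": 100,
    "Fans/pump/peripherals": 50,
}
PSU_CAPACITY_W = 1200

total = sum(build.values())
headroom = PSU_CAPACITY_W - total
print(f"Estimated peak draw: {total}W")
print(f"Headroom on a {PSU_CAPACITY_W}W unit: {headroom}W "
      f"({headroom / PSU_CAPACITY_W:.0%} of capacity)")
```

Leaving a comfortable margin like this also keeps the unit in the load range where efficiency is highest and the fan stays quiet.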
XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold: Specs
Should you buy the XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold?
Buy the XPG Core Reactor II 1200W ATX 3.0 80 PLUS Gold if...
You want a high-powered PSU for a decent price
For the price you're paying, this is one of the highest-wattage 80 Plus Gold-rated ATX 3.0 power supplies going.
You want a modular PSU
As a modular PSU, cable management is much easier when you only use what you need.
Don't buy it if...
You need something more heavy-duty
While 80 Plus Gold is fantastic, if you need something more robust for a heavy-duty workstation, you might want to check out the Platinum-rated Cybercore II PSUs from XPG.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained. Regardless of when a device was released, if you can still buy it, it's on our radar.