Apple is taking its time with iOS 18 development, and a new report from Bloomberg's Mark Gurman suggests that next year's iPhone 16 generation will come with exclusive generative AI features, which will be key selling points for the new devices.
The report doesn't reveal any specifics on what these AI features will do, but we can assume they will be similar to what Google and Samsung are doing: text-to-image generation, document summarization, and content translation, among others. Apple is also rumored to be using its own large language model (LLM) to overhaul Siri.
The iPhone 16...
The Psync Camera Genie S is a new indoor home security camera with an AI-infused twist, and it could remedy a host of pet peeves you have with your current home security devices.
While I don't consider myself paranoid, I do have, at any given time, at least three of the best home security cameras surveilling my home. I've gotten used to all the alerts that usually let me know one of the cams detected some movement. The details are scant, and because I am too cheap to pay for cloud-based storage and services, I can almost never see said "movement".
The extent of detail most webcam image analysis offers is usually "person detected," "movement detected," or "sound detected."
Psync, though, has a different idea with something it calls "ViewSay", featured on its new Psync Camera Genie S indoor smart home camera.
ViewSay uses GPT algorithms (actually a visual language model or VLM) to analyze the movement or any activity within the security camera's field of view to offer detailed descriptions of what's happening in your home. It's a very new implementation of this technology and the messages I've received over the last week have been almost uniformly hilarious. Psync's AI rarely gets the description right but I love how hard it tries.
Some users may not be thrilled that the AI-based image analysis (which will cost $0.99 a month) is done in the cloud using one frame of the video, but the good news is that the data is encrypted when it's stored on Amazon Web Services (AWS) for analysis.
Leaving aside this kitschy highlight, this is a rather adept home security cam. It's compact, easy to set up, and is the rare webcam with a motorized remote control that lets you change its point of view (left and right or up and down) via the Psync app and from anywhere in the world. It can even, thanks to that physical versatility, track moving objects.
There's support for smart home majors Amazon Alexa and Google, but not for Apple HomeKit or, more distressingly, Matter - though full Matter support for cameras doesn't roll out until next year, so there's hope yet.
Even so, $34.99 for a webcam with 32GB of onboard storage, LED lights for night imagery, remote control, and auto-tracking capabilities adds up to a nice starter home security webcam.
Psync Camera Genie S review: price and availability
List price: $34.99 (32GB) / $39.99 (64GB)
$0.99 a month for AI features
Available in the US only
At just $34.99, the Psync Camera Genie S is among the more affordable home security cameras, falling in line with the WYZE Cam Pan v3 but without the outdoor capabilities. Of course, the WYZE Cam doesn't include any form of GPT intelligence.
Psync launched the product on November 1, 2023, and it's available online at the Psync website.
The box ships with the security cam, a USB-3-to-USB-C cable, and a power adapter (there's no battery option). There's also 32GB of internal storage.
Psync Camera Genie S: specifications
Psync Camera Genie S review: design
Rubberized bottom with tripod mount
I wouldn't normally say this, but Psync Camera Genie S is cute. Yes, it's a bit lightweight and boxy, but its ability to turn and look at whatever it's tracking adds a bit of life and, yes, might lead you to anthropomorphize it.
Where most modern smart home security cams are all curves and smooth lines, this is like a set of white blocks. There's a white base that houses the speaker and activity light (blue for live), and the L-shaped head that unfolds from the body and can point its small face, featuring an HD+ camera (it captures up to 1944 x 2592) and four LED lights, wherever you choose, or it can auto-track moving objects.
There are microphones and a speaker for conducting two-way communication but the speaker is also used to deliver messages from the Psync device. It can tell you, for example, that you have an incoming call.
Open, the security cam is 2.09 x 2.09 x 4.41 inches. Folded, it's 2.09 x 2.09 x 2.41 inches.
Despite its diminutive size, this lightweight security cam isn't prone to sliding or falling over; there's a nice grippy rubberized base to keep the Psync Camera Genie S in place. If you want to attach it to a tripod, the base also includes a tripod screw mount.
I also appreciated that the included power cable is long enough to stretch from an outlet near the floor to a window more than halfway up the wall.
One of my favorite Psync Camera Genie S tricks, however, is what it does when you power it down (through the app). It says "shutting down" and then folds into a near-perfect cube.
Psync Camera Genie S review: performance
Can detect and track motion
AI does its best to ID scenes
2K video is sharp
The best smart home devices are not just easy to use, they make setup a breeze. Psync's Camera Genie S fulfills that mission with a no-brainer setup through the app, which (after the firmware update that virtually every piece of new smart home hardware needs) had me keeping an eye on my home in no time.
The app is fairly well organized and should be obvious to even the most novice users. Most of the screen is filled with the live video feed. The first thing I noticed is the relatively narrow field of view, especially compared to the indoor camera competition; it's just 84.9 degrees. However, the remedy to this narrow viewport can be found in the app, which offers a thumb-pad-style control for the camera's remote control features. The camera can tilt vertically by 135 degrees. More crucially, the entire camera can rotate a full 350 degrees on its round base.
Initially, I used these controls to look around a room from wherever I was. I've often been frustrated that I left an indoor home security camera pointing in one direction when a sound was coming from another. With the Psync camera, I can pan or tilt the camera to look directly at the action. There are no voice controls, despite the camera's Alexa and Google compatibility, but that's not too uncommon in cameras.
The other option, though it's hidden under a sub-menu, is the ability to let the camera track action on its own. When I turned this on and walked in front of the camera, it would look me up and down and turn to watch me.
The default 2K resolution camera provides video and images you can zoom in on to see additional detail. Of course, the higher-resolution imagery might eat up your local storage a bit more quickly.
Next to the 2K camera is a series of four LEDs that Psync calls the "Spotlight". Instead of night vision, you can turn on these LEDs to illuminate a dark room. You can also control the brightness level and even set a schedule for the Spotlight. I'd prefer actual infrared night vision and the ability for the light to turn on when the camera detects motion, but you shouldn't expect that technology in a $34 indoor smart camera.
The app lets you switch to event view and then swipe vertically through all previously captured events. You can also save any of the videos to your phone camera roll and then share them more broadly.
Of course, the marquee feature is the new AI-based image analysis that translates into a description of what Psync Camera Genie S is spotting through its camera. Psync will charge $0.99 a month for this privilege and, to be honest, I'm not sure yet if it's worth it.
I got a ton of alerts from the camera, and the majority of descriptions were comical. It could usually identify a human but often said they were carrying something, such as a basket, when they weren't. It said an entire family was sitting around my dining room table when it was empty, and it misidentified scenes as well as objects. My favorite might have been when it was looking at my backyard and shed and said, "A person is walking past a garage with a motorcycle parked inside, and a child is playing in the yard." The backyard was empty, my shed was closed, I don't own a motorcycle, and no one was in my backyard.
Generally, Psync's powers of observation were average at best. It did eventually properly identify my wife standing in our living room looking at her phone, but that was a rare hit.
To be fair, these are early days, and I suspect that this AI will get smarter and more effective in identifying people, places, and things.
Should I buy the Psync Camera Genie S indoor camera?
Buy it if...
Don't buy it if...
Psync Camera Genie S review: Also consider
Decided against the Psync Camera Genie S? Why not check out these alternatives...
How I tested the Psync Camera Genie S
I tested the Psync Camera Genie S for two weeks in my home
I set it up in various rooms in my house and allowed the app to send me alerts based on what the camera detected.
I used the Psync Camera Genie S for two weeks in my home, monitoring my kitchen, living room, den, and backyard (through a window).
I had the benefit of using a system whitelisted for access to the $0.99-a-month AI features. This meant all of the object and motion detection descriptions I received were quite descriptive, if not always accurate.
I compared the camera to the other indoor and outdoor cameras I have throughout my home.
There's a startup in San Francisco called Humane (or rather, hu.ma.ne if you go by the official logo and domain name), and today it's unveiled the first product it's been working on. It's called Ai Pin and seems to answer the following question that no one asked: What if almost a smartphone, but with no screen and attached to your shirt? Oh, and with AI!
And that's what it is. A honking, dorky-looking pin you put on your shirt or hoodie or blazer or whatever, and wear all day. It's not meant to blend in at all, with a "look at me!" design that may polarize people who don't live in Silicon...
Samsung announced its own generative AI model, called Gauss. The on-device AI model consists of Samsung Gauss Language, Samsung Gauss Code, and Samsung Gauss Image - each part tailored for its specific task.
Samsung Gauss Language is a generative language model that can compose emails, summarize documents, and translate content. It can also enable smarter device control (think Google Home or Amazon Alexa).
Samsung Gauss Code is built around the coding assistant code.i, which lets developers write code quickly and easily, and also handles code description and test case generation.
Samsung Gauss Image is a...
Removing backgrounds from images used to mean manually masking subjects. Adding different backdrops and drop shadows required a deft touch, too. Not with PhotoRoom: harnessing the power of AI, it automates the background removal process and makes it a cinch to place people and products in new settings, complete with accurate shadows.
Available as a mobile app for iOS and Android, as well as a web-based tool, it has the potential to revolutionize promotional imagery for online businesses. With just a few taps, PhotoRoom makes it possible to cut out subjects, place them against virtual backgrounds, and then export them at the perfect size for different social platforms.
A Pro subscription removes the PhotoRoom logo from high-resolution exports, unlocks powerful batch editing support, and grants access to a catalog of templates covering everything from birthday cards to seasonal sales. You can make mock magazine covers, create studio-style imagery for your online store, or instantly upgrade your marketing graphics – all without expert knowledge of design software. It can even transform self-portraits into headshots that pass for professional.
It’s easy to use, yet the results are broadly believable. While it's not infallible, PhotoRoom is impressively effective at matching backgrounds and shadows to the lighting of your original subject. And though not every virtual backdrop is photorealistic, the majority give the impression of genuine placement within the scene – or at least that the image has been professionally produced. If PhotoRoom’s presets don’t fit the bill, you can generate custom backgrounds using simple text prompts, giving you theoretically infinite possibilities.
With layered editing, including the option to place and edit text and graphics, PhotoRoom firmly plants its flag as an alternative to Adobe Express – a rival web-based design tool with AI features from the maker of Photoshop. Editing professionals may find some limitations, but for most users, PhotoRoom is slick, quick, and surprisingly powerful.
If you only want to separate subjects from their backgrounds, there are cheaper – and even free – alternatives. Equally, if you want a fully-fledged image editing solution, you should look at our round-up of the best photo editors.
But if you want an accessible toolkit that lets you rapidly generate pro-grade promotional imagery, and seamlessly place your subjects within a vast library of settings – all from your smartphone or web browser – PhotoRoom is well worth the cost of a monthly subscription.
PhotoRoom: Pricing & availability
Weekly prices start at $4.99 / £3.99 / AU$17.99
Many features are available for free, including background removal
Pro subscription removes branding and unlocks full feature set
PhotoRoom is available as a smartphone app for iOS and Android and online via the PhotoRoom website. Many of its features can be accessed without a subscription, including the background and object removal tools. With the free version, you are limited to 250 image exports, all of which will be watermarked with the PhotoRoom logo.
If you plan to use PhotoRoom for your business, you’ll probably want to pay for the additional features included with a Pro subscription. Paying for full access removes the PhotoRoom logo and export cap. It also unlocks batch editing, design resizing, and high-resolution exports, as well as the full collection of instant backgrounds.
Pricing for the Pro plan varies based on the frequency of the renewal period you choose. In the US, the rolling weekly plan costs $4.99, while the annual plan costs $89.99 per year. In the UK, you can subscribe for £3.99 a week or £69.99 a year. In Australia, the weekly cost is AU$17.99, while the yearly price comes in at AU$139.99.
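To put those renewal periods in perspective, here's a quick back-of-the-envelope comparison of the US plans (illustrative arithmetic based only on the prices quoted above):

```python
# Back-of-the-envelope comparison of PhotoRoom's US Pro pricing,
# using the figures quoted above (illustrative arithmetic only).

weekly_price = 4.99    # rolling weekly plan
annual_price = 89.99   # annual plan

# Cost of keeping the weekly plan for a full year vs. paying annually.
weekly_plan_per_year = round(weekly_price * 52, 2)
annual_savings = round(weekly_plan_per_year - annual_price, 2)

print(weekly_plan_per_year)  # 259.48
print(annual_savings)        # 169.49
```

In other words, anyone planning to use PhotoRoom for more than a few months is far better served by the annual plan.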
PhotoRoom offers a free trial of the Pro features through its smartphone apps. Be aware that this will auto-renew if not cancelled before the end of the trial.
PhotoRoom can be accessed using the app or via the web
Simple, icon-based interface is easy to use and understand
One-click templates and AI backgrounds can be easily edited
Whether you use PhotoRoom on the web or through the smartphone app, the core user experience is very similar. The icon-based interface is designed to be simple and self-explanatory, even if you’ve never used a photo editor or any kind of graphic design software.
You start by selecting a tool, template, or photo: all options lead to the same destination, only the route is different. With a Pro subscription, you also have the option to edit in batches, applying the same template or background swap to several images at the same time.
Pick an image, and PhotoRoom will automatically run the background removal tool, separating the subject from its surroundings. From there, you can edit the outline of the cutout, either manually – by painting with a brush to adjust the masking – or with AI assistance, which automatically detects the objects you click.
Once you’re happy with the cutout, your first option is to explore PhotoRoom’s preset templates. Some are purely functional, making the background transparent or resizing the canvas for different social platforms. Others keep it simple and professional, with a drop shadow and a plain backdrop. The most dynamic add everything from text headlines and graphic elements to photorealistic settings and artistic blurring.
A tap or a click is all it takes to apply any of the templates. What’s more, every component in the template can be repositioned, rotated and resized to suit. Each occupies a separate layer, which you can drag to rearrange. You can also tweak each layer using the adjustment toggles and sliders. While PhotoRoom is far from a Photoshop alternative, you can make a range of color tweaks, change the perspective and add creative effects, including several blur and texture options. All of these tools are intuitive to use, and the learning curve is minimal.
The instant backgrounds tab is where AI really steps into the spotlight. You can browse by background category, spanning all manner of materials, scenes and settings, or pick from suggested backdrops based on the subject detected by PhotoRoom. Pick one, and you’ll get four versions of that theme. If none fit the bill, you can tap to generate four more.
Where backgrounds get really interesting is with custom prompts. Under the assisted tab, you can specify the surface and any background details you’d like. Or you can go full manual and write the entire prompt yourself, together with a negative prompt to exclude specific elements. You can also upload an image as a prompt. All three are straightforward to use and genuinely accessible.
AI is also deployed for the magic retouch tool, which allows you to paint over objects to remove them from an image, and for instant shadows, which can read the object placement within a given composition and introduce soft, hard, or floating shadows accordingly.
AI effectively recognizes and cuts out subjects
Manually brushing to edit the mask can feel clumsy
Any background removal tool is only as effective as its subject recognition. Luckily, PhotoRoom is remarkably good at detecting and selecting. From clothes to creatures, it effortlessly traced around countless subjects in our tests. What’s more, because the AI engine tries to understand what it’s looking at, it’s also effective at cutting out elements within an object – such as the spokes of a bicycle.
That’s not to say it’s perfect. It can struggle when presented with busy scenes that feature multiple potential subjects. Outlining can also falter when there is poor contrast, dim lighting or a low resolution, all of which affect the definition of a subject’s edges. But upload a well-lit image in clear focus and PhotoRoom will almost always extract it cleanly from its surroundings.
This is good, because editing a cutout isn’t the most precise affair. The guided mode is useful if you want to simply remove specific objects from the selection. But when it comes to manually refining the mask, you’ll need fine fingerwork to paint around objects. And while you can change the brush size, there’s no option to adjust its softness. Strangely, there's no undo option: if you accidentally paint to erase part of a selection, you’ll need to repaint it with the Restore brush – or cancel and start again.
So while the option to edit cutouts is necessary, the slight clumsiness means you’re often better trusting in the automated selection and accepting any minor deviations, which usually won’t be visible when viewed at standard screen sizes.
Instant backgrounds can deliver remarkable realism
Post templates require more creative input
Retouch tool is a useful addition for removing objects
A creative eye is still required when it comes to working with templates. Unlike backgrounds, PhotoRoom won’t suggest templates to suit your image, so you’ll need to pick one that works for your subject and purpose.
Realism is rarely the aim here, with many of the presets featuring bold graphics, text, and obviously simulated settings. It’s important to choose one that matches the theme of your subject, otherwise, your cutout can look out of place. Perspective is also important to bear in mind, particularly with product photography: the templates are most useful for subjects shot straight-on.
The best of PhotoRoom’s templates are those that keep things subtle, specifically those that place your subject against a relatively plain background with a soft drop shadow. Even then, you’ll often want to adjust the placement of your subject within the template and consider adding a shadow using the adjustment menu. With the right touch, results can be fun and dynamic.
The believability factor is far higher with PhotoRoom’s instant backgrounds. That’s not to say that every placement looks like a real photo. You will still encounter AI artifacts, especially with backgrounds generated from prompts. You’ll also need to ensure that the scale appears consistent: many have defocused elements that can throw off the perspective. One background made a bicycle look like a miniature model on a shelf, for example.
On the whole, PhotoRoom is remarkably good at placing subjects in scenes with a high degree of realism, particularly when that subject is an object. Whether you pick from PhotoRoom’s suggested backgrounds or select a theme for yourself, it will incorporate the item contextually – often with breathtaking effectiveness.
When tasked with placing a camera on a table surrounded by succulents, PhotoRoom generated shadows that looked true to life, weighting the camera in the scene. Swapping the table for a concrete step, it added a subtle reflection in the polished stone below. Equally, when situating a lamp on a bedside table, it cast a glow on the neighboring wall. It’s these details that make PhotoRoom stand out as a tool for product photography.
Not every result appears authentic, but for every four variations generated, at least half could pass for a genuine image from a photoshoot – certainly to the untrained eye on social media. Even those that don’t quite pass for genuine frequently look as if they’ve been manipulated by a professional graphic designer. And it’s worth remembering that these aren’t fixed scenes into which different subjects are substituted: each is generated specifically for the given subject, based on your chosen theme or prompt.
What’s so impressive is that new versions can be generated with just a click, and every result can be reworked simply by changing a few words in the text prompt. As with many generative tools, prompts won’t always yield the exact visual you had in mind. But the option does put an arsenal of creative possibilities at your fingertips, with no training required. With batch editing available to Pro subscribers, you can achieve a consistent look for up to 50 image subjects at once. This works best with similarly framed shots.
In short, PhotoRoom is a powerful tool for producing virtual photoshoots. If you’re an online seller, it has the potential to eliminate the need for expensive and time-consuming seasonal shoots. From one clear, well-lit set of images, you can produce realistic visuals of your subjects in all manner of scenarios, from minimalist retail stores to kitchens at Christmas. When you compare the relative costs and the quality of its output, PhotoRoom makes a convincing case for itself.
Should I buy PhotoRoom?
Buy it if...
Don't buy it if...
PhotoRoom: Also consider
The closest competitor to PhotoRoom is Adobe Creative Cloud Express. Like PhotoRoom, this is a web- and app-based editing platform that makes it easy to produce professional marketing graphics.
It doesn’t have PhotoRoom’s ability to produce instant AI backgrounds for a given subject, which means PhotoRoom remains the top choice if you want to create realistic virtual product shoots. But Adobe Express does include a free cutout tool, social templates, and AI-powered generative fill, which can be used to add and remove elements within existing images.
If you’re looking for an intuitive online solution for producing polished marketing content in a range of sizes and formats, Adobe Express is arguably the more comprehensive option in terms of templates, font styles, graphic elements, and overall versatility. At $9.99 per month, it’s more affordable on a monthly basis, although the yearly fee of $99.99 makes PhotoRoom the cheaper annual choice.
It’s worth noting that if you have a Creative Cloud subscription, you’ll likely have access to Adobe Express already.
Samsung posted a video revealing new capabilities of the ISOCELL 200 MP sensor, powered by the new Snapdragon 8 Gen 3 chipset by Qualcomm. There isn't a smartphone on the market with such hardware, so this is clearly a teaser for the upcoming Galaxy S24 Ultra.
Here's the 60-second video; keep reading for a breakdown of what's inside.
Zoom Anyplace automatically tracks subjects in a video while also capturing full-frame, zoomed-out footage of the entire scene. This means the sensor essentially allows capturing two 4K video streams - one of the cropped area and one of the whole...
Google played up AI features a lot during its launch event for the Pixel 8 and Pixel 8 Pro earlier this month, but if a new rumor is to be believed, those devices' AI smarts will soon be one-upped by none other than Samsung.
According to this as-yet-unconfirmed rumor, the Korean company's upcoming Galaxy S24, Galaxy S24+, and Galaxy S24 Ultra will be "the smartest AI phones ever", clearly ahead of what even the Pixels have to offer.
The S24 series will allegedly have features very reminiscent of ChatGPT and Google's own Bard - such as the ability to create content and stories based on...
The Intel Arc A770 has had quite a journey since its release back on October 12, 2022, and fortunately, it has been a positive one for Intel despite a somewhat rocky start.
Right out of the gate, I'll say that if you are looking for one of the best cheap graphics cards for 1440p gaming, this card definitely needs to be on your list. It offers great 1440p performance in most of the modern PC titles most of us are going to be playing, and it's priced very competitively against its rivals.
Where the card falters, much like with my Intel Arc A750 review earlier this year, is with older DirectX 9 and DirectX 10 titles, and this really does hurt its overall score in the end. Which is a shame, since for games released in the last five or six years, this card is going to surprise a lot of people who might have written it off even six months ago.
Intel's discrete graphics unit has been working overtime on its driver for this card, providing regular updates that continue to improve performance across the board, though some games benefit more than others.
Naturally, a lot of emphasis is going to be put on more recently released titles. And even though Intel has also been paying attention to shoring up support for older games as well, if you're someone with an extensive back catalog of DX9 and DX10 titles from the mid-2000s that you regularly return to, then this is not the best graphics card for your needs. Nvidia and AMD drivers carry a long legacy of support for older titles that Intel will honestly never be able to match.
But if what you're looking for is the best 1440p graphics card to play the best PC games of the modern era but you're not about to plop down half a grand on a new GPU, then the Intel Arc A770 is going to be a very solid pick with a lot more to offer than many will probably realize.
Intel Arc A770: Price & availability
How much is it? US MSRP for 16GB card: $349 (about £280/AU$510); for 8GB card: $329 (about £265/AU$475)
When was it released? It went on sale on October 12, 2022
Where can you buy it? Available in the US, UK, and Australia
The Intel Arc A770 is available now in the US, UK, and Australia, with two variants: one with 16GB GDDR6 VRAM and an official US MSRP of $349 (about £280/AU$510), and one with 8GB GDDR6 VRAM and an official MSRP of $329 (about £265/AU$475).
Those are the launch MSRPs from October 2022, of course; the cards have come down considerably in price in the year since their release, and you can get either card for about 20% to 25% less than that. This is important, since the Nvidia GeForce RTX 4060 and AMD Radeon RX 7600 are very close to the 16GB Arc A770 cards in terms of current prices, and offer distinct advantages that will make potential buyers want to go with those cards rather than the Arc A770.
But those decisions are not as cut and dried as you might think, and Intel's Arc A770 holds up very well against modern midrange offerings, despite really being a last-gen card. And, currently, the 16GB variant is the only 1440p card that you're going to find at this price, even among Nvidia and AMD's last-gen offerings like the RTX 3060 Ti and AMD Radeon RX 6750 XT. So for 1440p gamers on a very tight budget, this card fills a very vital niche, and it's really the only card that does so.
Price score: 4/5
Intel Arc A770: Design
Intel's Limited Edition reference card is gorgeous
Will fit most gaming PC cases easily
Intel Arc A770 Limited Edition Design Specs
Slot size: Dual slot
Length: 11.02 inches | 280mm
Height: 4.53 inches | 115mm
Cooling: Dual fan
Power Connection: 1 x 8-pin and 1 x 6-pin
Video outputs: 3 x DisplayPort 2.0, 1 x HDMI 2.1
The Intel Arc A770 Limited Edition that I'm reviewing is Intel's reference model that is no longer being manufactured, but you can still find some stock online (though at what price is a whole other question).
Third-party partners include ASRock, Sparkle, and Gunnir. Interestingly, Acer also makes its own version of the A770 (the Acer Predator BiFrost Arc A770), the first time the company has dipped its toe into the discrete graphics card market.
All of these cards will obviously differ in terms of their shrouds, cooling solutions, and overall size, but as far as Intel's Limited Edition card goes, it's one of my favorite graphics cards ever in terms of aesthetics. If it were still easily available, I'd give this design five out of five, hands down, but most purchasers will have to opt for third-party cards which aren't nearly as good-looking, as far as I'm concerned, so I have to dock a point for that.
It's hard to convey from just the photos of the card, but the black finish on the plastic shroud of the card has a lovely textured feel to it. It's not quite velvety, but you know it's different the second you touch it, and it's something that really stands out from every other card I've reviewed.
The silver trim on the card and the more subtle RGB lighting against a matte black shroud and fans really bring a bit of class compared to the RGB-heavy graphics cards I typically see. The twin fans aren't especially loud (no more so than other dual-fan cards, at least), and the card feels thinner than most other similar cards I've reviewed and used, whether or not it actually is.
The power connector is an 8-pin and 6-pin combo, so you'll have a pair of cables dangling from the card which may or may not affect the aesthetic of your case, but at least you won't need to worry about a 12VHPWR or 12-pin adapter like you do with Nvidia's RTX 4000-series and 3000-series cards.
You're also getting three DisplayPort 2.0 outputs and an HDMI 2.1 output, which puts it in the same camp as Nvidia's recent GPUs, but it can't match AMD's recent move to DisplayPort 2.1, which enables faster 8K video output. As it stands, the Intel Arc A770 is limited to 8K at 60Hz, just like Nvidia. Will you be doing much 8K gaming on a 16GB card? Absolutely not. As more 8K monitors arrive next year, it'd be nice to have an 8K desktop running at 165Hz, but that's a very speculative prospect at this point, so it's probably not something anyone looking at the Arc A770 needs to be concerned about.
Design Score: 4 / 5
Intel Arc A770: Specs & features
Good hardware AI cores for better XeSS upscaling
Fast memory for better 1440p performance
Intel's Xe HPG architecture inside the Arc A770 arranges the various co-processors that make up a GPU in a whole new way, adding a third, not-easily-comparable set of specs to the already head-scratching differences between Nvidia's and AMD's architectures.
Intel breaks its architecture up into "render slices," each containing four Xe Cores; each Xe Core in turn contains 128 shaders, a ray tracing unit, and 16 matrix processors (which at least are directly comparable to Nvidia's vaunted tensor cores) that handle graphics upscaling and machine learning workloads. Both the 8GB and 16GB versions of the A770 contain eight render slices, for a total of 4,096 shaders, 32 ray tracing units, and 512 matrix processors.
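Those totals follow directly from Intel's hierarchy; a quick sketch, using only the per-slice and per-core figures quoted in this review, confirms the arithmetic:

```python
# Intel Xe HPG hierarchy as described in this review:
# render slices -> Xe Cores -> shaders / RT units / matrix processors.
render_slices = 8
xe_cores_per_slice = 4
shaders_per_xe_core = 128
rt_units_per_xe_core = 1
matrix_per_xe_core = 16

xe_cores = render_slices * xe_cores_per_slice        # 32 Xe Cores total
shaders = xe_cores * shaders_per_xe_core             # 4096 shaders
rt_units = xe_cores * rt_units_per_xe_core           # 32 ray tracing units
matrix_engines = xe_cores * matrix_per_xe_core       # 512 matrix processors

print(shaders, rt_units, matrix_engines)  # 4096 32 512
```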
The ACM-G10 GPU in the A770 runs at a 2,100MHz base frequency with a 2,400MHz boost frequency, and the 16GB variant's memory clock (2,184MHz) is slightly faster than the 8GB variant's 2,000MHz. This works out to an effective memory speed of 16 Gbps for the 8GB card and 17.5 Gbps for the 16GB card.
Combined with a 256-bit memory bus, this gives the Arc A770 a much wider lane for high-resolution textures, reducing bottlenecks and enabling faster performance at 1440p and higher resolutions thanks to 512 GB/s and 559.9 GB/s of memory bandwidth for the 8GB and 16GB cards, respectively.
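Those bandwidth figures fall out of the standard GDDR6 arithmetic: effective per-pin data rate times bus width, divided by eight bits per byte. A minimal sketch using the memory clocks quoted above (GDDR6 transfers 8 bits per clock per pin):

```python
# GDDR6 bandwidth: effective data rate (Gbps/pin) x bus width (bits) / 8 bits per byte.
# Memory clocks are the ones quoted in this review.
def gddr6_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    effective_gbps = mem_clock_mhz * 8 / 1000     # effective per-pin data rate in Gbps
    return effective_gbps * bus_width_bits / 8    # total bandwidth in GB/s

print(gddr6_bandwidth_gbs(2000, 256))  # 512.0 GB/s (8GB card)
print(gddr6_bandwidth_gbs(2184, 256))  # ~559.1 GB/s (16GB card; the quoted 559.9
                                       # implies a clock nearer 2,187MHz)
```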
All of this does require a good bit of power, though, and the Arc A770 has a TDP of 225W, which is higher than most 1440p cards on the market today.
As for the features all this hardware empowers, there's a lot to like here. The matrix cores are leveraged to great effect by Intel's XeSS upscaling tech, found in a growing number of games, and this hardware advantage generally lets XeSS outperform AMD's FSR 2.0, which is a strictly software-based upscaler.
XeSS doesn't have frame generation, though, and the matrix processors in the Arc A770 aren't nearly as mature as the 3rd- and 4th-generation tensor cores found in Nvidia's RTX 3000-series and RTX 4000-series cards, respectively.
The Arc A770 also has hardware-accelerated AV1 encoding support, meaning streamed video will look far better than software-encoded video at the same bitrate, making this a compelling alternative for video creators who don't have the money for one of Nvidia's 4000-series GPUs.
Specs & features: 3.5 / 5
Intel Arc A770: Performance
Great 1440p performance
Intel XeSS even allows for some 4K gaming
DirectX 9 and DirectX 10 support lacking, so older games will run poorly
Resizable BAR is pretty much a must
At the time of this writing, Intel's Arc A770 has been on the market for about a year, and I have to admit, had I gotten the chance to review this card at launch, I would probably have been as unkind as many other reviewers were.
As it stands though, the Intel Arc A770 fixes many of the issues I found when I reviewed the A750, but some issues still hold this card back somewhat. For starters, if you don't enable Resizable BAR in your BIOS settings, don't expect this card to perform well at all. It's an easy enough fix, but one that is likely to be overlooked, so it's important to know that going in.
In synthetic benchmarks, the A770 performed fairly well against the current crop of graphics cards, despite effectively being a last-gen card. It's particularly strong competition for the Nvidia RTX 4060 Ti across multiple workloads, and it even beats the 4060 Ti outright in a couple of tests.
Its Achilles' heel, though, is revealed in the PassMark 3D Graphics test. Whereas 3DMark tests DirectX 11 and DirectX 12 workloads, PassMark's test also runs DirectX 9 and DirectX 10 workloads, and here the Intel Arc A770 simply can't keep up with AMD and Nvidia.
In non-ray-traced and native-resolution gaming benchmarks, the Intel Arc A770 managed to put up some decent numbers against the competition. At 1080p, the Arc A770 manages an average of 103 fps with an average minimum fps of 54. At 1440p, it averages 78 fps, with an average minimum of 47, and even at 4K, the A770 manages an average of 46 fps, with an average minimum of 27 fps.
Turn on ray tracing, however, and these numbers understandably tank, as they do for just about every card below the RTX 4070 Ti and RX 7900 XT. Still, even here, the A770 manages an average of 41 fps (with an average minimum of 32 fps) at 1080p with ray tracing enabled, which is technically still playable performance. Once you move up to 1440p and 4K, however, your average title isn't going to be playable at native resolution with ray tracing enabled.
Enter Intel XeSS. When set to "Balanced," XeSS turns out to be a game changer for the A770, delivering an average framerate of 66 fps (average minimum 46 fps) at 1080p, 51 fps (average minimum 38 fps) at 1440p, and 33 fps (average minimum 26 fps) at 4K with ray tracing maxed out.
While a 26 fps average minimum at 4K means games really aren't playable at that resolution even with XeSS turned on, settings tweaks or more modest ray tracing could probably bring that up into the low-to-high 30s, making 4K games playable on this card with ray tracing turned on.
That's something the RTX 4060 Ti can't manage thanks to its smaller frame buffer (8GB of VRAM), and while the 16GB RTX 4060 Ti could theoretically perform better (I haven't tested the 16GB model, so I can't say for certain), it still has half the memory bus width of the A770, leading to much lower bandwidth for larger texture files to pass through.
This creates an inescapable bottleneck that the RTX 4060 Ti's much larger L2 cache can't adequately compensate for, taking it out of the running as a 4K card: when tested, very few games maintained playable frame rates even without ray tracing unless the settings were dropped so low it wasn't worth the effort. The A770 16GB, meanwhile, isn't technically a 4K card either, but it can still dabble at that resolution with the right settings tweaks and look reasonably good.
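To put the bus-width point in numbers, here's a rough comparison. The A770 16GB figures come from this review; the RTX 4060 Ti figures (128-bit bus, 18 Gbps GDDR6) are Nvidia's published specs quoted from memory, so treat them as an assumption:

```python
# Raw memory bandwidth comparison: bus width matters more than per-pin speed here.
# A770 16GB figures are from this review; RTX 4060 Ti figures (128-bit, 18 Gbps)
# are Nvidia's published specs, quoted from memory -- an assumption, not from the text.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8   # GB/s

a770_16gb = bandwidth_gbs(17.5, 256)     # 560.0 GB/s
rtx_4060_ti = bandwidth_gbs(18.0, 128)   # 288.0 GB/s
print(a770_16gb / rtx_4060_ti)           # ~1.94x the raw bandwidth
```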
All told, then, the Intel Arc A770 turns out to be a surprisingly good graphics card for modern gaming titles that can sometimes even hold its own against the Nvidia RTX 4060 Ti. It can't hold a candle to the RX 7700 XT or RTX 4070, but it was never meant to, and given that those cards cost substantially more than the Arc A770, this is entirely expected.
Its maximum observed power draw of 191.9W is pretty high for the kind of card the A770 is, but it's not the most egregious offender in that regard. All that power meant keeping it cool was a struggle, with its maximum observed temperature hitting about 74ºC.
Among all the cards tested, the Intel Arc A770 sat near the bottom of the list alongside the RX 6700 XT, so the picture for this card might have been very different had it launched three years ago, when it would have competed exclusively with the RTX 3000-series and RX 6000-series. In the end, this card performs like a last-gen card, because it is one.
Despite that, it still manages to be a fantastic value on the market right now given its low MSRP and fairly solid performance, rivaling the RTX 4060 Ti on the numbers. In reality though, with this card selling for significantly less than its MSRP, it is inarguably the best value among midrange cards right now, and it's not even close.
Performance score: 3.5 / 5
Should you buy the Intel Arc A770?
Buy the Intel Arc A770 if...
Don't buy it if...
If my Intel Arc A770 review has you considering other options, here are two more graphics cards for you to consider.
How I tested the Intel Arc A770
I spent several days benchmarking the card, with an additional week using it as my primary GPU
I ran our standard battery of synthetic and gaming benchmarks
These are the specs for the test system used for this review:
CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
Motherboard: MSI MPG Z790E Tomahawk Wifi
Memory: 64GB Corsair Dominator Platinum RGB DDR5-6000
SSD: Samsung 990 Pro
PSU: Thermaltake PF3 1050W ATX 3.0
Case: Praxis Wetbench
I spent about two weeks with the Intel Arc A770 in total, with a little over half that time using it as my main GPU on my personal PC. I used it for gaming, content creation, and other general-purpose use with varying demands on the card.
I focused mostly on synthetic and gaming benchmarks since this card is overwhelmingly a gaming graphics card. Though it does have some video content creation potential, it's not enough to dethrone Nvidia's 4000-series GPUs, so it isn't a viable rival in that sense and wasn't tested as such.
I've been reviewing computer hardware for years now, with an extensive computer science background as well, so I know how graphics cards like this should perform at this tier.
We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained - regardless of when a device was released, if you can still buy it, it's on our radar.
A luxury headset that makes use of AI features seems like an obvious concept now, but it still surprised me to see the Logitech Zone Wireless 2 take full advantage of modern advancements and implement AI in a mostly successful way. According to Logitech, the AI tech was built into the hardware from the ground up, which is apparent from how every aspect of this headset revolves around it.
It's not to the headset's detriment either, and it only occasionally feels gimmicky. ANC is run by AI, and you're able to adjust between several different levels or turn it off completely. Plus, a feature called Personal EQ can adjust the sound to your hearing through a brief set of questions in the app.
Beyond traditional noise canceling, there are other interesting variations. For instance, when making phone calls, you can not only cancel out background noise on your end, but the AI can also recognize the other caller's voice and filter it out from their background noise.
There's also a slew of other AI-powered quality-of-life tools, like connecting to up to two devices via Bluetooth and switching between them on the fly. And if you're using the wireless dongle instead, a feature called Smart Enumeration keeps audio from playing out of a connected device that isn't in use.
If you remove the headset while listening to audio, it automatically pauses; putting it back on resumes playback, and tilting either earcup mutes the audio. There's also an option to automatically answer a phone call by putting on the headset when you're connected to a smartphone.
There are health and safety options too, like anti-startle protection that limits sudden high-pitched noises as well as noise exposure control that measures daily noise levels in a call and ensures it doesn’t exceed a certain amount.
So how do all these AI tools measure up? Pretty well but not perfect.
At the very least, this headset could easily land among the best wireless headphones, though not take the top spot. The auto-pause feature is good, except when it doesn't pause because it fails to recognize the headset coming off, or when it stays paused even after I put it back on. The tilt-to-mute feature is very spotty too: I found it either doesn't register a deliberate tilt, or it's overly sensitive and mutes at a slight nudge.
The noise-canceling is probably the only feature that works just as promised, with every ANC setting reducing outside and background sounds to a near-perfect degree. It still shocks me how instantaneous the effect is.
The Logitech Zone Wireless 2 has lovely audio quality, with a great soundscape that handles a wide range of highs and lows. Even the bass is robust and loses very little quality at max volume; I could suitably feel it in my teeth, which is a great sign to me. It's also flexible enough to handle audio from video games, movies, music streaming services, and more.
I love that, for a headset with an adjustable microphone, the 90-degree swivel on the earcups makes it completely ambidextrous. Unfortunately, the build quality is a little disappointing for such an expensive product. It's admirable that the headset is made of 22% recycled plastic and low-carbon aluminum, but the seams of the headband cover split away a little from the headband when I pulled on it.
Of course, not many people will stretch the headband to that extent, but when I compare it to the Razer Barracuda Pro, which sits at a similar price point even if it doesn't quite make our best PC gaming headsets list, the Zone Wireless 2 doesn't measure up in build quality. To offset this, Logitech does let buyers completely repair and replace parts on their own, something that should be standard for any of these devices.
One advantage it does have over other headsets is just how light and comfortable it feels. The ear cushions and headband cover are clearly made of memory foam and, coupled with the light weight, make it feel like I'm wearing a cloud.
Another issue is the battery, which lasts up to 40 hours of listening with ANC off, up to 22 hours with ANC on, and offers a talk time of 15 hours with ANC on or up to 18 hours with ANC off. That's not amazing. At best, you're required to charge it every two days of regular use, which can get a bit cumbersome. It gets worse: I noticed that when the power dips below 40%, the volume and audio quality dip as well, as if some background battery-saver mode is activating.
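The "every two days" figure implies fairly heavy daily use; here's a rough sketch of days-per-charge from the quoted battery figures, where the hours-per-day of use is my own assumption for illustration, not a number from this review:

```python
# Rough days-between-charges estimate from the battery figures quoted above.
# hours_per_day is an assumed usage level, not a figure from the review.
battery_hours = {"listening, ANC off": 40, "listening, ANC on": 22, "calls, ANC on": 15}
hours_per_day = 10  # assumed heavy all-day use

for mode, hours in battery_hours.items():
    print(f"{mode}: ~{hours / hours_per_day:.1f} days per charge")
# With ANC on, that's ~2.2 days per charge -- consistent with
# needing to charge every couple of days of regular use.
```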
Next to the Razer Barracuda Pro's 40-hour battery life, which lets you drain every last drop of power while maintaining perfect audio quality, Logitech's offering pales in comparison.
Logitech Zone Wireless 2: Price & availability
How much does it cost? $299.99 / £299.99 / AU$499.95
When is it available? October 2023
Where can you get it? Available in the US, UK, and Australia
The Logitech Zone Wireless 2 will be available in the US, UK, and Australia in October 2023 for an MSRP of $299.99 / £299.99 / AU$499.95.
The price point is quite steep, putting it firmly in the luxury headset market. That on its own isn’t bad, as there’s plenty of tech and development that’s gone into it. But considering that the build quality isn’t as high as similar headsets like the Razer Barracuda Pro and Logitech’s own Pro X 2 Lightspeed, the MSRP sticks out like a sore thumb.
Value: 4 / 5
Logitech Zone Wireless 2: Specs
Should you buy the Logitech Zone Wireless 2?
Buy it if...
You want a headset with AI features The AI features are quite good, elevating the noise canceling to greater heights and adding cool abilities like smart pausing.
You want a light and comfortable headset One of the best features of this headset is that, unlike many bulkier headphones, it's super light and the memory foam fits your ears like a cloud.
Don't buy it if...
You need a long-lasting battery Battery life isn't bad, but you'll find it draining faster than you realize, needing a charge every couple of days of regular use.
You're on a tight budget This is an extremely pricey headset, so if you're on a budget there are plenty of cheaper alternatives.
Logitech Zone Wireless 2: Also consider
How I tested the Logitech Zone Wireless 2
I spent about a week testing this headset
I tested it for productivity work, gaming, phone calls, and music
I used it extensively in a home office environment and outdoors
I tested the Logitech Zone Wireless 2 in a home office environment, as well as in high-volume areas, like public transit and parks, to see how well the noise-canceling worked. I also tested out the various AI and ANC features to check for effectiveness and reliability.
The Logitech Zone Wireless 2 is a Bluetooth-compatible headset that's meant for extensive use over a period of years. I made sure to quality-test it to see if it held up to those standards while maintaining maximum comfort levels.
I've tested headsets including gaming ones, and understand how to properly rate and test them out to ensure that they reach a certain level of quality.
Samsung unveiled the Exynos 2400 chipset with a claimed 14.7x boost in AI performance, and we expect the platform to debut with the Galaxy S24 series. According to leaker Ice Universe, the SoC will bring the biggest update yet in terms of artificial intelligence, with "many new AI features" and a virtual assistant "more powerful than Bixby."
The leaker claimed that the Samsung Galaxy S24 will bring an increased focus on AI. The series will introduce One UI 6.1 with a vastly improved Bixby virtual assistant and "the largest AI update in history."
The Galaxy S24 smartphones will have a...