Awara Natural Hybrid Mattress review: sustainable slumber at a fantastic value
10:54 am | November 5, 2023

Author: admin | Category: Computers Gadgets Health & Fitness Mattresses Sleep | Tags: | Comments: Off

Awara mattress review: Two-minute review

The Awara Natural Hybrid mattress is something of an anomaly among latex beds due to its affordable price. Most of today's best organic mattresses are expensive, but the Awara consistently sits in the mid-range price bracket – a queen size goes for less than $950 during regular sales. But does this affordable natural mattress have a glaring compromise in quality? Quite the contrary – it's an impressively durable bed.

In January 2022, I slept on a twin Awara mattress and assembled a panel of five diverse testers to help me assess its features. Our collective verdict? The Awara ranks among the best mattresses for those who favor a firmer sleeping surface with gentle pressure relief. My full review is below but if the Internet has spoilt your attention span, here's the two-minute version...

Awara mattress on a twin platform bed

(Image credit: Future / Alison Barretta)

The Awara is a mattress in a box constructed of 8-inch springs, two inches of Rainforest Alliance-certified Dunlop latex, and a blend of organic cotton and New Zealand wool on top. Setup is seamless – and four side handles make moving the mattress a much less taxing task. The Awara boasts a number of highly-regarded third-party environmental and safety certifications to add to its eco-friendly cred. 

When I first lay on the Awara, I was surprised by its firmer surface, but its latex comfort layer immediately molded to the shape of my body for ample support and just enough pressure relief, whether I rested on my side, stomach, or back. Everyone in my testing panel found it comfortable, but side sleepers who crave more cushioning, as well as sleepers under 130lbs, may find it too unyielding (the best mattresses for side sleepers tend to be on the softer side, with plenty of contouring). 

Good news if you're prone to overheating at night (like me): the Awara is one well-ventilated mattress. It's not a specialty cooling mattress, but latex, cotton, and wool are some of the most breathable materials on the planet. The individually wrapped coils help keep the air flowing, too. 

Edge support is excellent so you can sprawl out or sit on the sides or corners without fear of falling off the bed. However, Awara's one area of weakness is motion isolation. The buoyant latex and springy coils make for a bouncy, responsive bed. Couples will be more inclined to feel each other's movements, which could lead to frequent and unpleasant nightly wakeups. On the other hand, solo sleepers who switch positions during the night will love it.

The amenities are impressive. Awara includes a one-year sleep trial plus a forever warranty with purchase. Returns are also free, and the brand will help you donate the mattress to charity or dispose of it responsibly. Among current Black Friday mattress deals, Awara's is already one of the best out there, with discounts of up to $765 bringing the price to historical lows. Given the effects of inflation over the last several years, that's a rare sight.

Awara mattress review: Materials & design

  • A 10-inch hybrid mattress with three layers
  • Uses Rainforest Alliance-certified Dunlop latex
  • Includes four side handles for easy moving

There are three primary layers that make up the Awara Natural Hybrid mattress: a sturdy base of 8-inch individually wrapped coils, a 2-inch comfort layer of Rainforest Alliance-certified Dunlop latex, and a soft cover that's a blend of organic cotton and New Zealand wool. Combined, these layers offer a responsive and breathable sleep surface, with gentle contouring to ease your joints. Latex is often used in organic mattresses as a natural alternative to synthetic foams (see how the two compare in our memory foam versus latex mattress explainer). Bonus: latex is hypoallergenic so it's also great for sleepers with asthma or airborne allergies.

Setup is simple – just remove it from the box, unroll it on your bedframe, and remove the plastic (a process made easier thanks to the included credit card-sized cutter). Everything is structurally kept in place via a shift-resistant bottom cover. Four reinforced side handles will make the mattress much easier to move, which will be useful if you move house often.

Awara's commitment to producing an eco-friendly bed is highlighted by its array of environmental certifications, which include the aforementioned Rainforest Alliance, Standard 100 by OEKO-TEX, UL GREENGUARD Gold, and the Forest Stewardship Council. These standards ensure that the Awara's materials are sustainably sourced, low in volatile organic compounds (VOCs), and free from toxic chemicals. The Awara is also a fiberglass-free mattress; it uses a chemical-free flame retardant.

  • Design score: 4.5 out of 5

Awara mattress review: Price & value for money

  • Never sold at MSRP, sits in the mid-range price bracket
  • One of the cheapest and best value natural mattresses around
  • Full year's trial and forever warranty are very generous

Like many bed brands, Awara runs a perpetual discount; you'll never have to pay full MSRP. Based on the regular discounted price, the Awara Natural Hybrid sits in the mid-range price bracket, with a queen size costing around $999. That makes it one of the cheapest natural mattresses around, and excellent value for money.

Deals on the Awara don't fluctuate as regularly as they do with other sleep brands, but if an especially good price does appear, it'll be during the Black Friday mattress deals in November. 

Awara is the natural mattress brand within the Resident Home umbrella, which also includes mattress heavyweights Nectar and DreamCloud. As such, you'll get basically the same, ultra-generous package of extras, including a full year's trial period and forever warranty. All of Resident's brands shine when it comes to value for money.

Awara mattress review: Comfort & support

  • A firm (8 out of 10) mattress with subtle contouring
  • Offers ample support and comfort for most sleepers
  • Side sleepers and petite individuals may want a plusher bed

In addition to myself – a 5-foot-4, 140lb side/stomach sleeper with back pain – I also asked five other adults to sleep on the Awara mattress. We have diverse body types and sleep preferences, which afforded me a broader look at how well this organic hybrid mattress performs.

Awara calls its mattress 'luxury firm,' or a 7 out of 10 on the firmness scale – but my group collectively rated it an 8 out of 10. While some of us initially found it a bit too unyielding, we appreciated how quickly the Dunlop latex subtly contoured to our bodies, offering just enough pressure relief without significant sinkage. 

Mattress tester lying on her side on the Awara mattress

(Image credit: Future / Alison Barretta)

My lone back-sleeping participant said he felt like he was floating on top of the mattress yet adequately supported. Meanwhile, the side sleepers in my panel (myself included) were comfortable on the Awara – despite its firmer-than-average surface, all of us felt just enough give along our shoulders and hips. Even the pregnant side sleeper in my group liked how the Awara gently cradled her belly.

As the only combi sleeper among all the testers, shifting from my side to my stomach was effortless thanks to the responsiveness of the Awara's Dunlop latex and springs. Plus, when resting on my front I didn't feel my pelvis dip below the rest of my body, helping me avoid my nagging lower back pain.

Pressure relief test using a 50lb weight on the Awara mattress

(Image credit: Future / Alison Barretta)

To further test the Awara's pressure relief, I placed a 50lb weight at the center of the mattress. This created a minimal dip (about an inch), and the bed quickly returned to form once I removed the weight. This assessment aligns with the minimal sinkage we human testers experienced.

While everyone in my testing group found the Awara's comfort to their liking, side sleepers who prefer a plusher feel and smaller-framed people who weigh under 130lbs might think it's too firm. For them, a memory foam mattress (or a memory foam hybrid) may be a better fit. 

  • Comfort score: 4.5 out of 5

Awara mattress review: Performance

  • Excellent temperature regulation – good for hot sleepers
  • Too much motion transfer so not ideal for couples
  • Edges are sturdy for sitting and sprawling

I slept on a twin Awara Natural Hybrid Mattress for one month, during which I tested it in all key areas of performance according to TechRadar's mattress methodology. Here's how it fared...

Temperature regulation

I slept on the Awara mattress in the wintertime, so the real test here was to see how well it could regulate my body temperature upon cranking the heat and layering the fabrics. (I'm also prone to overheating at night, regardless of the season.)

Between latex's natural ability to draw away heat and the airflow created by the layer of coils, I didn't break a sweat once and remained perfectly cozy, even on the coldest nights. The organic cotton and New Zealand wool cover was not only lusciously soft, but it did a stellar job of wicking away moisture, too.

The Awara may not be a proper cooling mattress, but given the breathability of its materials, I think it's a sound choice for sleepers who don't want night sweats or hot flashes to keep them up at night. 

  • Temperature regulation score: 4 out of 5

Motion isolation

The Awara is a remarkably bouncy, responsive mattress. This appeals to me as a solo sleeper who switches positions at night. However, anyone who shares a bed with their partner, kids, and/or pets will feel less enthused. 

To gauge the Awara's motion isolation on my twin-size bed, I conducted a drop test using an empty wine glass and a 10lb weight. Mimicking the actions of a restless partner or a lively pet, I simulated three common bed disturbances: tossing and turning, getting in and out of bed, and jumping on the bed. I dropped the weight from 4, 8, and 12 inches above the bed to represent each scenario, respectively, and observed the effect on the wine glass, which sat roughly 25 inches away.

Awara mattress drop test for motion isolation with a 10lb weight and empty wine glass

(Image credit: Future / Alison Barretta)

The wine glass wobbled but didn't topple, though I did notice the surface dip slightly beneath it. More telling was the weight itself, which bounced several times before settling into the mattress. Given the natural buoyancy of latex, these results didn't surprise me.

Thus, the Awara isn't the best choice if you don't want to be disturbed by your partner's movements – whether they fidget a lot in their sleep or operate on a different schedule than you do. For an organic mattress with superb motion isolation, read TechRadar's Avocado Green mattress review.

  • Motion isolation score: 3 out of 5

Edge support

Sturdy edges are essential for any mattress, regardless of size. Whether you tend to roll towards the edges in your sleep or sit on the side prior to getting up out of bed, you don't want to feel as if you'll topple overboard. 

Unfortunately, some brands skimp on edge support, especially for solo sleeper beds. However, this isn't the case with the Awara. My testers and I experienced solid support whether we sat on the corners or the sides. Although the mattress did obviously compress under our weight, we never felt unstable or at risk of sliding off.

I also placed a 50lb weight along the middle perimeter, measuring about an inch of sinkage – the same amount I observed when I placed the weight at the dead center of the mattress. Ideally, the edges shouldn't dip lower than the middle, so the Awara gets a passing grade in this area.

The Awara ranks among the best mattresses I've tested for robust edge support. It's proof that stable edges are possible for even the smallest of beds.

  • Edge support score: 4.5 out of 5

Should you buy the Awara mattress?

Buy it if…

✅ You're a fan of firm beds: The Awara's firm surface will appeal to front and back sleepers – and even side sleepers who eschew overly plush beds will find it comfortably supportive.

✅ You use every inch of your mattress: The Awara's sturdy sides and corners will sufficiently accommodate those who like to sprawl out or need a stable edge to sit on. If you're prone to rolling toward the edge in your sleep, don't worry about falling overboard.

✅ You care about saving the planet (and your money): Organic mattresses often command a higher cost, but the Awara's mid-range price makes eco-friendly sleep much more accessible. Add in a year-long sleep trial and a lifetime warranty and you have a tremendous value.

Don't buy it if…

❌ You share a bed: The Awara's bouncy, responsive surface is great for solo sleepers who toss and turn at night – but this could be bothersome for couples or anyone who shares a bed with a lively pet.

❌ You like the sink-in sensation of memory foam: Sleepers seeking the deep embrace of memory foam won't find it here. The Awara's latex comfort layer imparts a firmer touch with limited contouring. TechRadar's best memory foam mattress guide provides a range of alternatives at different price points, but in the #1 spot you'll find the Nectar memory foam mattress.

❌ You weigh under 130lbs: Firmness is a matter of personal preference, but if you're a smaller-framed individual who weighs under 130lbs, the Awara's firmness and limited give might be too rigid for your liking. Our organic mattress guide has models in a range of firmness profiles, including some that fall into the 'plush' category, like the WinkBeds EcoCloud hybrid.

A tired tabby sleeps at the foot of the Awara mattress

(Image credit: Future / Alison Barretta)

How I tested the Awara mattress

I slept on a twin Awara Natural Hybrid mattress for four weeks in January 2022. Since I tested this mattress during the winter, I cranked my central heating system while layering my pajamas. I dressed the mattress in either 100% cotton or cotton/polyester bed sheets, with a mid-weight polyester comforter on top.

In addition to myself – a 5-foot-4, 140lb side/stomach sleeper – I asked five other adults to nap on the Awara mattress for at least 15 minutes in their preferred positions. These testers ranged in size from 5-foot-4 and 126lbs to 6-foot and 215lbs – and one participant was even seven months pregnant at the time.

To supplement my real world experience, I also conducted several standardized tests to objectively gauge the Awara's performance. I used a 50lb weight to evaluate pressure relief and edge support, and a 10lb weight plus an empty wine glass to observe the motion isolation.

Hisense PX2-Pro: a fantastic projector value for movie fans
11:03 pm | October 23, 2023

Author: admin | Category: Computers Gadgets Projectors | Tags: | Comments: Off

Hisense PX2-Pro: one-minute review

The Hisense PX1-Pro was one of the best ultra short throw projectors for the money, and the company’s new model, the Hisense PX2-Pro, is even better. While its $3,000 price tag makes it an expensive piece of hardware, it’s actually cheaper than its predecessor and is one of the more affordable ultra short throw (UST) projectors around. 

What you get for the money is a fantastic, compact projection system offering solid performance in a dim room and exceptional performance in a dark one. It blasts an admirable 4K picture at a range of sizes and its simply stunning color puts the Xgimi Aura and Epson EpiqVision Ultra LS800 to shame. 

You’ll want to set this projector up with a good sound system and a projection screen to make the most of it, but even on its own the PX2-Pro has everything you need to put on a movie night.

Hisense PX2-Pro review: price and release date

  • Release date: June 2023
  • MSRP: $2,999

The Hisense PX2-Pro is available now for $2,999, launching at a lower price than its predecessor, the PX1-Pro. The projector doesn’t come paired with a screen like Hisense’s Laser TV models do.

Hisense PX2-Pro top view with remote control

The PX2-Pro has cover glass, but no sliding cover to keep airborne dust and pet hair off the optics (Image credit: Future)

Hisense PX2-Pro review: Specs

Hisense PX2-Pro projector top view of laser light engine

Dolby Vision HDR is supported by the PX2-Pro (Image credit: Future)

Hisense PX2-Pro review: design and features

  • A stylish, retro-futuristic design
  • Plenty of connection options
  • Google TV with Netflix support

The Hisense PX2-Pro is as snazzy-looking as it is feature-packed. The design is largely unchanged from the PX1-Pro, so it's still every bit the retro piece of hardware it was in a past life. It also comes in tighter dimensions than the Hisense L9G and L9H models it borrows some of its hardware DNA from. This compact UST projector would look right at home next to a record player. 

The projector sits on four adjustable feet that let you get it properly level and perpendicular to the wall or projection screen you plan to project on. Focus is managed electronically, which is functional, though not as convenient as a manual focus dial. Another disappointing omission is any real cover for the optics. There’s cover glass, but that’s not quite as good as a sliding cover that can keep airborne dust and pet hair from gathering. (With a projector, it’s surprising how much impact a single hair lying across the lens can have.)

Tucked away inside this slick little number is a machine with ample capabilities. Three HDMI ports are ready to receive 4K inputs, and one can also pass along high-bandwidth Dolby Atmos audio via eARC. If you don’t plan to use eARC, you also get optical and 3.5mm analog audio outputs. There’s a high-speed USB port for powering streaming sticks or connecting external media storage. Wi-Fi 5 is a bit disappointing to see in this 4K streaming era, but the PX2-Pro includes an Ethernet jack for a more reliable connection.

While almost all of the ports are on the rear, there’s one extra USB type-A port on the left side of the projector that’s ready to serve as a trigger for other home theater devices, such as an electronic projection screen like the Akia Screens Floor Motorized Tab-Tension CineWhite.

The Hisense PX2-Pro also has capable brains, with the Google TV smart interface built in. And unlike many other projectors running Google’s operating systems, this one is actually ready to run Netflix from the jump, and even includes a shortcut to it on the remote. It appears Hisense has started to break down the walls some streaming services curiously had up against smart projectors. 

A few projectors we’ve tested with smart TV systems settle for weak hardware that’s barely up to the task of running them, resulting in a slow experience – but not this Hisense. Navigating the settings menus and pulling up streaming content is quick. During setup, Hisense provides the option to skip parts of the process – no internet connection, no Google account – which can come in handy if you want to keep things simple and plan to use the projector only with external sources.

  • Design and features score: 4.5/5

Hisense PX2-Pro onscreen image of Avatar: The Way of Water

The Hisense PX2-PRO delivers bright images with strong contrast in dark, theater-optimized viewing environments (Image credit: Future)

Hisense PX2-Pro review: picture and sound quality

  • Brightness and contrast a huge plus
  • Incredible color alongside robust HDR support
  • Modest built-in sound

As far as projectors go, the Hisense PX2-Pro is a beaut. It’s coming from a strong lineage, as the PX1-Pro put the phenomenal capabilities of its three-laser DLP projection system to work in stunning fashion. This new model just upgrades that with a bit more brightness, which makes it more watchable than ever. 

The PX2-Pro hasn’t quite closed the brightness gap with the Laser TV line, recently brought up to the L9H, which succeeds the marvelous L9G. But at $3,000, it’s a powerful value package, and one that benefits from the flexible image sizing the Laser TV line lacks.

Hisense’s projector blasts a sharp picture that comfortably stretches up to 130 inches. At that size, 4K really starts to make sense, and it doesn’t come up lacking for clarity. And as long as you can draw the shades, the brightness is more than enough for a picture that size.

Clarity and brightness being what they are here, the true star of the show is the color provided by the three-color light source. Whether it's displaying HDR content or SDR content, the PX2-Pro puts on one hell of a show, easily trouncing the color of the 3LCD Epson LS800 or the single-laser Xgimi Aura. Splashy content like Avatar: The Way of Water looks glorious while down-to-earth shows like Justified still see the lasers show their power every time the camera heads to the neon-lit interior of a bar. The Hisense also has comprehensive support for HDR, covering HDR10, HLG, and Dolby Vision (which was also added to the PX1-Pro after its launch).

The picture is one thing, but the sound is another. A pair of 15-watt speakers can pump out some volume, but they’re hardly a match for the visual capabilities of the projection system. In a 200-square-foot room, you’ll get the volume you need for a rousing time, especially when it comes to mids. But when cranking the volume up, treble becomes unpleasantly sharp and biting, while the deep bass range remains lacking no matter what. There’s just not enough here to shake your bones. The speakers also barely muster reasonable stereo separation, so the promise of Dolby Atmos falls flat. Plan to pair this projector with a cheap soundbar at a bare minimum. 

  • Picture quality score: 4.5/5

Hisense PX2-Pro review: value

  • Expensive but not at the top of the range
  • Dimmer, but fair next to competition

The Hisense PX2-Pro isn’t a cheap projector, but it’s far less expensive than some of its UST compatriots. It’s a good bit cheaper than the $3,499 (about £2,850, AU$5,035) Epson Home Cinema LS800, a model it absolutely decimates in dark-room picture quality, though it lags well behind the Epson’s brightness. It’s also much cheaper than the $5,499 (about £4,000, AU$7,500) Hisense L9H, though that model comes bundled with an ambient light rejecting screen.

This all helps make the PX2-Pro a compelling value in the UST projector realm, especially if you have a dim room. It’s got a great picture and is reasonably versatile. There are some threats to it from the portable 4K projector space, such as the JMGO N1 Ultra or even Hisense’s own C1, but the PX2-Pro is a force to be reckoned with regardless.

  • Value score: 4.5/5

Hisense PX2-Pro remote control held in reviewer's hand

The included remote control has a direct input button for the Netflix streaming app (Image credit: Future)

Should I buy the Hisense PX2-Pro?

Hisense PX2-Pro projector on stand

(Image credit: Future)

Buy it if...

Don’t buy it if… 

Also consider...

Epson LS800
The Epson LS800 uses a 3LCD laser light source to beam a stunningly bright 4,000 lumens image. This makes it a great option for daytime sports viewing and it also has good built-in sound. Here's our full Epson LS800 review.

Hisense PX2-Pro image projected onscreen of an anime movie

(Image credit: Future)

How I tested the Hisense PX2-Pro

  • Tested at home in multiple, real-world viewing conditions
  • Presented the display with a variety of media and formats
  • I have tested numerous projectors and displays over the last half-decade

I tested the Hisense PX2-Pro at home, in real-world conditions. This saw it faced with ambient light coming in from numerous windows, in-room lighting, as well as ambient noise that both the projector and speaker systems had to overcome. The projector was tested both against a bare, white wall and an Akia Screens CineWhite screen. It was presented with streamed content, HDR and non-HDR, and console gameplay. 

My testing evaluates the projector’s performance with respect to its price and competition from other models I and colleagues at TechRadar have tested.

I have been testing projectors since 2021 and displays for even longer.

First reviewed: October 2023

Hisense PX2-Pro: a fantastic projector value for movie fans
11:03 pm |

Author: admin | Category: Computers Gadgets Projectors | Tags: , | Comments: Off

Hisense PX-2 Pro: one-minute review

The Hisense PX1 Pro was one of the best ultra short throw projectors for the money, and the company’s new model, the Hisense PX2-Pro, is even better. While its $3,000 price tag makes it an expensive piece of hardware, it’s actually cheaper than its predecessor and is one of the more affordable ultra short throw (UST) projectors around. 

What you get for the money is a fantastic, compact projection system offering solid performance in a dim room and exceptional performance in a dark one. It blasts an admirable 4K picture at a range of sizes and its simply stunning color puts the Xgimi Aura and Epson EpiqVision Ultra LS800 to shame. 

You’ll want to set this projector up with a good sound system and a projection screen to make the most of it, but even on its own the PX2-Pro has everything you need to put on a movie night.

Hisense PX-2 Pro review: price and release date

  • Release date: June 2023
  • MSRP: $2,999

The Hisense PX2-Pro is available now for $2,999, launching at a lower price than its predecessor the PX1-Pro launched at. The projector doesn’t come paired with a screen like Hisense’s Laser TV models.

Hisense PX2-Pro top view with remote control

The PX2-Pro has  cover glass, but no sliding cover that can prevent dust and pet hair in the air from gathering  (Image credit: Future)

Hisense PX-2 Pro review: Specs

Hisense PX2-Pro projector top view of laser light engine

Dolby Vision HDR is supported by the PX2-Pro (Image credit: Future)

Hisense PX-2 Pro review: design and features

  • A stylish, retro-futuristic design
  • Plenty of connection options
  • Google TV with Netflix support

The Hisense PX2-Pro is as snazzy-looking as it is feature-packed. The design is largely unchanged from the PX1-Pro, so it's still every bit the retro piece of hardware that it was in a past life. The design also comes with tighter dimensions than the Hisense L9G and Hisense L9H model that it borrows some of its hardware DNA from. This compact UST projector would look right at home next to a record player. 

The projector sits on four adjustable feet that let you get it properly perpendicular and level to the wall or projection screen you plan to project on. Focus is managed electronically, which is functional, though not as convenient as a manual focus dial. Another disappointing omission is any real cover for the optics. There’s cover glass, but that’s not quite as good as a sliding cover that can prevent dust and pet hair in the air from gathering. (With a projector, it’s surprising how much impact a single hair laying across the lens can have.)

Tucked away into this slick little number is a machine with ample capabilities. Three HDMI ports are ready to receive 4K inputs, though one is also ready to pass along high-bandwidth Dolby Atmos audio using eARC. If you don’t plan to use eARC, you also get optical and 3.5mm analog audio outputs as options. There’s a high-speed USB port for powering streaming sticks or connecting external media storage. Wi-Fi 5 is a bit disappointing to see in this 4K streaming era, but the PX2-Pro includes an Ethernet jack for a better connection.

While almost all of the ports are on the rear, there’s one extra USB type-A port on the left side of the projector that’s ready to serve as a trigger for other home theater devices, such as an electronic projection screen like the Akia Screens Floor Motorized Tab-Tension CineWhite.

The Hisense PX2-Pro also has capable brains with Google TV smart TV interface built in. And unlike many other projectors running Google’s operating systems, this one actually is ready to run Netflix from the jump, and even includes a shortcut to it on the remote. It appears Hisense has started to break down the walls some streaming services curiously had up against smart projectors. 

A few projectors we’ve tested with smart TV systems tend to settle for weak hardware that’s barely up to the task of running them, resulting in a slow experience, but not this Hisense. Navigating the settings menus and pulling up streaming content is quick. During setup, Hisense provides the option to skip some of the process — no internet connection, no Google account — which can come in handy if you want to keep things simple and plan to use the projector only with external sources.

  • Design and features score: 4.5/5

Hisense PX2-Pro onscreen image of Avatar: The Way of Water

The Hisense PX2-PRO delivers bright images with strong contrast in dark, theater-optimized viewing environments (Image credit: Future)

Hisense PX-2 Pro review: picture and sound quality

  • Brightness and contrast a huge plus
  • Incredible color alongside robust HDR support
  • Modest built-in sound

As far as projectors go, the Hisense PX2-Pro is a beaut. It’s coming from a strong lineage, as the PX1-Pro put the phenomenal capabilities of its three-laser DLP projection system to work in stunning fashion. This new model just upgrades that with a bit more brightness, which makes it more watchable than ever. 

The PX2-Pro hasn’t quite split the difference between the brightness of this line and the Laser TV line recently brought up to the L9H, which succeeds the marvelous L9G. But, at $3,000, it’s a powerful value package and one that benefits from the flexible image size option that’s lacking on the Laser TV line.

Hisense’s projector blasts a sharp picture that comfortably stretches up to 130 inches. At that size, 4K really starts to make sense, and it doesn’t come up lacking for clarity. And as long as you can draw the shades, the brightness is more than enough for a picture that size.

Clarity and brightness being what they are here, the true star of the show is the color provided by the three-color light source. Whether it's displaying HDR content or SDR content, the PX2-Pro puts on one hell of a show, easily trouncing the color of the 3LCD Epson LS800 or the single-laser Xgimi Aura. Splashy content like Avatar: The Way of Water looks glorious while down-to-earth shows like Justified still see the lasers show their power every time the camera heads to the neon-lit interior of a bar. The Hisense also has comprehensive support for HDR, covering HDR10, HLG, and Dolby Vision (which was also added to the PX1-Pro after its launch).

The picture is one thing, but the sound is another. A pair of 15-watt speakers may pump out some volume, but they’re hardly a match for the visual capabilities of the projection system. In a 200 square-foot room, you’ll get the volume you need for a rousing time, especially where it comes to mids. But when cranking the volume up, treble becomes unpleasantly sharp and biting, while the deep bass range remains lacking no matter what. There’s just not enough here to shake your bones. They also hardly muster reasonable stereo separation, so the promise of Dolby Atmos is a flat one. Plan to pair this projector with a cheap soundbar at a bare minimum. 

  • Picture quality score: 4.5/5

Hisense PX-2 Pro review: value

  • Expensive but not at the top of the range
  • Dimmer, but fair next to competition

The Hisense PX2-Pro isn’t a cheap projector, but it’s far less expensive than some of its UST compatriots. It’s a good bit cheaper than the $3,499 (about £2,850, AU$5,035) Epson Home Cinema LS800, a model it absolutely decimates in picture quality in a dark room, though it lags well behind the Epson’s brightness. It’s also much cheaper than the $5,499 (about £4000, AU$7500) Hisense L9H, though that model comes bundled with an ambient light rejecting screen.

This all helps make the PX2-Pro a compelling value in the UST projector realm, especially if you have a dim room. It’s got a great picture and is reasonably versatile. There are some threats to it from the portable 4K projector space, such as the JMGO N1 Ultra or even Hisense’s own C1, but the PX2-Pro is a force to be reckoned with regardless.

  • Value score: 4.5/5

Hisense PX2-Pro remote control held in reviewer’s hand

The included remote control has a direct input button for the Netflix streaming app (Image credit: Future)

Should I buy the Hisense PX2-Pro?

Hisense PX2-Pro projector on stand

(Image credit: Future)

Buy it if...

Don’t buy it if… 

Also consider...

Epson LS800
The Epson LS800 uses a 3LCD laser light source to beam a stunningly bright 4,000 lumens image. This makes it a great option for daytime sports viewing and it also has good built-in sound. Here's our full Epson LS800 review.

Hisense PX2-Pro image projected onscreen of an anime movie

(Image credit: Future)

How I tested the Hisense PX2-Pro

  • Tested at home in multiple real-world viewing conditions
  • Presented the display with a variety of media and formats
  • I have tested numerous projectors and displays over the last half-decade

I tested the Hisense PX2-Pro at home, in real-world conditions. This saw it faced with ambient light coming in from numerous windows, in-room lighting, and ambient noise that both the projector and its speaker system had to overcome. The projector was tested both against a bare white wall and an Akia Screens CineWhite screen. It was presented with streamed content, both HDR and non-HDR, as well as console gameplay. 

My testing evaluates the projector’s performance with respect to its price and competition from other models I and colleagues at TechRadar have tested.

I have been testing projectors since 2021 and displays for even longer.

First reviewed: October 2023

Leesa Sapira mattress review 2023: a clear win for lightweight side and combination sleepers
4:00 pm | August 13, 2023

Author: admin | Category: Computers Gadgets | Tags: | Comments: Off

Leesa Sapira mattress: two-minute review

The Leesa Sapira is the bestselling mattress from the Phoenix-based brand. It’s a premium hybrid mattress with six layers: from a soft zippered cover through dense foam layers, then coils in the support layer and a base layer for durability. While the brand rates it as medium to medium-firm (6 to 8 out of 10)—which, per the official product page, 86 percent of customers agree with—I believe it’s a solid 6 out of 10, i.e., leaning closer to medium. 

As a lightweight combination sleeper (side and back) prone to side sleeping, I can attest that the Leesa Sapira was exceptionally suited to my needs. Since it offers a perfect balance of contouring and support for my body type and sleep preferences, I highly recommend it for similar sleepers. Given the limited motion transfer in our tests, it’s also a great investment for co-sleepers in those same groups who rouse easily and/or share a bed with a restless partner.

The Leesa Sapira Hybrid Mattress on a bed

(Image credit: Michele Ross)

Several design elements (across the cover, top foam layers, and coils) are meant to provide cooling properties. While the Leesa Sapira is cool to the touch and doesn’t necessarily cause overheating, sleepers who run extra hot at night may find these elements to be lacking. Similarly, despite the reinforced perimeters, the edge support isn’t great. 

These potential cons shouldn’t be deal breakers for most—though the hefty price tag may make you think twice before buying. Moreover, some competitors in the same tier provide greater value and customization to back up the cost. Be sure to read the TechRadar best mattress buying guide for greater insights.

I’ll expand on each of these points throughout this review. Ahead, learn more about the Leesa Sapira and how it performed across all major areas of performance—including pressure relief, motion isolation, edge support, cooling, and ease of setup—during my 3-week testing period.

Leesa Sapira mattress review: price

  • Premium range mattress
  • Ongoing sales can save $150 to $400 off MSRP
  • Comes with complimentary no-contact shipping and 2 pillows

No matter if you buy the Leesa Sapira at full MSRP or during one of its many promotional periods, it still comes with a luxury price tag. Leesa offers two other hybrid mattresses—the Original Hybrid and the Legend Hybrid—with the Sapira priced in between the two.

Here’s the current price list for the Leesa Sapira mattress, both at full MSRP and usual promotional pricing:

  • Twin size: MSRP $1,349 (normally sells for $1,199)
  • Twin XL size: MSRP $1,399 (normally sells for $1,249)
  • Full size: MSRP $1,699 (normally sells for $1,499)
  • Queen size: MSRP $1,999 (normally sells for $1,699)
  • King size: MSRP $2,299 (normally sells for $1,899)
  • Cal king size: MSRP $2,299 (normally sells for $1,899)

With mattress brands, there’s almost always a deal to be had, and the Leesa Sapira is no different. Time your purchase right and you can save as much as $400 off the MSRP. (Note: Total savings will largely depend on your mattress size.) Don’t forget to bookmark TechRadar’s mattress sales guide, which is regularly updated with the latest promotions.
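Those savings figures follow directly from the price list above; here’s a quick sketch of the arithmetic (prices copied from the list, not independently verified):

```python
# Usual promotional savings per size, taken from the price list above.
prices = {
    "Twin": (1349, 1199),
    "Twin XL": (1399, 1249),
    "Full": (1699, 1499),
    "Queen": (1999, 1699),
    "King": (2299, 1899),
    "Cal King": (2299, 1899),
}

# Savings = MSRP minus the usual promotional price.
savings = {size: msrp - sale for size, (msrp, sale) in prices.items()}
print(savings["Twin"], savings["Queen"])  # 150 300
print(max(savings.values()))              # 400 -- the "up to $400 off" figure
```

As you can see, the advertised "up to $400" applies only to the king and Cal king sizes; smaller sizes save $150 to $300.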

Leesa offers free no-contact delivery, as well as two free pillows, with your Sapira purchase. These perks help increase the value for what you pay. If preferred, you can also shell out $199 for in-home setup and removal of your old mattress.

Leesa Sapira mattress review: specs

Leesa Sapira mattress review: materials and design

  • Hybrid mattress with memory foam, coils, and other foams
  • Foam in comfort and recovery layers are top-notch
  • Several elements are intended to enhance breathability

The Leesa Sapira has six layers, starting with a soft zippable cover made of ultra-fine viscose and plant-based rayon fibers that are intended to wick moisture. The comfort foam layer is 1.5 inches thick with a 3-pound density; it contours to the body and is perforated with air channels to maximize airflow. The memory foam recovery layer is 1.5 inches thick with a 4-pound density, and is designed to further contour the body and alleviate pressure across the shoulders, back, and hips. (These two foam layers are denser than most competitors’ and thus enhance the value of this premium hybrid mattress.) The transition foam layer is 1 inch thick with a 2-pound density, and functions as a structural buffer above the coiled support layer.

The Leesa Sapira Hybrid Mattress on a bed

(Image credit: Michele Ross)

The responsive support layer is 6 inches thick and consists of 1,000+ pocket springs, which reduce motion transfer while aiding air circulation. The final base layer, which offers support and stability, is 1 inch thick with a 2-pound density.
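As a rough cross-check of the specs above, summing the listed layer thicknesses gives the mattress’s approximate profile (the cover is excluded, since no thickness is given for it):

```python
# Layer thicknesses in inches, as listed above (cover excluded).
layers_in = {
    "comfort foam": 1.5,
    "memory foam recovery": 1.5,
    "transition foam": 1.0,
    "pocket spring support": 6.0,
    "base": 1.0,
}

profile = sum(layers_in.values())
print(profile)  # 11.0
```

That works out to roughly an 11-inch profile before the cover, squarely in line with other premium hybrids.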

Each of the foams used in the Leesa Sapira is CertiPUR-US certified. This means that they meet precise standards for human and environmental safety, including having a low amount of volatile organic compounds (VOCs) compared to other foams lacking this certification.

Design score: 4 out of 5

Leesa Sapira mattress review: comfort

  • Medium feel that balances softness and support
  • Best for lightweight side and combination sleepers, plus select back sleepers
  • Lacks support for heavier sleepers and most stomach sleepers

Over the course of 3 weeks, I tested the Leesa Sapira for comfort and support across all sleeping positions. For reference, I’m a combination sleeper naturally prone to side sleeping (though I’m striving to sleep more on my back), and I weigh under 130 pounds.

The Leesa Sapira gently contours the body. While lying down, I felt that I was neither “sinking in” (which you may feel with softer mattresses or full memory foam mattresses) nor “bouncing up” (which you may feel with firmer innerspring mattresses).

I was pleased to discover that it was the perfect fit for my needs while side sleeping. It offered exceptional comfort and pressure relief in my neck and hips in particular, places where I always carry tension. (I had just reviewed a slightly firmer mattress without the cushion of the memory foam, and my body thoroughly enjoyed the extra cradling.) Back sleeping was just as comfortable; I felt supported and didn’t experience any discomfort sleeping in this position. Average weight sleepers who prefer these positions and medium mattresses may also take well to the Leesa Sapira.

While stomach sleeping, I didn’t feel much sagging or misalignment, in part given my relatively light body weight. However, most stomach sleepers should aim for a firm mattress (8+ on a scale of 10) for optimal comfort, support, and spinal alignment.

A kettlebell sinking into the Leesa Sapira Hybrid Mattress

(Image credit: Michele Ross)

According to Leesa, the Sapira has a medium to medium-firm feel (6 to 8 out of 10). I believe it’s closer to 6 and thus medium, based on other reviews and a kettlebell test. When I placed a 55-pound kettlebell—which mimics body weight—at the center of the mattress, I measured the sinkage at 3 inches. That’s half an inch more sinkage than I measured on a mattress I’d gauged as a 6.5 (medium to medium-firm) in a previous review.

Comfort score: 4.5 out of 5

Leesa Sapira mattress review: temperature regulation

  • Cool to the touch
  • Cooling properties aren’t too impressive given the design features
  • Not ideal for very hot sleepers

The Leesa Sapira has several elements meant to enhance breathability and airflow. There's a moisture-wicking cover, a comfort foam layer super-charged with air channels, and a support layer with pocket springs to help with circulation and keep cool.

A hand pressing down on the Leesa Sapira Hybrid Mattress

(Image credit: Michele Ross)

Despite these features and the fact that the mattress is cool to the touch, I wouldn’t say it boasts an exceptional performance for temperature regulation. I sometimes run hot at night, and did so on a few occasions during my testing period. (Note: I reviewed the Leesa Sapira during a Southern California summer with nightly temperatures averaging in the high-60s Fahrenheit, so some level of warmth was anticipated.)

That said, enhanced cooling properties in a mattress are a nice perk rather than an absolute necessity for me. Long-term hot sleepers may find greater relief from the heat with another mattress with exceptional cooling properties. Non-hot sleepers, however, should fare just fine.

Temperature regulation score: 4 out of 5

Leesa Sapira mattress review: motion isolation

  • Thick top foam layers absorb motion very well
  • Excelled in our motion transfer tests
  • Great for co-sleepers, including those who sleep light

I did another kettlebell test to see how well the Leesa Sapira absorbs motion. I placed a wine glass in the center of the mattress and dropped a 15-pound kettlebell at 4, 10, and 25 inches away. How much the wine glass moves can adequately reflect how much motion you may feel if a co-sleeper (or a pet) moves around at night. As such, motion isolation is an important performance indicator for light co-sleepers with a fidgety partner.

A wine glass, kettlebell and tape measure on the Leesa Sapira Hybrid Mattress

(Image credit: Michele Ross)

The glass moved but recovered when I dropped the kettlebell 4 inches away. It looked in danger of tipping, but held out through successive tests. It wobbled a lot less when I dropped the weight from 10 inches away, and barely moved at 25 inches. Given this impressive performance, co-sleepers should have no issue on the motion transfer front.

Motion isolation score: 5 out of 5

Leesa Sapira mattress review: edge support

  • Perimeters are reinforced in coil layer
  • Support isn’t great sitting on the side of the bed
  • Doesn’t provoke fear of rolling off

I assessed edge support with a further kettlebell test. I placed my 55-pound weight along the perimeter of the mattress. It dipped just under 3 inches, so it was close to the amount of sinkage in the middle. (Edge support is typically good if the edge sinkage matches the center sinkage.)

It didn’t perform as well with my own experience sitting on the edge of the bed. Although the perimeters are reinforced in the coil layer, I still felt that the support was lacking: basically drooping instead of lifting me up. However, I never felt in danger of rolling off while sleeping—though co-sleepers and heavier sleepers may want to consider this before buying the Leesa Sapira.

Edge support score: 3.5 out of 5

Leesa Sapira mattress review: setup

  • Mattress delivered vacuum-packed, rolled and boxed
  • Free delivery or in-home setup for $199
  • Faint off-gassing smells through the first day

Each purchase of the Leesa Sapira comes with free no-contact delivery. You can’t choose a delivery date or time with this option, but you can follow tracking details provided. You can also opt for white-glove service for $199 (available in select locations). Two people will unbox and set up the mattress as well as remove your old one. This option also comes with greater control over your delivery date, and you can choose a 4-hour window.

The Leesa Sapira Hybrid Mattress in its delivery box

(Image credit: Michele Ross)

I did the standard delivery, and I was glad that the mattress arrived directly at my unit’s doorstep, saving me the trouble of pushing it down my long hallway. The box has perforations for your hands in case you need to pull it, too. I followed the instructions on the box, placing the rolled mattress directly on the bed instead of cutting through the plastic first. Unboxing solo proved to be very challenging—even though, historically, I don’t have much difficulty unboxing other mattresses on my own. The full process took about 30 minutes and a lot of sweat. If you’re unboxing solo, proceed with caution, or consider enlisting someone else’s help.

The Leesa Sapira Hybrid Mattress in its delivery box

(Image credit: Michele Ross)

The mattress hissed once I finally unwrapped the first thick layer of plastic. It quickly expanded to near-full form once I cut through the second vacuum-wrapped layer. (Leesa says it’s ready to sleep on that night but perfect the day after.) I noticed a slight off-gassing smell from about 3 feet away for a few hours after unboxing, and could still notice it before falling asleep that night.

Setup score: 3.5 out of 5

Leesa Sapira mattress review: customer reviews

  • 4.4 out of 5 stars (average) from 2,000+ reviews on website
  • Side sleepers, back sleepers, and co-sleepers offered the most praise
  • Level of firmness, edge support, and inaccurate size were cited as complaints

I finished my Leesa Sapira review in early August 2023. At the time of writing, the mattress has a 4.4 out of 5 rating from 2,016 reviews on the brand’s product page. Based on my own experience and testing, I wasn’t surprised to see that the customers who loved the Sapira most were side sleepers, back sleepers, and co-sleepers with those preferences. Many experienced greater comfort, support, relief from aches and pains, and better sleep overall. In addition, a few dozen customers called out Leesa’s high-quality customer service. Others noted the quality materials built to last.

Customers who were less pleased with their purchase found the Leesa Sapira to be softer or firmer than expected. Firmness impressions are subjective and depend on countless individual factors, so take them with a grain of salt—or at least focus on reviews with more details for contextual clues. I also saw some reviewers call out that the edges of the bed were uneven or slanted, which I noticed with my own mattress as well.

West Elm and Pottery Barn also sell the Leesa Sapira, but reviews aren’t currently listed on either site. Home Depot also carries this model and has 13 reviews at present. Most are positive, but some customers complained about the unimpressive edge support, not being worth the cost, and issues with sizing.

Should you buy the Leesa Sapira mattress?

The Leesa Sapira Hybrid Mattress on a bed

(Image credit: Michele Ross)

If you’re like me—that is, under 130 pounds, a side sleeper, or a combination sleeper—the Leesa Sapira should definitely be near the top of your list. It perfectly fit my needs for comfort and support, two elements I find to be paramount. Light to average weight side sleepers, back sleepers, and co-sleepers are also good candidates for this model.

Most heavier sleepers (potentially excluding select side sleepers) and stomach sleepers (potentially excluding lightweight sleepers) may be better off with a firmer mattress. Sleepers who run extra hot at night may also find greater relief with another mattress, but you should be fine if temperature regulation isn’t an ongoing issue for you. Again, edge support is the least impressive aspect of the Leesa Sapira, but it’s not a major red flag.

Of course, most people will also need to factor in the price in light of the above considerations. The top foams are super dense and high quality, which partially justifies the cost. If you don’t need a ton of personalization (i.e., the option to choose how thick or firm your hybrid mattress is) and you’re fine with a medium feel and the price tag, I say the Leesa Sapira is worth a shot so long as you’re aligned with the key points shared above.

Also consider

Leesa Original Hybrid Mattress
Leesa’s Original Hybrid Mattress is a good option if you want to stick with the brand but save a bit of money. It, too, has a medium to medium-firm feel. However, its foams are less dense and it uses about 20 percent fewer coils, so it may not be as comfortable or cool as the Sapira.

How I tested the Leesa Sapira

I slept on a queen-size Leesa Sapira mattress for three weeks in July in Los Angeles, where average nightly temperatures were in the high-60s Fahrenheit. I kept my bedside window open and my ceiling fan on to invite a breeze. I used bamboo sheets and a bamboo duvet to cover an alternative down comforter. Sometimes, I slept on top of these layers to help beat the summer heat.

I ran standardized tests to objectively gauge softness, edge support, and motion isolation. These helped inform my subjective experience as a specific type of sleeper within a specific weight class, yet also shed light on the Leesa Sapira’s key performance indicators given my experience reviewing other mattresses.

Nvidia GeForce RTX 4060: the best midrange graphics card for the masses
4:00 pm | June 28, 2023

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , , | Comments: Off

Editor's Note

• Original review date: June 2023
• Launch price: MSRP at $299 / £289 / AU$479
• Lowest price now: $445.60 / £279 / AU$543

Update – April 2025: With the recent release of Nvidia's 50 series graphics cards—and the mixed reviews they've received from both reviewers and customers alike—the Nvidia RTX 4060 would look like the best graphics card to buy for upgrading your 1080p GPU under normal circumstances, but unfortunately this card is becoming increasingly hard to find on store shelves.

Worse still, those places that do still have the card in stock are selling it for much more than the card's MSRP in the US, making it a much less attractive option when the Nvidia GeForce RTX 5060 will launch at the same MSRP in a few weeks (the UK and Australia have an easier time finding it at or near RRP). Even with the current US market's price inflation, the RTX 5060 will likely sell for the same amount as the inflated RTX 4060 prices I'm seeing online today.

If you can find this card for its MSRP or less and you only want something cheap for 1080p gaming, definitely consider it, as it's one of the best cheap graphics cards you're going to find. Otherwise, I'd recommend you wait and see what the RTX 5060 looks like, not to mention the AMD Radeon RX 9060 XT and RX 9060, which are also due out in the next month or two and should be priced similarly.

Original unedited review follows...

Nvidia GeForce RTX 4060: Two-minute review

Nvidia really wants you to know that the Nvidia GeForce RTX 4060 is a card for those who are still running a GTX 1060 or RTX 2060, and it's really Team Green's best marketing strategy for this card.

To be clear, the Nvidia RTX 4060 is respectably better than the Nvidia RTX 3060 it replaces, and comes in at a lower launch MSRP of $299 (about £240/AU$450) than its predecessor. Its 1080p gaming performance is the best you're going to find under $300, and its 1440p performance is pretty solid, especially when you turn on DLSS. If you're playing a game with DLSS 3 and frame generation, even better.

Unfortunately, the card's 4K performance suffers due to the limited video memory it's working with, which is a 50% decrease from the initial RTX 3060 run's 12GB VRAM pool (though at least it doesn't go below the 8GB of the later RTX 3060s).

You also get more sophisticated ray tracing and tensor cores than those found in the Ampere generation, and this maturity shows up in the card's much-improved ray tracing and DLSS performance.

There are some added bonuses for streamers too, like AV1 support, but this is going to be a lower-midrange gamer's card, not a streamer's, and for what you're getting at the price, it's a great card.

The real problem for this card though is the Nvidia GeForce RTX 3060 Ti. For more than a year after the RTX 3060 Ti hit the scene, it topped our best graphics card list for its spectacular balance of price and performance, punching well above its weight and even outshining the Nvidia GeForce RTX 3070.

Ever since the crypto bubble popped and Nvidia Lovelace cards started hitting the shelves, last-gen Nvidia Ampere cards have absolutely plummeted in price, including the RTX 3060 Ti. You can now get the RTX 3060 Ti for well below MSRP, and even though the RTX 4060 outperforms the RTX 3060 by roughly 20%, it still falls short of the RTX 3060 Ti. So if you can get an RTX 3060 Ti at or near the RTX 4060's price, it might be a better bet. I haven't seen the RTX 3060 Ti drop that low yet, but it's definitely possible.

The RTX 3060 Ti remains competitive here largely because many of the RTX 4060's best features depend on developers implementing Nvidia's DLSS 3 technology in their games. DLSS 3 with Frame Generation is incredible in most games (though there are some latency issues to work out), but the number of games that implement it is rather small at the moment.

Many newer games will have it, but as we've seen with the recent controversy over Starfield partnering with AMD, one of the biggest PC games of the year might not have DLSS implemented at all at launch. It's a hard thing to hold against the RTX 4060 as a solid negative, since when the technology is implemented, it works incredibly well. But it's also unavoidable that Nvidia's biggest selling point of this generation of graphics cards is explicitly tied to the cooperation of third-party game developers.

With something like the Nvidia GeForce RTX 4070, DLSS 3 is a nice feature to have, but it doesn't make or break the card. With the RTX 4060, its appeal is deeply tied to whether or not you have this tech available in your games, and it seriously undercuts the card when it isn't. Its non-DLSS performance is only better than the RTX 3060 by a standard gen-on-gen uplift at 1080p, and without DLSS, 1440p gaming is possible, but will be severely hampered by the limited VRAM. 4K gaming, meanwhile, would be out of the question entirely.

All that said, the Nvidia RTX 4060 is still going to be one hell of an upgrade for anyone coming from a GTX 1060 or RTX 2060, which is really where this card is trying to find its market. RTX 3060 gamers will honestly be better off just saving up some more money for the RTX 4070 than worrying about the RTX 4060 (and you can probably skip the Nvidia RTX 4060 Ti, honestly).

If you're looking for the best cheap graphics card from Nvidia, the Nvidia GeForce RTX 4060 is probably as good as it's going to get for a while, since there have been few - if any - rumblings about an Nvidia RTX 4050 or Nvidia RTX 4050 Ti coming to the budget segment any time soon. Whether it's worth upgrading from an RTX 3060 is debatable, but if money is tight and you're looking for an upgrade from a Pascal- or Turing-era 60-series card, you'll absolutely love this card.

Nvidia GeForce RTX 4060: Price & availability

An Nvidia GeForce RTX 4060 on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? MSRP is $299 / £289 / AU$479
  • When is it out? June 29, 2023
  • Where can you get it? Available globally

The Nvidia GeForce RTX 4060 is available on June 29, 2023, for an MSRP of $299 / £289 / AU$479, which is about 10% less than the RTX 3060 was when it launched in 2021.

There is a caveat to this pricing in that there is no Nvidia Founders Edition of the RTX 4060, so it is only available from third-party partners like Asus, PNY, and others. These manufacturers can charge whatever they want for the card, so you can expect to see many of the cards priced higher than Nvidia's MSRP, but there will be those like the Asus RTX 4060 Dual that I tested for this review that will sell at MSRP.

While this card is cheaper than most, it's not the cheapest of the current generation. That would be the AMD Radeon RX 7600, with an MSRP of $269.99 (about £215 / AU$405), which still offers the best performance-to-price value of any of the current-gen cards. Still, given the actual level of performance you get from the RTX 4060, it definitely offers a compelling value over its rival cards, even if they are cheaper in the end.

Nvidia GeForce RTX 4060: Features and chipset

An Nvidia GeForce RTX 4060 on a table with its retail packaging

(Image credit: Future / John Loeffler)
  • 3rd-gen ray tracing and 4th-gen tensor cores
  • Only 8GB VRAM
  • DLSS 3 with Frame Generation under $300

In terms of specs, the Nvidia GeForce RTX 4060 is a marked improvement over the Nvidia RTX 3060 thanks to a smaller TSMC 5nm process node compared to the RTX 3060's 8nm Samsung node. It also features much faster clock speeds, with a roughly 39% faster base and boost clock speed.

You also get a faster memory speed, but a smaller VRAM pool and a narrower memory bus, so you end up with roughly 25% less memory bandwidth, which really puts a ceiling on higher-resolution performance.
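To put rough numbers on that trade-off (assuming the commonly published figures of a 192-bit bus at 15Gbps GDDR6 for the launch 12GB RTX 3060 and a 128-bit bus at 17Gbps for the RTX 4060, which aren't measurements from this review), the bandwidth arithmetic works out like this:

```python
def bandwidth_gbs(bus_width_bits: int, rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * rate_gbps

rtx_3060 = bandwidth_gbs(192, 15)  # 360.0 GB/s (launch 12GB model)
rtx_4060 = bandwidth_gbs(128, 17)  # 272.0 GB/s
print(f"{(1 - rtx_4060 / rtx_3060) * 100:.1f}% less bandwidth")  # 24.4% less bandwidth
```

The faster 17Gbps memory can't make up for losing a third of the bus width, which is where that roughly 25% bandwidth deficit comes from.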

Still, with faster clock speeds, more mature ray tracing and tensor cores, and a lower TGP than its predecessor, this is one of the most powerful and energy-efficient graphics cards in its class.

Nvidia GeForce RTX 4060: design

An Nvidia GeForce RTX 4060 on a table with its retail packaging

(Image credit: Future / John Loeffler)

There is no reference design for the Nvidia RTX 4060, since there is no Founders Edition, so the design of the card is going to depend entirely on which version you get from which manufacturer.

In my case, I received the Asus GeForce RTX 4060 Dual OC edition, which features a dual fan design and a much smaller footprint befitting a midrange card. Thankfully, the card uses an 8-pin power connector, so there's no need to fuss with any 12VHPWR adapter cables.

Image 1 of 2

An Nvidia GeForce RTX 4060 on a table with its retail packaging

(Image credit: Future / John Loeffler)
Image 2 of 2

An Nvidia GeForce RTX 4060 on a table with its retail packaging

(Image credit: Future / John Loeffler)

It comes with the three DisplayPort 1.4 and one HDMI 2.1 video outputs that are now standard on this generation of Nvidia cards, so those with one of the best USB-C monitors will once again be out of luck here.

The card is a dual-slot width, so you shouldn't have any issues getting it into a case, and it's light enough that you really should be able to get away without having to use a support bracket.

Nvidia GeForce RTX 4060: Performance

An Nvidia RTX 4060 slotted into a test bench

(Image credit: Future / John Loeffler)
  • Best-in-class 1080p gaming performance
  • Huge improvement if coming from RTX 2060 or older
Test system specs

This is the system we used to test the Nvidia GeForce RTX 4060:

CPU: Intel Core i9-13900K
CPU Cooler: Cougar Poseidon GT 360 AIO
RAM: 64GB Corsair Dominator Platinum RGB DDR5-6600MHz
Motherboard: MSI MAG Z790 Tomahawk Wifi
SSD: Samsung 990 Pro 2TB NVMe M.2 SSD
Power Supply: Corsair AX1000
Case: Praxis Wetbench

When it comes to 1080p, the Nvidia RTX 4060 offers the best gaming performance under $300.

The AMD RX 7600 gives it a run for its money in pure rasterization performance, and even manages to beat the RTX 4060 on occasion, but once you start cranking up ray tracing, the RTX 4060 absolutely pulls away from its rivals.

This is especially true when you flip the switch on DLSS, which makes 1440p gaming a very feasible option with this card. While this definitely isn't going to be one of the best 1440p graphics cards, on certain titles with certain settings, you'll be surprised what you can get away with.

Synthetic Benchmarks

When it comes to synthetic benchmarks, you get the typical blow-for-blow between Nvidia and AMD cards that we've seen in the past, with AMD outperforming on pure rasterization tests like 3DMark Time Spy and Firestrike, while Nvidia pulls ahead on ray tracing workloads like Port Royal and Speedway.

The RTX 4060 and RX 7600 are close enough in terms of raw performance that it might as well be a wash on average, but it's worth noting that the RTX 4060 is about 20% better on average than the RTX 3060. I point that out mostly to contrast it with the RTX 4060 Ti, which was only about 10-12% better than the RTX 3060 Ti on average.

A 20% improvement gen-on-gen, on the other hand, is much more respectable and justifies considering the RTX 4060 as an upgrade even with an RTX 3060 in your rig. You might not actually make that jump for an extra 20% performance with this class of GPU, but it's at least worth considering, unlike with the RTX 4060 Ti.
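The gen-on-gen comparisons above reduce to a simple percentage uplift over a baseline score. A minimal sketch, using illustrative scores rather than the actual benchmark results:

```python
def uplift_pct(new_score: float, old_score: float) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score / old_score - 1) * 100

# Illustrative scores only (not measured results): a card averaging 120
# against a predecessor averaging 100 is a 20% gen-on-gen uplift.
print(round(uplift_pct(120, 100)))  # 20
print(round(uplift_pct(111, 100)))  # 11 -- the RTX 4060 Ti's weaker 10-12% range
```

Framed this way, the gap is easy to see: the RTX 4060's roughly 20% average uplift is nearly double the 10-12% the RTX 4060 Ti managed over its own predecessor.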

Gaming Benchmarks

Where the RTX 4060 really takes off though is in gaming performance. Compared to the RX 7600, it's more or less even when just playing at 1080p with max settings without ray tracing or upscaling. Notably, the RTX 4060 actually underperforms the RX 7600 by about 9% in Cyberpunk 2077 when you're not using ray tracing or upscaling.

Crank ray tracing up to Psycho in Cyberpunk 2077, though, and the value of the RTX 4060 really starts to show. The RX 7600 absolutely tanks when RT is maxed, but that's not universal across the board; in other games, the RX 7600 is competitive, though Cyberpunk 2077 really is AMD's Achilles' heel. Meanwhile, the RTX 3060 holds up fairly well in some titles, while the RTX 4060 pulls ahead by a substantial amount in others.

With upscaling turned on, the RTX 4060 manages to substantially outperform both the RTX 3060 and the RX 7600. Leaving the base DLSS settings alone and not touching frame generation, the RTX 4060 pulls off a clean win in Cyberpunk 2077; and where it posts a slightly lower average framerate than the RTX 3060, its higher minimum framerate still makes for a much more stable experience across the board.

Once you turn on frame generation though, things swing dramatically in the RTX 4060's favor. You can even increase the resolution in Cyberpunk 2077 to 1440p with Frame Generation on and you'll get more fps on average and at a minimum than you would with the RTX 3060 at 1080p, while the RX 7600 simply can't keep up at this level.

Unfortunately, a lot of this is dependent on developers implementing Nvidia's new technology. Without DLSS 3 with Frame Generation, you still get respectably better performance than the RTX 3060, but nothing that absolutely blows you away.

Meanwhile, the RX 7600 offers a compelling alternative if you're looking to save some money and don't care about 1440p or ray tracing.

Still, if you can toggle a setting and give yourself an extra 50 fps on a demanding game, there really is no comparison, and on this alone, the RTX 4060 wins out by default.

Should you buy the Nvidia GeForce RTX 4060?

A man's hand holding up the Nvidia RTX 4060

(Image credit: Future / John Loeffler)

Buy it if...

You want the best 1080p gaming under $300
This card is a 1080p champ in its weight class, even if it walks right up to the line of the middle midrange.

You want fantastic ray tracing support
Nvidia pioneered real-time ray tracing in games, and it really shows here.

Don't buy it if...

You want the best value
While the RTX 4060 is very well-priced, the AMD RX 7600 offers a much better price-to-performance ratio.

You don't care about ray tracing or upscaling
Ray tracing is honestly overrated and a lot of games don't offer or even need upscaling, so if you don't care about these features, Nvidia's RTX 4060 might not offer enough for you to spend the extra money.

Nvidia GeForce RTX 4060: Also consider

AMD Radeon RX 7600
Team Red's competing midrange card is a fantastic value while offering compelling 1080p performance (so long as ray tracing and upscaling aren't your biggest concerns).

Read the full AMD Radeon RX 7600 review

Nvidia GeForce RTX 3060 Ti
With graphics card prices for the Nvidia RTX 3000-series continuing to come down, it's possible that this card might come in close to the RTX 4060's MSRP, and with its better performance, it offers a compelling alternative.

Read the full Nvidia GeForce RTX 3060 Ti review

How I tested the Nvidia GeForce RTX 4060

  • I spent about a week testing the card
  • I looked at the card's gaming performance and raw synthetic performance
  • I used our standard battery of graphics card tests and several current PC games to push the GPU to its limits

I spent extensive time testing the RTX 4060 over a number of days, using synthetic tests like 3DMark and Passmark, while also running several games on the card at different settings and resolutions.

I also tested its closest rival card as well as the card it is replacing in Nvidia's product stack and compared the performance scores across the cards to assess the card's overall performance.

I did this using the latest Nvidia and AMD drivers on a test bench using all of the same hardware for each card tested so that I could isolate the graphics card's contribution to the overall performance I found in-game or in synthetic benchmarks.

We pride ourselves on our independence and our rigorous review-testing process, offering up long-term attention to the products we review and making sure our reviews are updated and maintained, regardless of when a device was released; if you can still buy it, it's on our radar.

Read more about how we test

First reviewed June 2023

Intel Core i5-13600K: the best everyday CPU around
1:00 am | May 6, 2023

Intel Core i5-13600K: Two-minute review

The Intel Core i5-13600K follows up one of the top budget chips ever and manages to improve on just about everything across the board, except for the price.

When Intel announced its Raptor Lake processors, a lot of us were a bit dismayed that the price of the Core i5 went up by nearly 15% over the Intel Core i5-12600K that preceded it. That chip was arguably the best processor ever made for budget gaming PCs and those who need good performance without a whole lot of extras at a fair price.

At $329 (about £280 / AU$475), the Intel Core i5-13600K puts itself just outside of the budget class of processors. And that's a shame because otherwise, this is the best processor for the vast majority of people and even for a lot of those who tell themselves that they absolutely must have something more powerful like the Intel Core i7-13700K.

Across the general lineup of performance tests I threw at this chip, it pretty much came out on top in every one of them, beating out the competing AMD Ryzen 5 7600X and substantially outperforming the Core i5-12600K. Getting into the nitty-gritty, the Ryzen 5 7600X puts up a much better fight against the i5-13600K than I was expecting, fighting the 13600K to a rough draw in gaming by the end.

That does mean that if you're looking for a budget gaming CPU, you're probably going to be better off with the Ryzen 5 7600X since you can save a bit of money in the process. But that savings can easily be gobbled up and then some by the extra cost to upgrade to DDR5 RAM, which the i5-13600K still lets you skip in favor of the aging DDR4 RAM that most people still have. So there is definitely a trade-off to be made in either case.

Ultimately though, there's just no denying that the Intel Core i5-13600K has better specs and performance at this price range, give or take a little spare change. So this is a very easy processor to recommend to just about anybody who isn't a gamer or creative professional.

Intel Core i5-13600K: Price & availability

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)
  • MSRP: $329 (about £280 / AU$475)
  • More expensive than competing Ryzen 5 7600X

The Intel Core i5-13600K is on sale now for $329 (about £280 / AU$475). This puts it at about 10% more expensive than the competing AMD Ryzen 5 7600X and about 14% more expensive than the Core i5-12600K.

Considering that the Intel Core i9-13900K didn't get a price increase over its 12th-gen counterpart, the price hike here is probably the biggest disappointment with this chip. Enthusiast users are used to spending the extra money to have the best right out the gate, so they could absorb some of the price inflation rather than let it fall squarely on the one chip that most people are going to use.

This is especially bad considering that AMD's competing chip is right there for a good bit less. There are performance considerations here, obviously, and we'll get to those soon. Still, at this level, the performance difference is not so great as to really justify taking the best Intel processor in the budget class and pushing it into the lower mid-range for a few extra bucks.

  • Price score: 3.5 / 5

Intel Core i5-13600K: Chipset & features

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)
  • Overclockable
  • Supports DDR4 and DDR5

The Intel Core i5-13600K is Intel's second-gen big.LITTLE mainstream processor, following up the i5-12600K, and there have been some big improvements on the architecture side.

My test bench specs

These are the systems I used to test desktop CPU performance for both AMD and Intel systems in this review:

CPU Cooler: Cougar Poseidon GT 360 AIO
Graphics card: Nvidia GeForce RTX 4090
SSD: Samsung 980 Pro SSD @ 1TB
Power Supply: Corsair AX1000 80-Plus Titanium (1000W)
Case: Praxis Wetbench

Intel motherboard and RAM:
Motherboard: MSI MPG Z690 Carbon WiFi
DDR5 RAM: 32GB Corsair Dominator Platinum @ 5,200MHz & 32GB Kingston Fury Beast @ 5,200MHz

AMD motherboard and RAM:
Motherboard: ASRock X670E Taichi
DDR5 RAM: 32GB Corsair Dominator Platinum @ 5,200MHz & 32GB G.Skill Trident Z5 Neo @ 5,200MHz

While Intel Raptor Lake chips still use the same 10nm "Intel 7" process as the previous 12th-gen Alder Lake chips, the 13th-gen chips improve on the previous architecture in a number of key ways.

In addition to more cache memory, clock speeds have improved at the high end, so the i5-13600K runs slightly slower at base frequency while boosting slightly higher than the 12600K — though both Intel chips have a lower base and boost frequency than the competing AMD Ryzen 5 7600X.

In terms of core counts, the i5-13600K doubles the efficiency cores over the i5-12600K, for a total of 14 cores and 20 threads to the i5-12600K's 10 cores and 16 threads. This is also substantially more than the Ryzen 5 7600X, which is a straight six-core/12-thread chip with all its cores being full-power performance cores.

And while the rated 125W TDP for the i5-13600K remains the same as with the 12600K, it pulls substantially more power under load than its predecessor in my tests, so plan your build accordingly.

Finally, like its predecessor, the Core i5-13600K supports both PCIe 5.0 and DDR4 and DDR5 RAM, so you can either upgrade to new DDR5 RAM or stick with the best RAM of the DDR4 generation, which definitely helps defray the cost of an upgrade.  

  • Chipset & features score: 4 / 5

Intel Core i5-13600K: Performance

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)
  • Fantastic all around performance
  • Decent gaming chip
  • Low performance per watt rating

The Intel Core i5-13600K is the best processor all-around for most people right now, though that does come with a number of caveats.

Generally, the Core i5-13600K outperforms both the Core i5-12600K and Ryzen 5 7600X by a substantial amount, and while the Ryzen 5 7600X holds its own against the i5-13600K, it's a qualified success rather than a straightforward win.

When it comes to synthetic performance, the Intel Core i5-13600K simply overpowers both chips with more cores, faster clocks, and more raw wattage. Overall, the Core i5-13600K performs about 42% better than the Ryzen 5 7600X and about 26% better than the Core i5-12600K.

In creative workloads, the Core i5-13600K is a great option for folks on a budget who want to dabble in some creative work like 3D rendering or photo editing. But with only six performance cores, using the best graphics card possible will be far more determinative in most cases. That said, the Core i5-13600K outperforms the Ryzen 5 7600X by about 21% and the 12600K by about 12%.

In my gaming performance tests, the Ryzen 5 7600X actually scores a technical win, chalking up an extra 2 fps on average over the 13600K, but this might as well be a wash. The 13600K does manage a very solid improvement over its predecessor, though, getting as much as 34% higher fps in some titles and landing a solid 20% average performance improvement.

In the end, the Core i5-13600K outperforms the Ryzen 5 7600X by about 40%, while improving on the Core i5-12600K's performance by about 25%. As far as bottom line results go, this would make this processor a slam dunk, but one thing keeps this chip from true greatness: its power consumption.

While the 13600K has the lowest minimum power draw of the three chips tested with 1.973W (an 18% lower power consumption than the 12600K's minimum of 2.415W), it also maxes out at an astonishing 204.634W, which is about 83% more power to achieve a roughly 40% better performance.

This chip also draws 65% more power than the Core i5-12600K for roughly 25% better performance. These are hardly signs of efficiency, and it continues the exact wrong trend we saw with Intel Alder Lake. For comparison, the AMD Ryzen 9 7950X has a max power draw of 211.483W, and its 3D V-Cache variant managed an incredibly tight 136.414W max power draw in my AMD Ryzen 9 7950X3D review.
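
The percentage deltas quoted here reduce to simple arithmetic; a quick sketch using the figures from my testing (note that the Ryzen 5 7600X's max draw is inferred from the stated 83% gap rather than listed above, so treat it as an estimate):

```python
# Working through the power-draw deltas quoted above (a sketch; the
# 7600X's max draw is inferred from the stated 83% gap, not measured here).
i5_13600k_min, i5_13600k_max = 1.973, 204.634   # watts, from my testing
i5_12600k_min = 2.415                            # watts

# Minimum draw: the 13600K idles about 18% lower than the 12600K
min_saving = (i5_12600k_min - i5_13600k_min) / i5_12600k_min
print(f"{min_saving:.0%}")  # → 18%

# "83% more power" than the Ryzen 5 7600X implies a 7600X max of roughly:
ryzen_7600x_max = i5_13600k_max / 1.83
print(f"{ryzen_7600x_max:.0f}W")  # → 112W
```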

So yeah, it's not hard to put up the kind of numbers that the Core i5-13600K does when Intel turns the electron firehose to full on its processor. Considering that this is the ideal chip for a budget build, that build will now have to factor in a bigger PSU than it otherwise would, to account for bursts of power demand from a chip "rated" for 125W.

Is this a dealbreaker? Not yet, but if Intel thinks it can keep the top spot by just keeping its foot on the gas while AMD is making real investments in power efficiency within a single generation of processors, this won't be good for Intel in the long run.

  • Performance: 4 / 5

Should you buy the Intel Core i5-13600K?

An Intel Core i5-13600K

(Image credit: Future / John Loeffler)

Buy it if...

Don't buy it if...

Also Consider

If my Intel Core i5-13600K review has you considering other options, here are two processors to consider... 

How I tested the Intel Core i5-13600K

  • I spent nearly two weeks testing the Intel Core i5-13600K
  • I ran comparable benchmarks between this chip and rival processors
  • I gamed with this chip for several days

I spent an extensive amount of time testing the Core i5-13600K over the past two weeks, including using the processor in my primary work and gaming machine at home.

In addition to general work tasks and gaming, I used the processor extensively for content creation work like Adobe Photoshop, Adobe Illustrator, and Blender 3D modeling.

I also ran an extensive battery of benchmark tests on this chip and rival CPUs a customer might consider, using as close to identical hardware as possible in order to gather sufficient comparable data to determine how the chips performed in real-life and simulated workloads.

Read more about how we test

First reviewed May 2023

Nvidia GeForce RTX 4070 review
4:00 pm | April 12, 2023

Editor's Note

• Original review date: April 2023
• Launch price: MSRP at $599 / £589 / AU$1,109
• Lowest price now: $792.99 / £485.28 / AU$899

Update – April 2025: With the release of the Nvidia GeForce RTX 5070, you'd hope that it would force the price of the RTX 4070, one of the best graphics cards of the last generation, down somewhat, but that doesn't appear to be the case in the US (UK and Australian shoppers can actually find this card for less than RRP right now).

With the lowest price I've found for the RTX 4070 in the US coming in at just under $800, it's a much harder card to recommend in 2025, especially with AMD Radeon RX 7800 XT cards selling for much less and much more readily available online.

Original unedited review follows...

Nvidia GeForce RTX 4070: Two-minute review

The Nvidia GeForce RTX 4070 is here at long last, and for gamers who've been starved for an upgrade, go ahead and pick this one up. It can do just about everything.

It's hard to follow up the RTX 3070, one of the best graphics cards of all time. In our Nvidia GeForce RTX 3070 review, we praised that card for being an outstanding performer at 1080p and 1440p — where the overwhelming majority of PC gamers play — while also being a much more affordable option than the other two launch cards in Nvidia's Ampere lineup. We especially noted how the RTX 3070 offered comparable performance to the RTX 2080 Ti for half the price.

Everything we said about the RTX 3070 applies just as easily to the RTX 4070, only now it doesn't just dabble in 4K; it can competently game at every resolution, making it a graphics card that everybody can fall in love with without spending a fortune.

A lot has changed since the RTX 3070 launched towards the end of 2020, and unfortunately, not everything changed for the better. Things are more expensive pretty much everywhere you look, and the Nvidia RTX 4070 isn't immune. At $599 (about £510 / AU$870), the RTX 4070 is fully 20% more expensive than the RTX 3070 was at launch.

The Nvidia GeForce RTX 4070 graphics card standing on top of its retail packaging

(Image credit: Future / John Loeffler)

I'm not happy about this at all, and you shouldn't be either, but all you have to do is look at the scores the RTX 4070 puts up on the board and you'll be as hard-pressed as I am to dock it any points for this. It consistently puts out RTX 3080-level performance more or less across the board and even manages to bloody the nose of the Nvidia GeForce RTX 3080 Ti. And while the RTX 3080 beats out the RTX 4070 at native 4K, turn on DLSS and the RTX 3080 simply gets blown out.

On the other side of the aisle, the AMD Radeon RX 7900 XT is Team Red's nearest real competition, and it struggles to justify itself in the presence of the RTX 4070. While the RX 7900 XT solidly outperforms the 4070, it's also 50% more expensive, and the benefits of the RX 7900 XT get quickly drowned out by the power of DLSS, especially in titles with DLSS 3.

Moreover, the RTX 4070 makes for a pretty competent creator GPU, offering indie developers and artists who don't have the funding for an Nvidia GeForce RTX 4090 a handy option for getting some work done within a more limited budget. It's not going to power a major movie studio or anything, but if you're dabbling in 3D modeling or video editing, this card is a great compromise between price and performance.

Finally, wrap this all into a package that feels like a downright normal graphics card from ye olden days, back before you needed to include support brackets and ballast to keep your gaming PC from tipping over, and you end up with a graphics card that can easily power some of the best gaming PCs while actually fitting into your PC case and your budget.

This graphics card has its issues, which is inevitable, but given what's on offer here, it's easy enough to look past its shortcomings and enjoy some truly outstanding performance at a reasonable enough price.

Nvidia GeForce RTX 4070 review: Price & availability

An Nvidia GeForce RTX 4070 graphics card seated inside its retail packaging

(Image credit: Future / John Loeffler)
  • How much is it? $599 (about £510 / AU$870)
  • When is it out? April 13, 2023
  • Third-party cards' retail prices will match or exceed Nvidia's MSRP

The Nvidia GeForce RTX 4070 is available starting April 13, 2023, with an MSRP of $599 (about £510 / AU$870). Third-party partners will have their own versions of the RTX 4070 that will vary in price, but they will always have a matching or higher regular retail price than the Nvidia GeForce RTX 4070 Founders Edition.

Notably, the RTX 4070 is getting a 20% price increase over the card it's replacing, the RTX 3070, which had a launch price of $499 in the US (about £425 / AU$725). While we'd have loved to see the price stay the same gen-over-gen, this should come as no surprise to anyone who has been watching GPU price inflation recently.

The Nvidia GeForce RTX 4080, for example, has a ludicrously high MSRP of $1,199 (a roughly 72% jump over the RTX 3080), while the Nvidia GeForce RTX 4090 also increased its price over the Nvidia GeForce RTX 3090 to $1,599 from the 3090's $1,499.

Meanwhile, we haven't seen AMD's direct RTX 4070 competitor yet, the AMD Radeon RX 7800 XT, but the AMD Radeon RX 7900 XT is the closest AMD has this generation with an $899 / £799 (around AU$1,350) MSRP, putting it 50% more expensive than the RTX 4070.

This card is also the same price as the Nvidia GeForce RTX 3070 Ti, for what it's worth, and considering that the RTX 4070 punches well above the 3070 Ti's performance, you do at least get a better sense of value out of this card than anything from the last generation.

  • Price score: 4 / 5

Nvidia GeForce RTX 4070 review: Features & chipset

The power connector for an Nvidia GeForce RTX 4070 graphics card

(Image credit: Future / John Loeffler)
  • DLSS 3 with full Frame Generation
  • Third-gen Ray Tracing Cores and fourth-gen Tensor Cores
  • Lower TGP than RTX 3070

The Nvidia RTX 4070 doesn't change too much on paper over its last-gen predecessor, featuring the same number of streaming multiprocessors, therefore the same number of CUDA cores (5,888), ray-tracing cores (46), and tensor cores (184).

It does bump up its memory to the faster GDDR6X and adds an additional 50% VRAM for a total of 12GB. With a 192-bit bus and a memory clock of 1,313MHz, the RTX 4070 has an effective memory speed of 21 Gbps, equal to that of the Nvidia RTX 4070 Ti, for a memory bandwidth of 504.2 GB/s.
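
That 504.2 GB/s figure falls straight out of the bus width and the per-pin effective rate; here's a minimal sketch of the standard calculation (the 16x effective-rate multiplier on the memory clock for GDDR6X is my assumption, not something from Nvidia's spec sheet):

```python
# Sanity-checking the quoted 504.2 GB/s bandwidth figure (a sketch; the
# 16x GDDR6X effective-rate multiplier is an assumption on my part).
def memory_bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    """GB/s = (bus width in bytes) * per-pin effective rate in Gbps."""
    return bus_width_bits / 8 * effective_gbps

effective_rate = 1313e6 * 16 / 1e9  # 1,313MHz memory clock -> ~21 Gbps effective
print(f"{memory_bandwidth_gbs(192, effective_rate):.1f} GB/s")  # → 504.2 GB/s
```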

It has a lower base and boost frequency than the 4070 Ti, clocking in at 1,920MHz base and 2,475MHz boost (compared to 2,310MHz base and 2,610MHz boost for the 4070 Ti), but this is a substantial bump up from the 1,500MHz base and 1,725MHz boost frequency of the RTX 3070.

This is owing to the 5nm TSMC process used to fab the AD104 GPU, compared to the Samsung 8nm process for the RTX 3070's GA104. Those faster clocks also power next-gen ray tracing and tensor cores, so even though there are the same number of cores in both the RTX 4070 and the RTX 3070, the RTX 4070's are both much faster and more sophisticated.

Also factor in Nvidia Lovelace's DLSS 3 with Frame Generation capability, something that Nvidia Ampere and Turing cards don't have access to, and what looks like two very similar cards on paper turns out to be anything but in practice.

Finally, thanks to the 5nm process, Nvidia is able to squeeze more performance out of less power, so the TGP for the RTX 4070 is just 200W, making it a fantastic card for a lower-power, sub-600W build.
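
To see how that 200W TGP fits into a sub-600W build, here's a rough power budget (every figure besides the GPU's TGP is an illustrative assumption, not a measurement):

```python
# Rough power budget for a sub-600W build around the RTX 4070's 200W TGP
# (all non-GPU figures are illustrative assumptions, not measurements).
build_watts = {
    "RTX 4070 (TGP)": 200,
    "CPU under load (hypothetical 125W-class part)": 125,
    "Motherboard, RAM, SSD, and fans": 75,
}
total = sum(build_watts.values())
print(f"{total}W load, {600 - total}W of headroom on a 600W PSU")  # → 400W load, 200W of headroom
```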

  • Features & chipset: 5 / 5

Nvidia GeForce RTX 4070 review: Design

The RTX 4070 logo etched into the trim of the Nvidia GeForce RTX 4070 graphics card

(Image credit: Future / John Loeffler)
  • Same size as the RTX 3070
  • 16-pin power connector
  • Same design as RTX 4090 and RTX 4080

With the RTX 4070 Founders Edition, Nvidia finally delivers a next-gen graphics card that can actually fit in your case without requiring a construction winch to hold it in place.

OK, the previous cards weren't that bad, and even at the reduced form factor and weight, you'll still want to toss a GPU bracket into your case for good measure (there's no harm in protecting your investment, after all).

Image 1 of 6

An Nvidia GeForce RTX 4070 graphics card

(Image credit: Future / John Loeffler)
Image 2 of 6

An Nvidia GeForce RTX 4070 graphics card on its retail packaging

(Image credit: Future / John Loeffler)
Image 3 of 6

The output ports of an Nvidia GeForce RTX 4070 graphics card

(Image credit: Future / John Loeffler)
Image 4 of 6

An Nvidia GeForce RTX 4070 graphics card standing upright on a pink desk mat

(Image credit: Future / John Loeffler)
Image 5 of 6

An Nvidia GeForce RTX 4070 graphics card standing upright next to the RTX 3070

(Image credit: Future / John Loeffler)
Image 6 of 6

An Nvidia GeForce RTX 4070 graphics card sitting in front of a much larger Nvidia RTX 4080 graphics card

(Image credit: Future / John Loeffler)

But holding the RTX 4070 in my hand, this is the first card of this generation that doesn't feel like a piece of machinery. Even the more modestly-sized AMD Radeon RX 7900 XTX and RX 7900 XT feel substantial, while the RTX 4070 feels like an old school GeForce graphics card from a couple years back.

The RTX 4070 Founders Edition keeps the same fan design as the RTX 4090 and RTX 4080 that preceded it (a fan on the front and back), but it shrinks everything down to a dual-slot card about two-thirds the size of those monsters. The RTX 4070 also features the same outputs as previous RTX Lovelace cards (so no USB-C out), and a 16-pin power connector with an included adapter for two 8-pin leads to power the card.

With a TGP of 200W, Nvidia could theoretically have just gone with a single 8-pin connector, but Team Green seems absolutely committed to the 12VHPWR cable. I'll never stop complaining about this, but it is what it is. If you have an ATX 3.0 power supply, you won't need to worry about the adapter, but the rest of us will have to deal with additional cable management.

  • Design score: 4.5 / 5

Nvidia GeForce RTX 4070 review: Performance

An Nvidia GeForce RTX 4070 graphics card slotted into a motherboard

(Image credit: Future / John Loeffler)
  • Phenomenal gaming performance
  • Can easily push 60 fps in 4K gaming with DLSS
  • RTX 3080 performance at 60% of the power

Right out the gate, let's just say that the Nvidia RTX 4070 is the best 1440p graphics card on the market right now, and it's likely to remain at the top of that list for a good long while.

Its performance prowess isn't limited to just 1440p, mind you, and when I get into the gaming performance, you'll see that its 4K gaming potential is exciting (with caveats), but for starters, we can dig into its synthetic performance in tests like 3DMark to see how the fundamentals stack up.

General Performance

As you can see, the RTX 4070 outperforms the RTX 3070 by about 21% overall, while underperforming the RTX 3080 by about 1.37%, which is close enough to effectively tie the last-gen 4K powerhouse, and underperforms the RTX 3080 Ti by about 6%. Considering that the RTX 3080 Ti's MSRP is nearly twice that of the RTX 4070, this is an astounding result.

The RTX 4070 does lag behind the RTX 4070 Ti and the RX 7900 XT by quite a bit, averaging about 22% worse performance than the RX 7900 XT and about 13.5% worse performance than the RTX 4070 Ti. These current-gen cards also have substantially better hardware, so this isn't unexpected.

Creative Performance

When it comes to creative performance, well, we have a more limited dataset to work with since Blender Benchmark 3.5.0 decided it only wanted to test half the cards I tried to run it on (including failing to run on the RTX 4070), so we'll have to come back to that one at a later date once the benchmark is updated.

In the meantime, the tests I was able to run really showcased how well the RTX 4070 can handle creative workloads. On Adobe Premiere and Adobe Photoshop, the RTX 4070 performed noticeably better than the RTX 3080 across both apps and fell in very close behind the RTX 4070 Ti for an overall second place finish.

In lieu of Blender's benchmark, V-Ray 5 is a fairly good stand-in, as well as an excellent benchmark in its own right. Here, the RX 7900 XT wouldn't run, since it doesn't use CUDA or Nvidia's RTX, but we can see the RTX 4070 coming in as a respectable runner-up to the RTX 4070 Ti.

One of my recent favorite workloads, Lumion 12.5, renders an architectural design into either a short movie clip at 1080p or 4K at 60 fps, making it one of the best benchmarks for creatives to see how a graphics card handles production level workloads rather than synthetic tests.

It requires the same kind of hardware as many of the best PC games in order to light a scene, create realistic water effects, and reproduce foliage on trees, and it's the kind of real-world benchmark that tells more about the card than a simple number devoid of context.

Considering that a five-second, 60 fps movie clip can take an hour to render at production quality, I switched things up a bit: rather than calculate frames per second, as I do with Handbrake's encoding test, I use frames per hour to give a sense of how long a movie clip you could produce if you left it to render overnight (a common practice).
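
The frames-per-hour metric is a simple conversion; a minimal sketch (the eight-hour overnight render window is an illustrative assumption):

```python
# Sketch of the frames-per-hour metric described above (the eight-hour
# overnight render window is an illustrative assumption).
def frames_per_hour(clip_seconds: float, fps: float, render_hours: float) -> float:
    """Total frames in the finished clip divided by the hours taken to render it."""
    return clip_seconds * fps / render_hours

rate = frames_per_hour(clip_seconds=5, fps=60, render_hours=1)
print(rate)  # → 300.0 frames per hour

# At that rate, an overnight render (say, 8 hours) yields this much footage:
print(rate * 8 / 60)  # → 40.0 seconds of finished 60 fps clip
```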

In the case of the RTX 4070, it rendered a five-second movie clip at 60 fps at draft (1-star) quality 13% faster than the RTX 3080, about 7% faster than the RTX 3080 Ti, and nearly 23% faster than the RX 7900 XT.

It lagged behind the RTX 4070 Ti, though, by about 8%, a deficit that grew wider at 1080p production (4-star) quality, where the RTX 4070 rendered the movie 25% slower than the 4070 Ti and 6.78% slower than the RX 7900 XT.

For Handbrake, the RTX 4070 manages to pull out its first clean win on the creative side, though not by a whole lot. Still, 170 frames per second encoding from 4K to 1080p is not bad at all.

Overall then, the RTX 4070 puts in a solid creative performance, besting the RTX 3080, the RX 7900 XT, the RTX 3070 Ti, and the RTX 3070, while barely losing out to the RTX 3080 Ti.

Gaming Performance

An Nvidia GeForce RTX 4070 graphics card on a pink desk mat

(Image credit: Future / John Loeffler)

As good of a creative card as the RTX 4070 is, in its bones, this is a gamers' graphics card, so gaming performance is definitely where I spent most of my time testing the RTX 4070. I want to note that the included figures here are a representative sample of my testing, and that not all test results are shown.

When it comes to gaming performance, the RTX 4070 offers some of the best you're going to get at this price, though there are some stipulations to bring up right out the gate.

First, broadly speaking, this card can game at 4K on most games not called Cyberpunk 2077 or Metro: Exodus using max settings natively, so long as you keep things within reasonable limits. Or, really, one limit: keep ray tracing turned off.

Overall, the RTX 4070 gets about 58 fps on average at 4K when not ray tracing, with a floor of 45 fps, which is eminently playable. Turn ray tracing up to the max and you get an average fps of 34 with a floor of 25, which is just better than a slideshow.

The RTX 3080 doesn't fare too much better on this metric, managing 40 fps on average with a floor of 29 fps at max settings with ray tracing turned on, while the RTX 3080 Ti averages about 36 fps with a floor of 19 fps. This puts the RTX 4070 just behind the 3080 Ti in terms of average fps, and with a higher fps floor than the 3080 Ti.

If you're dead set on ray tracing, the RTX 4070 can certainly deliver thanks to DLSS, which can bump those numbers back up to 79 fps on average with a floor of 55 fps. Compare that to the RTX 3080's 80 fps average with a 58 fps floor in our tests, and the RTX 4070 can definitely go toe-to-toe with the RTX 3080 when ray tracing at max settings with DLSS on.

In addition, the RTX 4070 gets about 10% lower fps on average than the RTX 3080 Ti at 4K with ray tracing and DLSS on (79 fps to the 3080 Ti's 88 fps), and a roughly 14% lower fps floor (55 fps to the 3080 Ti's 64 fps).

Overall, the RTX 4070 manages an average 57 fps at 4K, with a floor of 41 fps, across all the settings I tested. This is about 28% lower than the RTX 4070 Ti (79 fps average, overall), about 10% lower than the RTX 3080 (63 fps average, overall), the RX 7900 XT (64 fps average, overall), and the RTX 3080 Ti (64 fps average, overall).

These numbers skew a bit against the RTX 4070, since the RTX 4070 Ti, RX 7900 XT, RTX 3080, and RTX 3080 Ti all handle native 4K gaming much better, but so few people play at native 4K that this is a fairly meaningless advantage.

Meanwhile, the RTX 4070 actually beats the RX 7900 XT by about 20% when using DLSS (versus the RX 7900 XT's FSR) at 4K with max settings and ray tracing; 79 fps on average to 66 fps on average, respectively. It also manages to strike a dead heat with the RTX 3080 (80 fps average) and come just 10% short of the RTX 3080 Ti's average RT performance at 4K with ray tracing.

It's important to note as well that these figures don't factor in DLSS 3 Frame Generation, in order to keep the comparison fair.

As for the RTX 3070, the RTX 4070 manages about 39% better average 4K performance, with a 53% higher fps floor (57 fps average with a 43 fps floor for the RTX 4070 compared to the RTX 3070's 41 fps average and 28 fps floor).

When it comes to 1440p gaming, the RTX 4070 is on much more solid footing, even if some of the bigger cards definitely perform better in absolute terms. The RTX 4070 underperforms the RTX 3080 by about 8% in non-ray-traced, non-upscaled 1440p gaming, on average (105 fps to the RTX 3080's 115 fps), though they both have a very similar floor around 80-85 fps.

Meanwhile, the RTX 4070 falls about 12% short of the RTX 3080 Ti's 119 average fps at non-ray-traced, non-DLSS 1440p.

Both the RTX 4070 Ti and RX 7900 XT kinda clobber the RTX 4070 with roughly 25-29% better performance at non-ray-traced, non-upscaled 1440p gaming, and this carries over into gaming with ray tracing settings maxed out, though the RTX 4070 is still getting north of 60 fps on average (67 fps, to be precise), with a relatively decent floor of 51 fps.

The real kicker though is when we turn on DLSS, at which point the RTX 4070 beats out everything but the RTX 4070 Ti and RTX 3080 Ti, including the RX 7900 XT, which it outperforms by about 29% on average (125 fps to 97 fps), with a much higher floor of 88 fps to the RX 7900 XT's 60 fps, a nearly 49% advantage.

The RTX 4070 also beats the RTX 3080 here too, with about 5% better performance on average and a 7.5% higher fps floor on average than the RTX 3080. Incredibly, the RTX 4070 is just 3% slower than the RTX 3080 Ti when both are using DLSS at 1440p with max ray tracing.

As for the RTX 3070, the RTX 4070 gets about 35% better performance at 1440p with ray tracing and DLSS 2.0 than the card it replaces (125 fps to 93 fps), with a nearly 53% higher fps floor on average (87 fps to the 3070's 57 fps), meaning that where the RTX 3070 is setting the 1440p standard, the RTX 4070 is blowing well past it into territory the RTX 3070 simply cannot go.
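The uplift percentages quoted throughout this review come from straightforward relative-change arithmetic. As a minimal sketch (using the RTX 4070 vs RTX 3070 figures above):

```python
def pct_uplift(new_fps: float, old_fps: float) -> float:
    """Percentage improvement of new_fps relative to old_fps."""
    return (new_fps / old_fps - 1) * 100

# Averages from the 1440p ray-traced DLSS results quoted above
avg_gain = pct_uplift(125, 93)   # RTX 4070 vs RTX 3070, average fps
floor_gain = pct_uplift(87, 57)  # RTX 4070 vs RTX 3070, fps floor
print(f"average: +{avg_gain:.0f}%, floor: +{floor_gain:.0f}%")
```

The same formula yields every head-to-head percentage in this section, give or take rounding.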

The story is pretty much the same at 1080p: there's essentially no difference between the RTX 4070, the RTX 3080, the RTX 3080 Ti, and the RX 7900 XT, with the RTX 3070 languishing about 30% behind and the RTX 4070 Ti off on its own ahead of everyone else.

There was a lot of talk ahead of the RTX 4070's launch as benchmarks leaked, with people looking at numbers out of context and downplaying the card's performance based on one or two tests. They've even pointed to the price increase to call this card a disappointment.

Granted, I'm not thrilled about the 20% price increase either, but there's no getting around the fact that you're getting a graphics card here with just 200W TGP that's putting up numbers to rival the RTX 3080 Ti. And I haven't even touched on the new features packed into Lovelace that you can't get with the last-gen Nvidia graphics cards.

The numbers are what they are, and the RTX 4070's performance is simply outstanding across every resolution in all the ways that matter.

  • Performance score: 5 / 5

Should you buy the Nvidia GeForce RTX 4070?

A man's hand holding the Nvidia GeForce RTX 4070 graphics card

(Image credit: Future / John Loeffler)

Buy it if...

You want next-gen performance for less than $600
The Nvidia RTX 4070 offers performance on par with the RTX 3080 and even the RTX 3080 Ti for a good deal less.

You don't want a massive GPU
Graphics cards are starting to resemble transformers nowadays (both the Autobot and power-plant variety), so it's nice to get a graphics card that's just normal-sized.

You want next-gen features like DLSS 3
Nvidia's hardware is often on the bleeding edge of the industry, but things like DLSS 3 and Nvidia Reflex are Nvidia's not-so-secret force multiplier here.

Don't buy it if...

You can get an RTX 3080 cheap
Generally, the RTX 4070 is going to outperform the 3080, but if you don't care about the advanced features and can grab the 3080 in a bargain bin, you could save some money.

You're looking for Nvidia's next budget card
The RTX 4070 is a lot cheaper than the rest of the current-gen graphics card lineups from Nvidia and AMD, but at $600, it's still too expensive to truly be a "budget" GPU.

Nvidia GeForce RTX 4070 review: Also consider

If our Nvidia GeForce RTX 4070 review has you considering other options, here are two more graphics cards to consider...

AMD Radeon RX 7900 XT
While its bigger sibling gets a lot more attention, don't sleep on the RX 7900 XT. It's one of the best graphics cards AMD has ever produced, and while it's a good bit more expensive than the RTX 4070, it's powerful and future-proofed enough for 8K gaming that you'll be able to get a lot of use out of this card in the long-term.

Read our full AMD Radeon RX 7900 XT review

Nvidia GeForce RTX 4070 Ti
The Nvidia RTX 4070 Ti doesn't have a Founders Edition, so it's going to be more expensive than its $799 MSRP, but the performance on offer here makes this an excellent alternative to the RTX 4070 if you've got some extra cash to spend.

Read our full Nvidia GeForce RTX 4070 Ti review

How I tested the Nvidia GeForce RTX 4070

An Nvidia GeForce RTX 4070 graphics card slotted into a motherboard

(Image credit: Future / John Loeffler)
  • I spent about 50 hours with the RTX 4070 in total
  • Besides general benchmarking, I used the card for everyday gaming and creative work
My test bench specs

Here is the system I used to test the Nvidia GeForce RTX 4070:

CPU: AMD Ryzen 9 7950X3D
CPU Cooler: Cougar Poseidon GT 360 AIO Cooler
DDR5 RAM: 32GB Corsair Dominator Platinum @ 5,200MHz & 32GB G.Skill Trident Z5 Neo @ 5,200MHz
Motherboard: ASRock X670E Taichi
SSD: Samsung 980 Pro SSD @ 1TB
Power Supply: Corsair AX1000 80-Plus Titanium (1000W)
Case: Praxis Wetbench

When I test a graphics card, I start by making sure that all tests are performed on the same test bench setup to isolate GPU performance. I then run it through a series of synthetic benchmarking tools like 3DMark as well as in-game benchmarks in the most recent PC games I can access like Cyberpunk 2077 and F1 2022.

I run everything on the maximum settings possible without upscaling tech, and I run all tests at the resolution a reader is most likely to use a given card at. In the case of the RTX 4070, this meant testing at 1080p, 1440p, and 2160p.

I also make sure to install the latest relevant drivers and rerun tests on any competing graphics card that I might have already reviewed and tested, like the RTX 4070 Ti, RX 7900 XT, and RTX 3080 to make sure that I have the most current scores to account for any driver updates. All of these scores are recorded and compared against the card's predecessor, its most direct rival, and the card directly above and below it in the product stack, if those cards are available.

I then average these scores into a final overall score and divide that by the card's MSRP to see how much performance every dollar or pound spent actually gets you, which tells me how much value the card actually brings to the table.
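That value metric can be sketched in a few lines. The overall scores below are hypothetical placeholders (only the US MSRPs are real figures); this is an illustration of the method, not the review's actual data:

```python
def perf_per_dollar(overall_score: float, msrp: float) -> float:
    """Overall benchmark score per dollar of MSRP; higher means better value."""
    return overall_score / msrp

# Hypothetical overall scores (arbitrary units) paired with real US MSRPs
cards = {
    "RTX 4070":    (120.0, 599),
    "RTX 3080":    (122.0, 699),
    "RTX 4070 Ti": (150.0, 799),
}
for name, (score, price) in cards.items():
    print(f"{name}: {perf_per_dollar(score, price):.3f} points per dollar")
```

With numbers like these, a cheaper card can come out ahead on value even while losing on raw performance.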

Finally, I actually use the card in my own personal computer for several days, playing games, using apps like Adobe Photoshop or Adobe Illustrator, and watching for any anomalies, crashes, glitches, or visual disruptions that may occur during my time with the card. Having extensively covered and tested many graphics cards over the years, I know what a graphics card should do and how it should perform, and can readily identify when something is not performing up to expectations and when it exceeds them.

Read more about how we test

First reviewed April 2023

Tempur-Pedic Tempur-Adapt mattress review: optimal relief for back sleepers
5:00 pm | March 19, 2023

Author: admin | Category: Computers Gadgets | Tags: | Comments: Off

Tempur-Adapt mattress review: two-minute review

The Tempur-Adapt mattress is Tempur-Pedic's mid-range model, though it's still quite pricey relative to many of the best mattresses on the market. (It has a starting MSRP of $1,699 for a twin.) After three weeks of sleeping on a twin-sized Tempur-Adapt, here's what I (and my 5-person testing panel) discovered about this popular memory foam mattress...

My fellow testers and I aren't habitual back sleepers, but all of us felt the most comfortable lying on our backs. In this position, we experienced the "legendary pressure relief" Tempur-Pedic promises, along with all-over support that kept us well-aligned. It was a similar experience when we slept on our stomachs, but side sleeping was a mixed bag. I found it too firm along my shoulders, while another side sleeper said it was too soft along her hips.

Tempur-Pedic Tempur-Adapt mattress against a white background

(Image credit: Tempur-Pedic)

Motion isolation is incredible. Tempur Material absorbs almost every movement with ease so couples will benefit from it. Edge support is sufficient — there was some noticeable give when we sat along the edges but none of us feared falling off the bed. Unfortunately, temperature regulation is where the Tempur-Adapt faltered. Despite measures to make my sleeping environment as pleasant as possible, I woke up warm or downright sweaty while sleeping on the Tempur-Adapt mattress.

White-glove delivery is standard, which will be a welcome perk for anyone who can't maneuver their mattress solo. The Tempur-Adapt arrives flat, ready to sleep on as soon as it's placed on your bed frame. Mattress removal is also included if you need it. Since I had no way to dispose of my previous mattress, I took advantage of this convenience.

Keep scrolling to learn more about how the Tempur-Adapt fared when it came to pressure relief, motion isolation, edge support, and temperature regulation — following TechRadar's mattress testing methodology. There are also general considerations regarding cost and value, plus ease of set-up.

Tempur-Adapt mattress review: price

  • Tempur-Pedic's mid-range model
  • Save up to $300 during rare Tempur-Pedic sales
  • Short trial and warranty, but in-home delivery is included

The Tempur-Adapt mattress is second only to the entry-level Tempur-Cloud bed-in-a-box in terms of pricing. The MSRP for a Tempur-Adapt is $1,699 for a twin and $2,199 for a queen. It's worth noting that these are lower ticket prices than we've recently seen from Tempur-Pedic — the retail price for a twin was previously $1,949 while a queen was $2,749.

Below is the official 2023 pricing for the Tempur-Adapt Mattress:

  • Twin MSRP: $1,699
  • Twin XL MSRP: $1,699
  • Full MSRP: $2,049
  • Queen MSRP: $2,199
  • King MSRP: $2,899
  • Split King MSRP: $3,398
  • Cal king MSRP: $2,899
  • Split Cal king MSRP: $3,398

Despite its mid-range status in the Tempur-Pedic mattress lineup, the Tempur-Adapt is still among the priciest memory foam mattresses on the market. Tempur-Pedic mattress sales only run during major holidays. In fact, we saw $300 off the Tempur-Adapt during Tempur-Pedic's Presidents' Day sale. That's one of the more substantial savings we've seen as Tempur-Pedic usually takes $100 to $200 off.

However, Tempur-Pedic mattresses are sold at a number of third-party retailers like Amazon, Raymour & Flanigan and Mattress Firm. You can browse their respective mattress sales for possible savings outside of shopping events, although these stores generally follow Tempur-Pedic's pricing conventions at the time. (Also, we'd recommend buying straight from the manufacturer for easier aftercare.)

For the amount of money you pay, Tempur-Pedic doesn't offer much in terms of its trial and warranty periods at 90 days and 10 years, respectively. (Plus, it costs $175 to return it.) On the other hand, in-home delivery and mattress setup are included, which is a perk most brands don't include for free, if at all.

Tempur-Adapt mattress review: specs

Tempur-Adapt mattress review: materials and design

  • An 11-inch mattress with three layers
  • Cover has a cool-to-the-touch feeling
  • Tempur Material adapts to your weight and temperature

The 11-inch Tempur-Adapt mattress contains three layers — an 8-inch polyfoam base layer, a 1.5-inch support layer of dense Original Tempur Material in the middle, and a 1.5-inch comfort layer of softer TEMPUR-ES Material on top. Together, these layers are designed to provide adequate pressure relief, support, and response time.  A hybrid upgrade is available, which adds a layer of individually-wrapped coils for more bounce.

Tempur-Adapt mattress review featuring a close-up of the knit cover

A close-up of the Tempur-Adapt's knit cooling cover (Image credit: Future / Alison Barretta)

Tempur Material features an open-cell construction to help disperse heat, along with a knit cover made from specialized yarn that gives it a cool-to-the-touch feeling. These work in tandem to regulate temperature and prevent overheating.

The cover is spot-clean only so you'll want to invest in the best mattress protector to safeguard it from spills, bed bugs, and other unpleasantries. Note that Tempur-Pedic recommends using its own-brand mattress protector to enhance the contouring properties of the Tempur Material.

Design score: 4.5 out of 5

Tempur-Adapt mattress review: comfort

  • Rated 7.5 out of 10 on the firmness scale
  • Exceptional for back sleepers
  • Won't suit all side sleepers

For three weeks, I (an average-sized side/stomach sleeper) slept on a twin Tempur-Pedic Tempur-Adapt mattress — but since comfort is subjective, I recruited a panel of five individuals to provide their perspectives by napping on it for at least 15 minutes. My group included men and women of varying heights, weights, and sleep preferences.

The Tempur-Adapt mattress comes in one firmness, which we collectively rate a 7.5 out of 10 on the firmness scale. Although one person in our group said it was soft (especially along her hips while side sleeping), the majority of us found it close to Tempur-Pedic's self-assessment of medium.

A cat helps out with the Tempur-Adapt mattress review

Note: Alex the Cat was not an official part of the testing panel. (Image credit: Future / Alison Barretta)

It took me a couple of weeks to break in the Tempur-Adapt when sleeping on my side. (It was particularly firm around my shoulders.) However, I liked resting on my stomach since the mattress gently cradled my hips while keeping them level with the rest of my body.

However, my testing group and I agree that the Tempur-Adapt is most comfortable for back sleeping. All of us felt immediate pressure relief in our backs, and regardless of our stature our body weight was evenly distributed so we were well supported. (Interestingly enough, nobody in our group is a natural back sleeper.)

I had just completed my Saatva Loom & Leaf mattress review so I was used to sleeping on a slightly softer mattress. The Tempur-Adapt felt a touch too firm for me at first, but I gradually eased into it. Meanwhile, most of my fellow testers usually sleep on a firm mattress at home and found it plusher than they're used to but still comfortable.

Testing pressure relief for the Tempur-Adapt mattress review

(Image credit: Future / Alison Barretta)

I tested the Tempur-Adapt's "legendary pressure relief" by placing a 56lb kettlebell in the middle of the mattress to simulate someone sinking into it. The weight compressed the mattress by around 2.5 inches, and it took 20 seconds for the Tempur Material to snap back into place. 

What did we human testers think? We liked the responsiveness and soft hug of the top TEMPUR-ES layer, especially those of us who experience regular aches and pains. As someone who's dealt with a recent lower back injury, I appreciated how the Tempur Material relieved pressure from my hips and lumbar.

Final verdict? The Tempur-Pedic Tempur-Adapt mattress provides the best pressure relief and all-around support for back sleepers. Stomach sleepers should also get on with it, but if you favor your side you might not find that keen balance of comfort and support. Check out any of the best mattresses for side sleepers instead.

Comfort score: 4 out of 5

Tempur-Adapt mattress review: temperature regulation

  • Sleeps hot, despite its cool-to-the-touch cover
  • Specialty bedding could help improve this

I'm prone to overheating at night yet can't help cocooning myself in covers. Thus, I was excited to put the Tempur-Adapt through its paces here, given the Tempur Material's response to temperature and the cool-to-the-touch surface.

Despite wearing lightweight pajamas and using cotton-polyester bed linens, I woke up warm to sweaty most mornings. I even transitioned from a mid-weight comforter to a lightweight crocheted blanket and still couldn't cool off.

Tempur-Adapt mattress review: feeling the cooling cover of the mattress

(Image credit: Future / Alison Barretta)

There are a couple of things to note here. First, Tempur-Pedic says its Tempur Material could lead to an increase in blood circulation, which could result in needing fewer covers. (That didn't make a difference to me, unfortunately.)

Second, Tempur-Pedic warns that mattress protectors may impede the Tempur Material from doing its thing, which is why the brand recommends using a Tempur-Pedic mattress protector. My polyester mattress protector is quite thin but it possibly affected how well the Tempur-Adapt could regulate temperature.

Still, I was not expecting this level of discomfort, especially for a premium mattress with a cooling cover.

Temperature regulation score: 2 out of 5

Tempur-Adapt mattress review: motion isolation

  • Tempur Material absorbs nearly every movement
  • An excellent choice for co-sleeping

The Tempur-Adapt may have failed the temperature regulation segment of the review, but it passed its motion isolation tests with flying colors.

A twin bed is meant to comfortably accommodate one person, so to measure motion isolation I dropped a 10lb weight next to an empty wine glass from six inches high and at varying distances away. This simulates a partner's movements as they shift positions or get in and out of bed.

Testing motion isolation with a wine glass and kettlebell for the Tempur-Adapt mattress review.

(Image credit: Future / Alison Barretta)

When I dropped the weight 25 inches away, the wine glass remained firmly in place. I counted one jiggle when I performed the same test from 12 inches away. When I released the weight from four inches away, the wine glass briefly wobbled back and forth before returning to its original position.

What does this mean? If you share a bed with a fidgety co-sleeper or someone who has a different schedule than you do, you'll barely notice a thing because the Tempur-Adapt absorbs nearly every movement. Plus, plenty of happy couples have left glowing reviews praising its low motion transfer.

Motion isolation score: 4.5 out of 5

Tempur-Adapt mattress review: edge support

  • Provides adequate edge support
  • Top cover bunches up a bit
  • No danger of falling off

Regardless of whether you sleep on a twin or a king, edge support is a key feature to consider. Not only do strong edges prevent sagging, but they can aid sleepers with mobility issues who need to sit before getting in or out of bed. (Not to mention, they lessen the fear of possibly rolling overboard while you sleep.)

Testing the edge support for the Tempur-Adapt mattress review using a kettlebell.

(Image credit: Future / Alison Barretta)

I placed my 56lb kettlebell along the Tempur-Adapt's edges, at the middle of the perimeter and at the bottom. The weight compressed the mattress two inches at either spot. It did create a slight bulge in the top layer, but the edges returned to form once I removed the kettlebell.

My fellow testers and I also sat on the center edge plus along the corners. Some of us felt more sinkage than others, but none of us felt like we were in danger of falling off. We didn't experience any issues with getting up from the mattress, either.

Based on these assessments, the Tempur-Adapt mattress has average edge support. However, like standard foam, Tempur Material has a lot of give to it so this is to be expected. Hybrid mattresses are generally a better option if you need exceptional edge support since they combine foam with reinforced coils. (The Tempur-Adapt is available as a hybrid, as well.)

Edge support score: 3.5 out of 5

Tempur-Adapt mattress review: setup

  • Arrives flat, and delivered right to your room of choice
  • Mattress removal is also included free of charge
  • No obvious off-gassing smell

There was little I had to do to set up the Tempur-Adapt mattress. Since it arrives flat, Tempur-Pedic includes free in-home delivery straight into your room of choice. All I had to do was confirm a delivery date and time.

About a week after my initial contact with a local logistics company, a couple of crew members placed a fully-formed Tempur-Adapt mattress on my platform bed. I could have slept on it right away if I wanted to but since my delivery was nice and early at 9 am on a Monday morning, I opted to wait anyway.

Tempur-Adapt mattress review, featuring the mattress on a platform bed right after setup

(Image credit: Future / Alison Barretta)

The complimentary white glove delivery is a nice perk if you live alone, are recovering from an injury, or sleep on a larger bed. (Depending on the size, the Tempur-Adapt weighs between 44lbs and 96lbs.)

Optional mattress removal is also available. I took advantage of this service since I had no way to dispose of my previous mattress. Note that you'll have to include this in your delivery notes when you make your appointment.

I didn't detect an obvious off-gassing smell from the Tempur-Adapt mattress. Tempur-Pedic uses CertiPUR-US-certified foams low in the volatile organic compound (VOC) emissions that could make your new mattress stink. (Avoiding the whole process of unfurling a vacuum-sealed foam bed may have aided in an odorless experience, too.)

Setup score: 5 out of 5

Tempur-Adapt mattress review: customer reviews

  • 4.5 out of 5-star rating on Tempur-Pedic's site
  • Praised for its contouring properties and pain relief
  • Many complain about overheating

To supplement my experience and that of my testing panel, I combed through hundreds of user reviews to provide an even greater perspective of how well this mattress performs. The Tempur-Adapt Medium mattress has a 4.5-star rating out of 5 from nearly 1,000 reviews on Tempur-Pedic's website as of March 2023. 

Those who like the Tempur-Adapt mattress say that it's helped alleviate issues like snoring and back pain. Plus, many sleepers praise its contouring properties and low motion transfer. On the other hand, some reviewers say it sleeps too hot, while several side sleepers claim it's too firm. You can filter reviews by keyword and star rating so you can find exactly what you'd like to know about this mattress.

The Tempur-Adapt mattress is sold at a selection of third-party sellers, with an average rating of at least 4 stars. However, most of those reviews appear to be sourced from Tempur-Pedic's website, so it's best to just look there.

Should you buy the Tempur-Adapt mattress?

I recommend the Tempur-Pedic Tempur-Adapt mattress for back sleepers who are seeking exceptional pressure relief and all-over support. Despite not being natural back sleepers ourselves, my testing panel and I felt the most comfortable resting in this position, followed by sleeping prone (on our stomachs). Side sleepers, on the other hand, may not get on as well with it. Although I gradually settled into the Tempur-Adapt after a couple of weeks, side sleeping wasn't the most comfortable for me on this mattress.

Co-sleepers who want a mattress that absorbs nearly every movement should also be satisfied with the Tempur-Adapt based on my testing plus the glowing reviews from content couples. Hot sleepers, meanwhile, will want to avoid sinking money into this mattress unless they're willing to pay extra for Tempur-approved bedding. (Personally, I don't know how much of a difference this would make so go for a dedicated cooling mattress instead.) 

The most important thing I took from my testing is that brand reputation alone should not dictate your mattress purchase. My expectations were admittedly quite high due to it being a premium mattress from a renowned company. Despite my issues with it, I still believe it's a well-made mattress that'll perfectly suit a certain type of sleeper. It's quite an investment, though, so try to take advantage of the rare Tempur-Pedic sale when you can.

Alternatively, for a more affordable way to bring Tempur-Pedic's signature comfort into your home, check out our Tempur-Adapt mattress topper review.

Tempur-Adapt mattress review: Also consider

Tempur-Pedic Tempur-Breeze Mattress
If you want to enjoy the cradling comfort of Tempur Material without overheating, you'll want the Tempur-Breeze, which sleeps between 3 and 8 degrees cooler, depending on the version you choose. It's loaded with cooling tech such as Tempur-CM Material (which cycles out heat and humidity) and a layer of PureCool+ Phase Change Material. The downside: it's Tempur-Pedic's priciest mattress.

Saatva Loom & Leaf Mattress
This memory foam mattress is comparable to the Tempur-Adapt in price and construction. Notable here is a gel-infused foam lumbar crown, a boon for anyone with lower back pain. Motion isolation is also excellent, but it does tend to sleep warm. White glove delivery is included, as are a one-year trial and a lifetime warranty — arguably making it a better value than the Tempur-Adapt.
Read our Saatva Loom & Leaf review

Helix Midnight Mattress
Side sleepers, this is made just for you. Despite being a hybrid, the Helix Midnight has a plusher feel than the Tempur-Adapt and provides exceptional pressure relief along the shoulders and hips. It also has impressive temperature regulation and very good motion isolation, but edge support could be better.
Read our Helix Midnight review

How I tested the Tempur-Adapt mattress

I slept on a twin Tempur-Pedic Tempur-Adapt Medium mattress for three weeks between January and February 2023. Although it's officially winter in my part of the world at this time, overnight temperatures ranged from below-freezing to unseasonably mild. I used cotton-polyester linens and alternated between a mid-weight polyester comforter and a lightweight crocheted blanket.

In addition to my own experience, I asked a 5-person panel to sleep on the Tempur-Adapt mattress in multiple positions for at least 15 minutes and sit on the edges. Participants ranged in height and weight, with our smallest tester being 5ft4in and 125lbs, and our biggest tester being 6ft and 190lbs.  

To objectively evaluate the Tempur-Adapt's firmness, edge support, and motion isolation, I performed standardized tests to gauge these features beyond my preferences and potential biases.

First reviewed: February 2023

Nvidia GeForce RTX 4070 Ti review: the next-gen Nvidia card for the rest of us
6:26 pm | January 26, 2023

Author: admin | Category: Computers Gadgets | Tags: , , | Comments: Off

Nvidia GeForce RTX 4070 Ti: Two-minute review

The Nvidia GeForce RTX 4070 Ti came onto the scene needing to score a real win if Team Green had any hope of reining in a resurgent AMD, and this is exactly the right graphics card to do that.

The RTX 4070 Ti isn't the best graphics card Nvidia's ever put out, and its launch has been somewhat overshadowed by the major stumbles Nvidia has made since Jensen Huang first announced the Nvidia Lovelace launch lineup back in September — which was really more of a brief aside during a presentation overwhelmingly devoted to getting us to care about the Omniverse, but I digress.

And, lest we forget, the RTX 4070 Ti is identical in substance to the "unlaunched" RTX 4080 12GB that was initially announced to the confusion of many, and it's not surprising that there is no Founders Edition for this card, since you can't really scratch the 8 out of "RTX 4080" and write in a 7.

What's more, the RTX 4080 that we did get is too expensive to really recommend, so it's disappointing that the RTX 4070 Ti wasn't the card to carry the 4080 brand into the next generation. It is without question the best Nvidia graphics card you can buy right now (by value) from this new generation of GPUs, and it represents a major leap forward for everyday, mainstream PC gaming. It's not without its flaws, but on balance, it's the Nvidia GPU that anyone looking to upgrade with Team Green ought to be buying unless they have a couple of thousand dollars to burn.

With an MSRP of $799 / £799 / AU$1,469, it's cheaper than the cheapest RDNA 3 GPU, the AMD Radeon RX 7900 XT, and half the price of the Nvidia GeForce RTX 4090. And while both of those cards outperform the RTX 4070 Ti in raw performance terms, there are a number of value adds for the RTX 4070 Ti that collectively make it worth major consideration regardless of its limitations.

The Nvidia RTX 4070 Ti isn't an undisputed winner in the lower-premium GPU class, but this is where Nvidia really needed to shore up its flank after AMD crushed it in my AMD Radeon RX 7900 XTX review, and to that end it is exactly what Nvidia needed right now.

Nvidia GeForce RTX 4070 Ti review: Price & availability

An Nvidia GeForce RTX 4070 Ti graphics card on a wooden table with its retail packaging

(Image credit: Future)
  • RTX 4070 Ti MSRP is the lowest of all the latest next-gen graphics cards
  • Some third-party cards can even be bought at MSRP
  • Availability is generally pretty good

The Nvidia RTX 4070 Ti, available now in the US, UK, and Australia, comes in at the lowest MSRP of any of the next-gen cards to hit the market in recent months, and it's good to see that Nvidia took a lot of the criticism about price inflation directed toward it after the Lovelace announcement to heart.

The RTX 4070 Ti, with an MSRP of $799 / £799 / AU$1,469, is $100 / £100 cheaper than the RX 7900 XT (though it is actually AU$60 more expensive in Australia), and even though there is no Nvidia Founders Edition card guaranteed to sell at MSRP, even some third-party cards can be found very close to or even matching Nvidia's MSRP.

That said, this is also much more expensive than the Nvidia GeForce RTX 3070 Ti, which had a launch MSRP of $599 / £529 / AU$959. That makes the RTX 4070 Ti $200, £270, and AU$510 more expensive than the card it is technically replacing, so we can't praise Nvidia too much for relaunching the RTX 4080 12GB (originally slated for an $899 / £849 / AU$1,659 launch price) at a somewhat lower price point.

And, of course, prices for other third-party cards may end up being substantially higher, especially for OC versions that squeeze a few hundred MHz more out of the GPU's clock speeds or special edition cards with premium design or water cooling.

Still, there's no getting around the fact that this is the cheapest next-gen card we're going to have for a while, so the RTX 4070 Ti is going to score some major points here by default.

  • Price score: 4 / 5

Nvidia GeForce RTX 4070 Ti review: Features & chipset

An Nvidia GeForce RTX 4070 Ti graphics card on a wooden table with its retail packaging

(Image credit: Future)
  • Third-gen RT cores
  • DLSS 3 with full frame generation
  • Lower TDP

The RTX 4070 Ti is built on Nvidia's new Lovelace GPU architecture, which features a significantly smaller TSMC process than the last-gen Nvidia Ampere architecture. At 4nm, as opposed to Ampere's 8nm process from Samsung, we're getting significantly faster clock speeds as well as more energy efficiency with this generation. The RTX 4070 Ti has just over twice as many transistors as the RTX 3070 Ti while packing them into a GPU die about 75% of the size of the RTX 3070 Ti's silicon, and you can see it in the RTX 4070 Ti's slightly lower TDP (285W to 290W for the RTX 3070 Ti).

The RTX 4070 Ti we reviewed, the Asus Tuf GeForce RTX 4070 Ti Gaming OC 12GB, also features a higher boost clock (though not that much higher than Nvidia's reference spec), but both the reference clocks and the actual clocks on our Asus card are nearly a full 1,000MHz faster than those of the RTX 3070 Ti, so this card is seriously fast.

As I said before, this is essentially the unlaunched RTX 4080 12GB, and so it has the same specs as that unlaunched card did, including 7,680 CUDA cores, 60 ray tracing cores, and 240 Tensor cores for doing all those tricky machine learning calculations needed to power the new DLSS 3 with full frame generation, which is as big a deal today as DLSS 2.0 was when it launched with Nvidia Ampere and made fast 4K gaming a reality for gamers around the world.

On the memory side, there is 12GB of GDDR6X VRAM, the same as in the unlaunched RTX 4080 12GB, as are both the memory clock (1,313MHz) and the 192-bit memory bus (for a total memory bandwidth of 504.2GB/s). If you're worried that 12GB might be a bit too low for a 4K graphics card, you needn't be: there is more than enough here to power a high-refresh 4K display, which is something the RTX 3070 Ti could only do on the most restrictive of settings.
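That 504.2GB/s figure follows directly from the memory clock and bus width. As a minimal sketch (the 16x multiplier is GDDR6X's effective per-pin data rate relative to the listed memory clock, e.g. 1,313MHz listed → 21Gbps effective):

```python
def gddr6x_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a GDDR6X configuration.

    Effective per-pin data rate is 16x the listed memory clock; total
    bandwidth is that rate times the bus width in bytes.
    """
    data_rate_gbps = mem_clock_mhz * 16 / 1000  # per-pin rate in Gbps
    return data_rate_gbps * bus_width_bits / 8  # bits -> bytes

print(round(gddr6x_bandwidth_gbs(1313, 192), 1))  # 504.2, matching the spec above
```

The same formula shows why the narrower 192-bit bus caps this card well below the RTX 3080's 320-bit configuration, clock-for-clock.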

That memory, though, is too little to really power 8K content at fast speeds, and even if you could get more than 60 fps at 8K in any given game, the RTX 4070 Ti is locked to 60Hz for 8K video thanks to its lack of DisplayPort 2.1 output, so you'll never be able to game faster than 60 fps at 8K.

The RX 7900 XT, on the other hand, has both the memory and the output capacity to go as high as 165Hz at 8K, though it would only really be able to natively manage more than 60 fps on an 8K display with very low-lift games like esports titles and the like.

Still, it's possible on the RX 7900 XT, and it really isn't on the RTX 4070 Ti, which is a shame. 8K gaming isn't really here yet beyond a couple of titles like Spider-Man: Miles Morales, but with higher refresh rate 8K displays set to hit the market in the next year or two, the RTX 4070 Ti feels less future-proof than a card this expensive should be.

  • Features & chipset: 4 / 5

Nvidia GeForce RTX 4070 Ti review: Design

An Nvidia GeForce RTX 4070 Ti graphics card on a wooden table with its retail packaging

(Image credit: Future)
  • No Founders Edition
  • Designs will vary, but none of them will be small
  • No USB Type-C output

The Nvidia RTX 4070 Ti we reviewed is actually a third-party card, since there is no Nvidia RTX 4070 Ti Founders Edition, but there are definitely some things that I can generalize about the design of the RTX 4070 Ti. Namely, that this is going to be a honking-big card no matter who you buy it from.

Image 1 of 3

An Nvidia GeForce RTX 4070 Ti graphics card on a wooden table with its retail packaging

(Image credit: Future)
Image 2 of 3

An Nvidia GeForce RTX 4070 Ti graphics card on a wooden table with its retail packaging

(Image credit: Future)
Image 3 of 3

An Nvidia GeForce RTX 4070 Ti graphics card on a wooden table with its retail packaging

(Image credit: Future)

Side by side with the RTX 4080, the RTX 4070 Ti takes up just as much space and will be just as challenging to wedge into all but the largest of tower cases. It's a triple-slot card, so a normal ATX motherboard will leave little room for anything else to slot in next to it.

Then there's the matter of its 16-pin power connector. Given the RTX 4070 Ti's lower power requirements, the card only needs three 8-pin connectors plugged into the included adapter (as opposed to four for the RTX 4090), but the adapter is still going to be clunky to deal with. Unless you have a new ATX 3.0 PSU that comes with a native 16-pin cable, your cable management skills are really going to be put to the test.

In terms of outputs, there's no DisplayPort 2.1, as I mentioned, but there's no USB Type-C either, something that would make a lot of sense on this card. The RTX 4070 Ti has very strong creative workload performance, so a lot of creatives on a budget will be tempted to give it a look, and the best USB-C monitors are very popular among that crowd. They'll have to use an adapter instead, and no one likes using those if they can help it.

The Asus card we received for testing also comes with a support bracket, and I imagine a lot of other manufacturers will include one as well. This card weighs a good bit, so the torque on the PCIe slot is not going to be kind to it (or any other RTX 4070 Ti) in the medium-to-long term; make sure you use a bracket if you aren't mounting the card upright.

Finally, as for Asus's TUF Gaming design, the open metal shroud exposes more of the heatsink while a triple-fan array keeps air moving through it. The cage-like shroud does look cool, and there's some RGB along the top of the card as well, if you're into that.

  • Design score: 3.5 / 5

Nvidia GeForce RTX 4070 Ti review: Performance

An Nvidia GeForce RTX 4070 Ti running on a Praxis test bench

(Image credit: Future)
  • 4K gaming on a 1440p GPU
  • Great "budget" graphics card for creative pros
  • Competent ray tracing at 4K, especially with DLSS

As we move on to the RTX 4070 Ti's performance, the long and short of it is what it's been for two generations now: Nvidia is better at most creative workloads and ray tracing, while AMD pulls ahead in rasterization, especially in gaming. It's worth noting, though, that the RX 7900 XT doesn't fall as far behind the RTX 4070 Ti in ray tracing as it would have a generation ago.

Image 1 of 6

Benchmark results for the RTX 4070 Ti

(Image credit: Future / Infogram)
Image 2 of 6

Benchmark results for the RTX 4070 Ti

(Image credit: Future / Infogram)
Image 3 of 6

Benchmark results for the RTX 4070 Ti

(Image credit: Future / Infogram)
Image 4 of 6

Benchmark results for the RTX 4070 Ti

(Image credit: Future / Infogram)
Image 5 of 6

Benchmark results for the RTX 4070 Ti

(Image credit: Future / Infogram)
Image 6 of 6

Benchmark results for the RTX 4070 Ti

(Image credit: Future / Infogram)

Nvidia's edge in ray tracing performance can be seen in our 3DMark Speedway and Port Royal tests, both of which are ray tracing-heavy benchmarks. The RTX 4070 Ti pulls out a fairly close win here, edging out the RX 7900 XT by a few hundred points in each.

Once we move on to Time Spy and Fire Strike, though, both at 1440p and 4K, AMD's rasterization advantage really shows, with the RX 7900 XT beating the RTX 4070 Ti by a few thousand points at times.

Image 1 of 5

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 2 of 5

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 3 of 5

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 4 of 5

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 5 of 5

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)

Whatever gloating the RX 7900 XT might be doing gets quickly shut down in creative tests, especially in our Blender Benchmark runs. Admittedly, this isn't a fair fight, since Blender's Cycles renderer is heavily optimized for Nvidia's CUDA hardware. Even the last-gen RTX 3070 Ti pushes AMD's RX 7900 XT aside in Blender Benchmark, and the RTX 4070 Ti fully laps its Team Red rival and then some.

This extends to Adobe Premiere Pro as well, where the RTX 4070 Ti outperforms the RX 7900 XT by about 17.5%. If it's any consolation for the RX 7900 XT, it edges out the RTX 4070 Ti in Photoshop, which is the living definition of a rasterization workload, so this shouldn't be surprising. Still, the RTX 4070 Ti only loses by about 1.5%, which is close enough to call it a wash.

The key takeaway for me from these creative benchmark results though is that the RTX 4070 Ti is quite adept at creative work normally reserved for graphics cards twice as expensive, so any creatives out there looking for a more "budget" GPU option for their workstation actually have one now.

Image 1 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 2 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 3 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 4 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 5 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 6 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 7 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 8 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 9 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 10 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 11 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 12 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 13 of 13

Benchmark results for the RTX 4070 Ti

(Image credit: Future / InfoGram)

In gaming performance, we again see the RTX 4070 Ti besting the RX 7900 XT when it comes to ray tracing while often slipping into second when ray tracing isn't a factor. Ultimately, though, the difference here isn't significant enough to pick one card over the other on this basis alone.

All games need rasterization performance while only a few games even implement ray tracing, so the RTX 4070 Ti's third-gen ray tracing cores are only really a factor when it comes to premium ray tracing experiences like Cyberpunk 2077 or Hitman 3, and even then, it's still at the point where you need to rely on DLSS 3 for a good frame rate.

It also needs to be said that, technically, the RTX 4070 Ti is a 1440p graphics card. It's not supposed to perform this well at 4K, as evidenced by the RTX 3070 Ti's rather pitiful showing in a number of 4K game benchmarks. Heck, the RTX 3070 Ti could barely get through the Hitman 3 Dubai benchmark at 4K without ray tracing, and it completely falls apart when you turn ray tracing on, to the point of crashing to the desktop.

The fact that the RTX 4070 Ti is competitive at 4K is a huge win here, especially given that the RX 7900 XT has a much bigger built-in hardware advantage at 4K, owing to 66% more VRAM and nearly 59% more memory bandwidth for pushing through 4K textures.
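Those percentages fall out of the two cards' published memory specs. The RX 7900 XT figures assumed here are its standard reference specs of 20GB of VRAM and 800GB/s of bandwidth, against the RTX 4070 Ti's 12GB and 504.2GB/s:

```python
# Sanity-checking the RX 7900 XT's on-paper memory advantage over the RTX 4070 Ti.
# Assumed reference specs: 20GB / 800GB/s (7900 XT) vs 12GB / 504.2GB/s (4070 Ti).
def pct_more(a: float, b: float) -> float:
    """How much larger a is than b, as a percentage."""
    return (a - b) / b * 100

print(round(pct_more(20, 12), 2))      # VRAM advantage -> 66.67
print(round(pct_more(800, 504.2), 2))  # bandwidth advantage -> 58.67
```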

AMD might have the edge in our gaming tests, but the raw numbers don't tell the whole story. In our formal benchmarks, we don't use DLSS or FSR to boost frame rates algorithmically, since driver and game updates can make or break these upscalers, and that happens too often for any numbers gathered with them enabled to remain valid after even a single update. It's more important to get a baseline figure that won't shift much over time when comparing hardware, and the unassisted rendering done by the card itself rarely, if ever, changes.

That said, hardly anyone plays games these days without some form of upscaling. Even if you've got a GTX 1060, you can still use FSR, and you undoubtedly will. With the RTX 4070 Ti, DLSS 2.0 is already phenomenal, but DLSS 3 with frame generation takes DLSS 2.0's performance gains and pretty much doubles them. In practice, with DLSS 3 you will almost always get more fps in-game than you will with the RX 7900 XT running FSR 2.2, so even though the RX 7900 XT barely pulls ahead in our gaming benchmarks, your actual experience of gaming on the RTX 4070 Ti probably won't reflect that at all.

Image 1 of 4

Benchmark results for the Nvidia RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 2 of 4

Benchmark results for the Nvidia RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 3 of 4

Benchmark results for the Nvidia RTX 4070 Ti

(Image credit: Future / InfoGram)
Image 4 of 4

Benchmark results for the Nvidia RTX 4070 Ti

(Image credit: Future / InfoGram)

In the end, the RTX 4070 Ti comes in very close behind the RX 7900 XT in overall performance, owing mostly to the RX 7900 XT's strong absolute performance in synthetic benchmarks. Normalize all of that mathematically, though, and the RTX 4070 Ti is going to be the better bet here even without DLSS 3. Throw that in on top of everything else, and the RTX 4070 Ti simply walks away with it in terms of performance against both its predecessor and its direct rival from AMD.

  • Performance score: 4.5 / 5

Should you buy the Nvidia GeForce RTX 4070 Ti?

Buy it if...

You want the cheapest next-gen card available
At MSRP, the RTX 4070 Ti is the cheapest next-gen GPU on the market right now.

You want the best bang for your buck
The RTX 4070 Ti has the best performance for price of any of the newest graphics cards from Nvidia and AMD, so your money will go farther with the 4070 Ti than with anything else from this generation.

You want next-gen features like DLSS 3
Nvidia's hardware is often on the bleeding edge of the industry, and software features like DLSS 3 and Nvidia Reflex are its not-so-secret force multipliers here.

Don't buy it if...

You plan on doing a lot of 8K gaming
With just 12GB VRAM and no DisplayPort 2.1, 8K gaming with modern AAA titles on the RTX 4070 Ti is going to be a challenge.

You're on a very tight budget
While the RTX 4070 Ti is the cheapest next-gen graphics card on the market right now, it won't be for long, as more affordable cards from both Nvidia and AMD are set to arrive in the coming months.

Nvidia GeForce RTX 4070 Ti review: Also consider

If our Nvidia GeForce RTX 4070 Ti review has you considering other options, here are two more graphics cards worth a look...

How I tested the Nvidia GeForce RTX 4070 Ti

An Nvidia GeForce RTX 4070 Ti graphics card on a wooden table with its retail packaging

(Image credit: Future)
  • I spent about 30 hours with the RTX 4070 Ti in total
  • Besides general benchmarking, I used the card in my everyday gaming and creative work
  • In addition to standard benchmarks, I played games for several days with a framerate monitor active and recorded the real-world average

When I test a graphics card, I start by making sure all tests are performed on the same test bench setup to isolate GPU performance. I then run it through a series of synthetic benchmarking tools like 3DMark, as well as in-game benchmarks in the most recent PC games I can access, like Cyberpunk 2077 and F1 2022. I run everything at the maximum settings possible without upscaling tech, and at the resolution a reader is most likely to pair with a given card. In the case of the RTX 4070 Ti, that means nothing less than 2,560 x 1,440 (1440p), with 3,840 x 2,160 (4K) wherever possible.

I also make sure to install the latest relevant drivers and rerun tests on any competing graphics cards I've already reviewed, like the RTX 4080 and the RX 7900 XT, so that I have current scores that account for any driver updates. All of these scores are recorded and compared against the card's predecessor, its most direct rival, and the cards directly above and below it in the product stack, where available. I then average these scores into a final overall score and divide that by the card's MSRP to see how much performance every dollar or pound spent actually gets you, and therefore how much value the card brings to the table.
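That value metric can be sketched in a few lines. To be clear, the scores and MSRPs below are made-up placeholders to show the shape of the calculation, not our actual test data:

```python
# Minimal sketch of the value metric described above: average the normalized
# benchmark scores, then divide by MSRP to get performance per dollar.
def value_score(normalized_scores: list[float], msrp_usd: float) -> float:
    """Average normalized performance per dollar spent."""
    avg = sum(normalized_scores) / len(normalized_scores)
    return avg / msrp_usd

# Hypothetical cards: a cheaper one that's a bit slower, a pricier one that's faster
card_a = value_score([1.00, 0.95, 1.05], msrp_usd=799)
card_b = value_score([1.10, 1.15, 1.20], msrp_usd=1199)
print(card_a > card_b)  # the cheaper card wins on performance-per-dollar here -> True
```

The point of normalizing first is that a 3DMark score in the tens of thousands would otherwise swamp a game benchmark measured in fps.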

Finally, I actually use the card in my own personal computer for several days, playing games, using apps like Adobe Photoshop or Adobe Illustrator, and watching for any anomalies, crashes, glitches, or visual disruptions that may occur during my time with the card. Having extensively covered and tested many graphics cards over the years, I know what a graphics card should do and how it should perform, and can readily identify when something is not performing up to expectations and when it exceeds them. 

Read more about how we test

First reviewed January 2023

Nvidia GeForce RTX 4090 review: the very best there is, without question
4:00 pm | October 11, 2022

Author: admin | Category: Computers Computing Computing Components Gadgets | Tags: , , , | Comments: Off

Editor's Note

• Original review date: October 2022
• Launch price: MSRP at $1,599 / £1,649 / AU$2,959
• Lowest price now: $2,459 / £1,999.99 / AU$2,999

Update – April 2025: The Nvidia RTX 4090 is currently the second most powerful 'consumer' graphics card on the market today, so it's definitely still worth buying after the release of the Nvidia GeForce RTX 5090.

Whether it's gaming or creative work, this is one of the best graphics cards you can get, capable of some of the fastest 4K framerates around with creative chops second only to the RTX 5090.

Its market price is way higher than its launch MSRP (often close to double, at least in the US), but if you're in the market for an RTX 4090, chances are money isn't as big a concern as it is further down the premium GPU stack, and this might be an excellent alternative to the RTX 5090, whose price right now is simply offensive. Your biggest problem will be finding this card at all, which is increasingly difficult as most retailers selling new cards are completely sold out and aren't expecting restocks, so you might have to look to the RTX 5090 or RTX 5080 instead.

Original unedited review follows...

Nvidia GeForce RTX 4090: two minute review

Well, the Nvidia GeForce RTX 4090 is finally here, and there's no question that it delivers on many of the lofty promises Nvidia made ahead of its launch, with stunning gen-on-gen performance gains that are more akin to a revolution than an advance.

That said, you won't find 4x performance increases here, and only in some instances will you see a 2x increase over the Nvidia GeForce RTX 3090, much less the Nvidia GeForce RTX 3090 Ti. A 50% to 70% increase in synthetic and gaming performance should be expected across the board, though, with very rare exceptions where the GPU runs too far ahead of the CPU.

On the creative side of things, this card was made to render, completely lapping the RTX 3090 in Blender Cycles performance, which makes this the best graphics card for creatives on the market, hands down.

On the gaming side, this is the first graphics card to deliver fully native 4K ray-traced gaming performance at a very playable framerate, without the need for DLSS, showing the maturity of Nvidia's third-generation ray tracing cores.

Even more incredible, Nvidia's new DLSS 3 shows even more promise, delivering substantially faster framerates over the already revolutionary DLSS 2.0. And while we did not test DLSS 3 as extensively as we did the RTX 4090's native hardware (for reasons we'll explain in a bit), from what we've seen, Nvidia's new tech is probably an even more important advance than anything having to do with the hardware.

On the downside, the card requires even more power than its predecessor, and when paired with something like the Intel Core i9-12900K, you're going to be pulling close to 700W between these two components alone. Worse still, this additional power draw demands some very strategic cable management, and for a lot of builders this is going to be a hard card to show off in a case with a bundle of PCIe cables in the way.

The price has also increased over its predecessor, though given its incredible performance and the price of the previous graphics card champ, the RTX 3090 Ti, the RTX 4090 offers far more performance for the price than any other card on the market other than the Nvidia GeForce RTX 3080 and Nvidia GeForce RTX 3080 Ti. So even though the Nvidia RTX 4090 is a very expensive card, what you're getting for the money makes it a very compelling value proposition if you can afford it.

In the end, the Nvidia GeForce RTX 4090 is definitely an enthusiast graphics card in terms of price and performance, since the level of power on offer here is really overkill for the vast majority of people who will even consider buying it. That said, if you are that enthusiast – or a creative or researcher who can actually demonstrate a need for this much power – there isn't much else to say but to buy this card.

It is more powerful than many of us ever thought it could be, and while I'd definitely argue that the Nvidia GeForce RTX 4070 Ti or AMD Radeon RX 7900 XTX is the better purchase for gamers given their price, the RTX 4090 was always going to be a card for the early adopters out there, as well as creatives who are out to spend the company's money, not their own, and the RTX 4090 will give you everything you could want in an enthusiast graphics card.

Nvidia GeForce RTX 4090: Price & availability

A man's hand holding an Nvidia RTX 4090 in an office

(Image credit: Future)
  • How much is it? MSRP listed at $1,599 (about £1,359, AU$2,300)
  • When is it out? It is available October 12, 2022.
  • Where can you get it? Available in the US, UK, and Australia.

The Nvidia GeForce RTX 4090 goes on sale worldwide on October 12, 2022, with an MSRP of $1,599 in the US (about £1,359/AU$2,300).

This is $100 more than the MSRP of the RTX 3090 when it was released in September 2020, but is also about $400 less than the MSRP of the RTX 3090 Ti, though the latter has come down considerably in price since the RTX 4090 was announced.

And while this is unquestionably expensive, this card is meant more as a creative professional's graphics card than one for the average consumer, occupying the prosumer gray area between the best gaming PC and a Pixar workstation.

Of course, third-party versions of the RTX 4090 are going to cost even more, and demand for this card is likely to drive up the price quite a bit at launch, but with the collapse of the crypto bubble, we don't think we'll see quite the run-up in prices that we saw with the last generation of graphics cards.

Finally, one thing to note: while this is an expensive graphics card, its performance is so far ahead of similarly priced cards that it offers a much better price-to-performance value than just about any other card out there, and it is far ahead of its immediate predecessors in this regard. Honestly, we don't really see this kind of price-to-performance ratio outside of the best cheap graphics cards, so this was definitely one of the biggest surprises to come out of our testing.

  • Value: 4 / 5

Nvidia GeForce RTX 4090: features & chipset

The 16-pin power connector on an Nvidia RTX 4090

(Image credit: Future)
  • 4nm GPU packs in nearly three times the transistors
  • Substantial increase in Tensor Cores
  • Third generation RT Cores
Nvidia GeForce RTX 4090 key specs

GPU: AD102
CUDA cores: 16,384
Tensor cores: 512
Ray tracing cores: 128
Power draw (TGP): 450W
Base clock: 2,235 MHz
Boost clock: 2,520 MHz
VRAM: 24GB GDDR6X
Bandwidth: 1,018 GB/s
Bus interface: PCIe 4.0 x16
Outputs: 1 x HDMI 2.1, 3 x DisplayPort 1.4a
Power connector: 1 x 16-pin

The Nvidia GeForce RTX 4090 features some major generational improvements on the hardware front, courtesy of the new Nvidia Lovelace architecture. For one, the AD102 GPU uses TSMC's 4nm node rather than the Samsung 8nm node used by the Nvidia Ampere GeForce cards.

The die size is 608mm², so a little bit smaller than the 628mm² die in the GA102 GPU in the RTX 3090, and thanks to the TSMC node, Nvidia was able to cram 76.3 billion transistors onto the AD102 die, a 169% increase in transistor count over the GA102's 28.3 billion.
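A quick back-of-the-envelope check of those figures, using the transistor counts and die sizes quoted above, also shows that transistor density nearly tripled between the two nodes:

```python
# Sanity-checking the generational jump: counts in billions, die area in mm².
def pct_increase(new: float, old: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

print(round(pct_increase(76.3, 28.3), 1))  # transistor count -> 169.6 (the ~169% quoted)
print(round(76.3e3 / 608, 1))              # AD102 density, millions/mm² -> 125.5
print(round(28.3e3 / 628, 1))              # GA102 density, millions/mm² -> 45.1
```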

The clock speeds have also seen a substantial jump, with the RTX 4090's base clock running at a speedy 2,235 MHz, compared to the RTX 3090's 1,395 MHz. Its boost clock gets a commensurate jump, up to 2,520 MHz from 1,695 MHz.

Its memory clock is also slightly faster at 1,325 MHz, up from 1,219 MHz, giving the RTX 4090 a faster effective memory speed of 21.2 Gbps versus the RTX 3090's 19.5 Gbps. This lets the RTX 4090 get more out of the same 24GB of GDDR6X VRAM as the RTX 3090.

When it comes to core counts, the RTX 4090 packs in 56% more streaming multiprocessors than the RTX 3090 (128 to 82), which translates into nearly 6,000 more CUDA cores than the RTX 3090 (16,384 to 10,496). It also means the RTX 4090 gets 46 additional ray tracing cores and an additional 184 Tensor cores – next-gen cores at that, so they are even better at ray tracing and vectorized computations than their predecessors.

This is immediately apparent when cranking up ray tracing to the max on games like Cyberpunk 2077, and especially when running DLSS 3, which makes the jump to full-frame rendering rather than just the pixel rendering done by earlier iterations of DLSS. 

  • Features & Chipset: 5 / 5

Nvidia GeForce RTX 4090: design

An Nvidia RTX 4090 sitting on its retail packaging on a wooden desk

(Image credit: Future)
  • Yeah, that 16-pin connector is a pain to work with
  • A little bit thicker, but a little shorter, than the RTX 3090

The Nvidia GeForce RTX 4090 Founders Edition looks very much like its predecessor, though there are some subtle and not-so-subtle differences. First off, this is definitely a heavier card, so don't be surprised that we need to start adding support brackets to our PC builds. A bracket might have been optional in the last generation, but it is absolutely a necessity with the Nvidia RTX 4090.

The Founders Edition does not come with one, but third-party cards will likely include them, and manufacturers are already starting to sell brackets separately, so we would definitely suggest you pick one up.

Otherwise, the dimensions of the RTX 4090 aren't that much different from those of the RTX 3090. It's a bit thicker than the RTX 3090, but it's a bit shorter as well, so if your case can fit an RTX 3090 FE, it will most likely fit an RTX 4090 FE.

The fans on either side of the card help pull air through the heatsink to cool off the GPU and these work reasonably well, considering the additional power being pulled into the GPU.

Speaking of power, the RTX 4090 introduces us to a new 16-pin connector that requires four 8-pin connectors plugged into an adapter to power the card. Considering the card's 450W TGP, this shouldn't be surprising, but actually working with this kind of adapter in your case is probably going to be a nightmare. We definitely suggest you look into the new PSUs coming onto the market that support this connector natively, without an adapter. If you're spending this much money on a new graphics card, you might as well go whole hog and make your life – and cable management – a bit easier.

  • Design: 4 / 5

Nvidia GeForce RTX 4090: performance

An Nvidia RTX 4090 plugged into a test bench

(Image credit: Future)
  • Unassisted native 4K ray-traced gaming is finally here
  • Creatives will love this card

So here we are, the section that really matters in this review. In the lead up to the Nvidia GeForce RTX 4090 announcement, we heard rumors of 2x performance increases, and those rumors were either not too far off or were actually on the mark, depending on the workload in question.

Image 1 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)
Image 2 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)
Image 3 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)
Image 4 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)
Image 5 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)
Image 6 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)
Image 7 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)
Image 8 of 8

A chart showing the relative performance of the Nvidia GeForce RTX 4090 against competing GPUs

(Image credit: Future / InfoGram)

Across our synthetic benchmarks, the Nvidia RTX 4090 produced eyebrow-raising results from the jump, especially in more modern and advanced tests like 3DMark Port Royal and Time Spy Extreme, occasionally fully lapping the RTX 3090 and running well ahead of the RTX 3090 Ti pretty much across the board.

Image 1 of 6

A chart showing the synthetic benchmark performance of the RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 2 of 6

A chart showing the synthetic benchmark performance of the RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 3 of 6

A chart showing the synthetic benchmark performance of the RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 4 of 6

A chart showing the synthetic benchmark performance of the RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 5 of 6

A chart showing the synthetic benchmark performance of the RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 6 of 6

A chart showing the synthetic benchmark performance of the RTX 4090 against competing cards

(Image credit: Future / InfoGram)

This trend continues into the GPU-heavy creative benchmarks, with the Nvidia RTX 4090's Blender performance being especially notable for more than doubling the RTX 3090 Ti's scores in two out of three tests, and blowing out any other competing 4K graphics card in Cycles rendering.

In Premiere Pro, the RTX 4090 scores noticeably higher than the RTX 3090 Ti, but the difference isn't nearly as dramatic, since PugetBench for Premiere Pro measures full system performance rather than isolating the GPU. Adobe Photoshop, meanwhile, is a heavily rasterized workload, which is something AMD has had an advantage in over the past couple of generations, and that shows pretty clearly in our tests.

Image 1 of 5

A chart showing creative benchmark performance of the Nvidia RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 2 of 5

A chart showing creative benchmark performance of the Nvidia RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 3 of 5

A chart showing creative benchmark performance of the Nvidia RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 4 of 5

A chart showing creative benchmark performance of the Nvidia RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 5 of 5

A chart showing creative benchmark performance of the Nvidia RTX 4090 against competing cards

(Image credit: Future / InfoGram)

Gaming is obviously going to see some of the biggest jumps in performance with the RTX 4090, and our tests bear that out. Most gaming benchmarks show roughly 90% to 100% improved framerates with the RTX 4090 over the RTX 3090, and roughly 55% to 75% better performance than the Nvidia RTX 3090 Ti.

Image 1 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 2 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 3 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 4 of 13

A chart showing the gaming performance for the Nvidia RTX 4090 against competing cards

(Image credit: Future / InfoGram)
Image 5 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 6 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 7 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 8 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 9 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 10 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 11 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 12 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)
Image 13 of 13

A chart showing the gaming performance of the Nvidia RTX 4090 against competing graphics cards

(Image credit: Future / InfoGram)

These numbers are likely to jump even higher when you factor in DLSS 3. DLSS 3 isn't available in any commercially available games yet, but we were able to test it in a couple of special builds of games that will be available shortly after the RTX 4090's release. A few of these had in-game benchmarks we could use to measure DLSS 3's performance with Nvidia's FrameView tool, and the results showed two to three times better performance in some games than we got using the current Steam builds with DLSS 2.0.

Since we were using special builds and Nvidia-provided tools, we can't declare these results representative until we can verify them with independent benchmarks, but just eyeballing the benchmark demos themselves, there's an obvious improvement in frame rates with DLSS 3 over DLSS 2.0.
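The "two to three times" figure comes from comparing average frame rates between the two runs. As a rough sketch of that arithmetic (the frametime numbers below are purely illustrative, not our measured results, and the helper names are our own, not part of any Nvidia tool):

```python
# Hypothetical sketch: computing average FPS and a performance multiplier
# from per-frame frametime logs, the kind of data a capture tool such as
# Nvidia FrameView exports as CSV.

def avg_fps(frametimes_ms):
    """Average FPS over a run: total frames divided by total seconds."""
    total_seconds = sum(frametimes_ms) / 1000.0
    return len(frametimes_ms) / total_seconds

def uplift(baseline_fps, new_fps):
    """Multiplier of the new result over the baseline (2.0 = twice the FPS)."""
    return new_fps / baseline_fps

# Illustrative runs: ~60 fps (16.7 ms/frame) vs ~120 fps (8.3 ms/frame).
dlss2_run = [16.7] * 600
dlss3_run = [8.3] * 600
print(round(uplift(avg_fps(dlss2_run), avg_fps(dlss3_run)), 2))  # ~2x uplift
```

In practice real frametime logs vary frame to frame, which is why average FPS is computed from the summed frametimes rather than by averaging instantaneous FPS values.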

Whether that two-to-three-times performance gain will hold up after its official release remains to be seen, but as much as DLSS 2.0 revolutionized the performance of the best PC games, DLSS 3 looks to be just as game-changing once it gets picked up by developers across the PC gaming scene. Needless to say, AMD needs to step up its upscaling game if it ever hopes to compete at the high-end 4K level.

Now, there is a real question of whether most gamers will ever need anything close to this kind of performance, and there is such a thing as diminishing returns. Some might find native 4K ray tracing neat but somewhat redundant, since DLSS can get you roughly the same experience with an RTX 3090 or even an RTX 3080 Ti, but that's a judgment individual consumers will have to make.

Personally, I think this card is at least approaching the point of overkill, but there's no doubt that it overkills those frame rates like no other.

  • Performance: 5 / 5

Should you buy an Nvidia GeForce RTX 4090?

An Nvidia RTX 4090

(Image credit: Future)

Buy the Nvidia GeForce RTX 4090 if…

You want the best graphics card on the market
There really is no competition here. This is the best there is, plain and simple.

You want native 4K ray-traced gaming
DLSS and other upscaling tech are fantastic, but if you want native 4K ray-traced gaming, this is the only card that can consistently deliver it.

You are a 3D graphics professional
If you work with major 3D rendering tools like Maya, Blender, and the like, then this graphics card will dramatically speed up your workflows.

Don’t buy the Nvidia GeForce RTX 4090 if…

You're not looking to do native, max-4K gaming
Unless you're looking to game on the bleeding edge of graphical performance, you probably don't need this card.

You're on a budget
This is a very premium graphics card by any measure.

Also consider

Nvidia GeForce RTX 3090
The RTX 3090 isn't nearly as powerful as the RTX 4090, but it is still an amazing graphics card for gaming and creative professionals, and it's likely to be heavily discounted right now.

Read the full Nvidia GeForce RTX 3090 review

Nvidia GeForce RTX 3080
The RTX 3080 is a far cry from the RTX 4090, no doubt, but the RTX 3080 currently has the best price to performance proposition of any 4K card on the market. If you're looking for the best value, the RTX 3080 is the clear winner here.

Read the full Nvidia GeForce RTX 3080 review

AMD Radeon RX 6950 XT
In another universe, AMD would have led the Big Navi launch with the RX 6950 XT. It is a compelling gaming graphics card, offering excellent 4K gaming performance on par with the RTX 3090 while generally coming in at the same price as the RTX 3080 Ti.

Read the full AMD Radeon RX 6950 XT review

First reviewed in October 2022

Nvidia GeForce RTX 4090 Report Card
