I bought the AMD Radeon RX 9070 XT graphics card the day after launch, and it seems like an amazing card - I’ve had some brilliant gaming results with it, and it also runs fairly cool. However, there are some issues, such as coil whine problems and a lack of FSR 4 support in games. So in this mega review, I wanted to dive into everything you need to know about the card.

I purchased the Sapphire Pulse RX 9070 XT model; however, 99% of the review will apply to all other 9070 XT models too.

I cover 4K and 1440p gaming in games like Cyberpunk 2077, Hogwarts Legacy, Metro Exodus, Assassin’s Creed Mirage, Guardians of the Galaxy and Red Dead Redemption 2.

I also use my thermal camera to show off exactly where the hotspots on the 9070 XT GPU are, and I also tested with both air and liquid coolers (a Noctua NH-D15 for air, and a Corsair RX Titan AIO cooler for liquid). I also use smoke to analyze how the air flow works in my case (Lian Li Lancool II RGB) with this Radeon graphics card.

The three most recent PCIe generations (Gen 3, Gen 4 and Gen 5) are also covered so you can decide whether to keep your Gen 3/4 motherboard, or whether you need to upgrade to Gen 5.

Finally, I also looked at FSR 4 in Smite 2, I analyzed x4 vs x8 vs x16 PCIe performance - and I discuss the coil whine problems that are currently affecting my card… along with how you can potentially mitigate them.

(Link to direct YouTube video)

If you prefer text over video, please read on for the guide/transcript version of this video.

Video Transcript And Guide

Hey everyone, I recently bought the 9070 XT graphics card, and I’m really excited about it because it seems like a really good card. I think AMD has knocked it out of the park. In this video, I wanted to go into a lot of detail and discuss everything you might need to know about the card, including the fact that, right now, there is a fairly significant coil whine issue with this particular card. But having done loads of testing, I think this issue will go away in time. I’ll talk about that later in this video.

Obviously, I’ll be doing lots of gaming benchmarks too, but I also want to dive into things like the thermals. So I used my thermal camera and a bunch of other stuff to actually show exactly where the hotspots are on the card. I’ve also tested with a Noctua air cooler and a Corsair all-in-one liquid cooler, so I can discuss the best cooling methods open to you. I also tested with both a Gen 4 and Gen 5 motherboard, and I also put them in Gen 3 mode as well. This way, I can give you all the data you need to actually know whether you can stick with your Gen 4 motherboard or whether upgrading to Gen 5 is necessary to make use of the fact that this is a PCI Express Gen 5 card. Of course, I’ll give you all of that information so you can make that decision.

I also discuss a lot more than that, but there are chapters in this video, so you can jump to the bit that actually interests you the most.

So, this is the Sapphire Pulse, which is Sapphire’s entry-level model. You’ve then got the Pure and the Nitro+ above it. But actually, Sapphire is a very good graphics card manufacturer for AMD, so the performance you get with this Pulse graphics card will be comparable to some of the other models. Of course, pretty much everything I talk about in this review is applicable to the 9070 XT because that’s the actual GPU die or whatever. So, this is not just a review of the Sapphire Pulse.

The box itself is fairly simple and plain. When you open it up, you’ve got the graphics card in an anti-static bag, of course, and then the accessories box. In there, you’ve got an anti-sag bracket, which is very important to use because this is quite a heavy card. If you don’t use any form of anti-sag bracket, the card could droop down, either damaging the card’s PCB or the PCI Express slot on your motherboard. So, it is worth using this anti-sag bracket.

But then, as I said, there’s nothing else in the box. You don’t need any fancy adapter with 12VHPWR cables for this particular model because it just has standard 8-pin power. I’m not actually a massive fan of how the 8-pin connectors are placed on this particular card because they’re recessed in the back. Often, you need to run the cables in from the back for the best cable management potential. You don’t have anything like the Nitro+ that has that fancy backplate to hide the 12VHPWR power cable you run in. That’s one particular thing I didn’t love, but it’s not a massive deal in the grand scheme of things.

As I said, the card in general is brilliant. I’ll go onto benchmarks in a minute, but the actual 9070 XT is part of AMD’s new RDNA 4 architecture, which is a brand-new architecture. It also brings FSR 4 upscaling, which I’ll talk about later in this video.

But without further ado, I wanted to jump into the gaming benchmarks. You can see the specs that I’m using to test this card on the screen now. In fact, I changed my motherboard as part of this. Originally, I had a B650 motherboard with a Gen 4 PCI Express slot and upgraded that to a like-for-like X870 motherboard, also from the ASRock Pro RS line, which gives me the PCI Express Gen 5 slot. Later on, as I said, I’ll do those comparisons to actually show whether you need to upgrade or not.

Let’s just get on with the gaming benchmarks. I wanted to start out with Cyberpunk because that’s always a great game to try things out on. It’s a really good-looking game still. In native 4K, with no upscaling or anything like that, I used the high preset but no ray tracing for this particular example. It looks and feels really good—not my gameplay necessarily, but the actual game itself.

You can see there is around 70 FPS. The graphics card is using 300 watts, with good, stable 1% lows. The GPU is being worked much harder than the CPU. I’ve only got a mid-range Ryzen 7600 CPU, but in this case, especially at 4K, the graphics card is the bottleneck. It’s a really good showing here.

What happens if we try and turn on FSR for upscaling? Will that lead to better results? We were seeing 70 FPS before, and with this on, we’re now seeing around 120 FPS—so almost double, it would seem. However, I did put frame generation on as well. Unfortunately, the game looks pretty bad. I’m not sure if it comes through because of YouTube’s compression, but the game looks and feels absolutely terrible with FSR 3, unfortunately.

But what about if we do ray tracing on Ultra? Will that make things better?

What I’ve done here is actually put things at 50% speed, and everything just looks bad again. The steps look awful, the windows don’t really look like windows, and everything is just a little bit blurry—almost like some sort of dream world. You can see the ray tracing looks quite good there, but everything else is pretty bad, to be honest. This is due to FSR 3, unfortunately.

Now, this isn’t me hating on AMD or FSR 3—well, I am hating on FSR 3 quite a bit actually, as Hardware Unboxed and other people have shown. But I’m not trying to hate on AMD in general. It’s just that Cyberpunk’s implementation of FSR 3 is really, really bad, so it is worth knowing that. Luckily, these cards do support FSR 4, but right now, Cyberpunk doesn’t support it—unless you want to try out community mods and things like that.

So right now, your options are to try and use FSR 3 or Intel XeSS on Cyberpunk, or just go with native rendering. What I’m going to try doing is using ray tracing on low and seeing whether I can run that natively with no FSR and no frame generation, and see if that works.

I restarted the game, so this is 4K native with ray tracing on low—some level of path tracing and ray tracing. It looks pretty damn good, in my opinion. We are seeing average FPS rates of around 60 FPS, which I think is actually amazing. That’s a really, really good result with this particular AMD graphics card because previously, you wouldn’t have had these sorts of numbers.

I’m actually really impressed with this, and the 1% lows as well are pretty good—around 40 FPS. Obviously, many people wouldn’t want to game at 40 or 50 FPS, but this actually looks and feels fairly stable. I don’t get any stutters, and I think it looks brilliant.

RT Low is actually still very, very good in this game. But what if you dial it down to 1440p? Ray tracing low, a little bit of ray tracing and path tracing on, and then no FSR and no frame generation—you get 90 FPS, which is really, really good. It’s a really impressive result, and actually, it all looks stunning.

Again, this feels really smooth and everything else. The fact that this card can get these sorts of frame rate levels with 1440p native is actually really, really good in my opinion—especially since the city area tends to stress things out quite a lot.


In this final comparison, I wanted to look at the RTX 5080, which I’ve also got, and compare it to this particular card because I am hoping to be able to get rid of my RTX 5080, to be honest, based on how good the AMD card is.

If we look at the average FPS rates, it is lower on the AMD card—looking at 60 to 70 compared to 80 or 90 on the Nvidia card. But actually, I still think that’s pretty damn good.

Now, obviously, the 9070 XT isn’t designed to be a competitor to the RTX 5080, but I wanted to give that final example just so you can have some context because I’ve got both graphics cards. It’s quite interesting that really, I think there’s only about a 20% performance difference without ray tracing because this was tested on high with ray tracing disabled.

Actually, I think a 20% performance difference is pretty damn good for AMD because there’s not a 20% price difference between the two cards. Often, you can pay almost double for the 5080 compared to a 9070 XT, so I think AMD is offering really good value with this card.


Next, I wanted to look at Hogwarts Legacy.

So, I launched the game and waited for shaders to compile—which takes ages. But I did this at 4K with FSR 2 enabled, so 1440p internally, and everything maxed out. Ultra graphical settings, ray tracing on, with no frame generation.

Actually, it looks really good, considering it’s FSR 2. And the FPS rates are really good too—above 130, 140 FPS, with good 1% lows as well—around 80 to 90 FPS.

The graphics card is using around 300 watts, and it’s doing a really good job. So, I wanted to see what would happen now if we just did native rendering. Would that change anything?

Yeah—same settings, everything maxed out, and again, we see 90 to 100 FPS, which is really good. This is 4K native, so really, my GPU is being maxed out. My CPU is not being used all that much, and again, it’s doing a really good job.

Now, the graphics card is using almost 14 gigs of VRAM, so I’m really glad that AMD is offering 16 gigs on this particular card.

Then, if we look at 1440p native and scale it down, what we actually see is the card starts—not struggling so much—but actually, GPU usage is now often below 90%, and the CPU becomes a little bit more of a bottleneck.
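As a rough rule of thumb, by the way, this is how I read those utilisation figures in the overlay. It’s my own heuristic rather than any official metric, so treat the thresholds as illustrative:

```python
# Rough heuristic for reading GPU utilisation in a monitoring overlay
# (my own rule of thumb, not an official metric - thresholds are
# illustrative).
def likely_bottleneck(gpu_util_pct: float) -> str:
    if gpu_util_pct >= 95:
        return "GPU-bound: lower settings/resolution or upgrade the GPU"
    if gpu_util_pct < 90:
        return "likely CPU-bound: a faster CPU would lift frame rates"
    return "borderline: could be either, or a frame-rate cap"

print(likely_bottleneck(99))  # 4K native in this game
print(likely_bottleneck(85))  # 1440p native here
```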

Still, we’re getting very, very good frame rates—over 100 FPS. Everything looks really, really good here, and actually, the board is now drawing less than 250 watts of power, which is quite interesting. It’s using less VRAM too.

This is a really good experience if you wanted to game at 1440p native. That’s very good.

So yeah, this card does really well in Hogwarts Legacy, even though it’s got FSR 2. That’s certainly better than Cyberpunk’s FSR 3 implementation. But obviously, you can play natively at 4K or 1440p with really good frame rates too, so yeah—massive win.


Next, I wanted to check out Metro Exodus.

Moving on to Metro Exodus—again at 4K native, which I like gaming at. We’re seeing around 60 FPS with good 1% lows, 12 gigs of VRAM usage, and the graphics card is pulling around 300 watts.

In this case, GPU utilization is almost always at 100%, while CPU usage is mainly below 40%—though it did spike up a bit. But in general, this is a good experience. If you are happy to game at 60 FPS, it’s pretty good.

This is with everything maxed out—Ultra settings and Ray Tracing Ultra as well.

If we now drop that down to high, with RT set to high as well, immediately, the FPS rate goes up. Instead of getting around 60 FPS, we’re getting 70 to 80 FPS, depending on where you are in the game. 1% lows are above 60 FPS, so another really good experience.

This is a solid FPS rate with no frame generation or anything like that, considering it’s native 4K rendering.

If we look at 1440p and stick with high as opposed to ultra, what we see here is really, really good frame rates of 150-160+ FPS, with 1% lows often above 120 FPS as well.

The graphics card continues to pull around 300 watts here, and again, the GPU is the main thing being used, though the CPU is being hit a fair amount too, with 50-60% usage.

So, using something like a 9800X3D would obviously increase frame rates even more. But again, you’re looking at 150+ FPS with pretty good graphical image quality.

Actually, I think this is a really good experience.

So, I’ll go back to gaming stuff in just a minute, but I wanted to talk about thermals. I did a lot of testing here because people often say that, you know, AMD graphics cards run hotter than NVIDIA ones. So I ran FurMark, I used my thermal camera, I tested with a Noctua air cooler and a Corsair all-in-one liquid cooler just to give you as much data as possible, and I also monitored the exhaust temps just so we can see how those work. But let’s dive into the results.

So the first thing I did was run FurMark for 20 minutes ‘cause that really stresses out the GPU, and then I just kept an eye on the thermals and everything else. At first, I tested with my Noctua air cooler, and you can see that there’s not much of a gap between the GPU and the air cooler, but that shouldn’t affect performance that much, although you do see the CPU temps are averaging in the 70s, sometimes peaking at 80. So you’re not going to get thermal throttling, but it is decently high.

Then if we look at the GPU, we get a GPU temperature of 55°C and a memory temperature of 90°C, and if you compare that to the RTX 5080, the board temperatures are often 20°C lower, which is really good. Obviously the memory temperature is higher, but that’s actually still well within range.

But next I wanted to move to an AIO cooler just to see whether that would affect things. I didn’t think it would affect the GPU temps, but obviously it would have more of an impact on the CPU temps. That’s me just fitting it there, and that’s the final result, with the radiator at the top in this particular case so I’ve still got those case fans at the front bringing air in. And you can see again I’m just running the furry donut there to actually stress out my GPU.

And if we look at the final temperatures now, you know, it’s similar: GPU temperature in the 50s, memory temperature of 89-90, hotspot of 80. So all really good actually. But my CPU temperatures are a lot more under control. There was nothing wrong with the air cooler temperatures for the CPU; they weren’t going to thermal throttle or anything, but obviously the AIO cooler has kept things under control a lot more.

When it comes to the actual exhaust temperatures, which is ultimately the thing that’s going to heat up your room and everything else, what I actually did was get a little temperature probe—you can see there—and put it into the rear exhaust fan. Then I tested with both my Noctua air cooler and obviously the all-in-one cooler as well, and I saw what the actual maximum temperature was.

So even though this is a clamp meter, it can actually detect temperature too, and with the all-in-one cooler, the maximum exhaust temperature I saw was 33°C (91°F). With the air cooler, it was slightly higher; I think things peaked around 35°C (95°F). So not too much difference, only a couple of degrees really.

And next I wanted to get my thermal camera and just check things out there. So with my Noctua air cooler, you can see the actual heat fins of the CPU cooler aren’t very hot—they’re below 30°C—and it’s the graphics card itself, you know, pictured by that little red mark, that’s actually above 40°C.

And if we dive into things a bit more, if you look at my exhaust fans with my Noctua air cooler, I had two there, and it was only the one on the left that was actually getting most of the heat, to be honest. The one on the right wasn’t doing that much.

If we then go back to looking at the graphics card, the actual backplate was hitting around 50-odd degrees. It was fairly hot to the touch, but that is normal and in keeping with what we’ve seen in HWiNFO.

If we then look at the right-hand side of things, the actual motherboard itself is below 40°C, and it’s mainly the graphics card on the right-hand side that is the warmest bit, above 45°C. But again, that is normal. You’d expect that for graphics cards when gaming.

If I then do the same sort of tests again with my all-in-one cooler, the graphics card is going to be the hottest thing. You know, there’s not actually that much heat at all on the motherboard or the CPU there. So the pump is doing its job, and because we haven’t got that massive air cooler in the way, we can see more of where the GPU die is, and actually that’s where most of the heat is, you know, 60 to 70°C. That’s the main hot spot.

And if we dive into things a little bit more from the underside, or the side view, you can see it’s going up to around 75 to 80°C on the maximum reading, which is in line with what we were seeing in HWiNFO. And that’s obviously because, if we look at this static image, that’s where the GPU die is. That’s where the hot spot was, and that’s completely normal.

I’ve pictured the area here. You would fully expect that area to be the hottest, and that’s why you’ve obviously got that leftmost fan, which is obviously going to be bringing in a bunch of air to actually cool it. And that’s why, personally, I like actually having some intake fans under my graphics card. It’s not essential, but obviously by having those intake fans, it’s going to bring in more cool air and just help to cool things a bit more.

And I wanted to highlight this by literally playing with fire—so don’t try this at home—by buying some smoke matches. What you can see once I’ve lit one is that it gives off a load of smoke, and you can see the air flow a little bit. So the intake fans are actually bringing some cool air in. The graphics card is then going to use that to cool itself, and that air mainly gets exhausted out of the rear exhaust fan.

You can see it there, and then obviously in the leftmost fan, it’s actually bringing that into the graphics card, and then that’s actually being vented out of the side of the graphics card. The more intake fans you have, the better really, because that’s going to help cool a graphics card that pulls in 300 watts, for example.

But what is nice about this graphics card design, and it’s a fairly conventional one, is that the left-hand fan is going to bring in cool air, it’s going to cool the GPU die, and then it exhausts out of the side that you can see on the screen now. That’s quite a nice design, and I think that’s why we were seeing the GPU temperatures sit in the 50s, compared to the RTX 5080, where obviously things are a lot warmer, going up to 70-odd degrees on average. So the 9070 XT cooling design is quite nice, in my opinion.

And again, going back to the point of having as much air flow as possible, you can see how the air flow is actually working for my three intake fans. The bottom one’s going to be drawing air into the graphics card, the middle one as well, and the top one really isn’t going to be doing all that much for the graphics card. It’s going to help cool the motherboard and the VRM, and the CPU if you had an air cooler, but that’s obviously less important if you’ve got a pump.

To be honest, I think the main thing if you’ve got an all-in-one cooler is to just make sure you’ve got at least one exhaust fan. But you can see the air flow there, and things are actually looking quite effective. My system’s staying fairly cool, so I’m quite happy with the cooling performance.

But next, let’s take a look at FSR 4. FSR is AMD’s upscaling technology, and I’m not going to talk about exactly what that is in loads of detail, but let’s say I’m playing a game at 4K with everything maxed out and, you know, it’s really, really struggling. What you can actually do is turn on upscaling, and internally the game will be rendered at 1440p, for example, and then FSR, in this particular case, will upscale it back to 4K.
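To put some rough numbers on that, here’s a little sketch of the internal render resolution each FSR quality preset implies. The per-axis scale factors below are the standard FSR preset ratios as I understand them, so treat this as illustrative rather than official:

```python
# Internal render resolution for each FSR quality preset at a given
# output resolution. Scale factors are the standard per-axis ratios
# the FSR presets use (illustrative - check the game's own settings).
FSR_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    s = FSR_SCALE[preset]
    return round(out_w / s), round(out_h / s)

for preset in FSR_SCALE:
    w, h = internal_resolution(3840, 2160, preset)
    print(f"4K output, {preset}: rendered internally at {w}x{h}")

# 4K output, Quality: rendered internally at 2560x1440
```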

But unfortunately, FSR 3 wasn’t exactly very good, and that’s where FSR 4 comes in with the 9000 Series launch from AMD, because AMD really needed to make some graphical gains here with FSR 4. And they have done that from what I can tell, but actually the unfortunate thing is not many games support FSR 4.

There are various lists online, and often they have like 20 to 30 games that apparently support FSR 4. But what I actually found in my experience is that some of them don’t. I went through my existing game library, and none of those games supported FSR 4. So then I downloaded Marvel Rivals ‘cause it’s free and I’m cheap, and apparently that supports FSR 4. To be honest, it doesn’t seem to. I looked as much as I could: I looked in the AMD software, and I went into the game itself, and there’s no option to, you know, override to FSR 4 support.

And actually, if you look at many of the FSR 4 lists, what they say is “current or upcoming games.” So I’m assuming Marvel Rivals is one of those games that supports DLSS 4, NVIDIA’s latest, but doesn’t yet support AMD’s FSR 4, which is a big pity.

And during the process of finding this out, I accidentally got sucked into an online game in Marvel Rivals, and I didn’t have a clue what I was doing. I was playing with real people and kept cocking up, so sorry if you were playing with me. But yeah, it’s actually quite a fun game—I just kept smashing things. I didn’t have a clue what I was doing. But after dying one too many times, I got bored and I downloaded Smite 2 instead, and that does have an FSR 4 option within the AMD software.

Now, in time, some games will actually be updated to have the FSR 4 toggle within the game options itself, within the settings menu. But in most other cases, you’re going to use the AMD software and actually override things. So if a game supports FSR 3.1 and AMD whitelists it, what you can do is come along and enable FSR 4 within the AMD software, as you can see here. Then once you launch the game, it says FSR 4 is available but not necessarily enabled, ‘cause you’ve then got to go into the game and make sure FSR is turned on—quality or performance or whatever else. Then if you press Alt+Z, the overlay will show that FSR 4 is active, and you can play the game.

And actually, yeah, this was a really fun game as well. I just kept spamming things, number keys and everything else. And yeah, good game. But this is FSR 4, and, you know, graphically this isn’t the most demanding of games, but FSR 4 looked and felt really good. I had no issues or bugs with it. When I compared it to FSR 3, I couldn’t personally notice that much difference, though the game isn’t the most graphically challenging. But certainly there were no regressions with FSR 4, and obviously my frame rates were quite good too.

And actually I had Frame Gen on, and for this sort of game where it’s online and multiplayer and fast action, you know, Frame Gen can be a bit debatable. But actually I had a good experience with FSR 4 and Frame Gen, so that is pretty good.

But what’s not good is this segue to our sponsor—kidding, kidding! No, what’s not good is the coil whine on this particular card, which unfortunately is a real issue.

So I put a microphone onto the back of my case, as you can see here, and then I recorded coil whine in a range of situations. Some games didn’t seem to have any at all, but games like Guardians of the Galaxy had it quite a lot. You can hear exactly the sound now, and it’s not exactly good, to be honest.

I did find when playing the game, sometimes the coil whine got better. But often, you know, it wasn’t great, to be honest. However, the fact that it happens more in menus than when playing the game could mean that setting an FPS cap could actually help to reduce or even eliminate the coil whine.

But as a result of that, what I wanted to do as well is play around with power limiting within the AMD app to see whether that would reduce the coil whine too. I also reduced the fan speed a little just in case, and then obviously applied the power limit like I mentioned. So I put a minus 10% power limit on in the first instance and ran the Guardians of the Galaxy benchmark. And I’ve ramped up the audio a little bit just so you can hear the coil whine.

You might need to be wearing headphones, but certainly at least for me with headphones on, it is quite audible now. But if you notice there, the coil whine stopped when it went from one scene to another. Same thing there—the coil whine sort of comes and goes, which is a bit weird really, but that can happen with coil whine.

But next I wanted to try out a minus 30% power limit and see whether that would change anything. And it certainly reduced it; it’s not as audible in my opinion, and now you can probably hear the fans more than the coil whine, whereas with a minus 10% power limit, I think you could hear the coil whine more than the fans. So it’s still annoying, but it’s an improvement, I guess.

So that’s obviously a very quick test, but I did notice a reduction in the coil whine. It wasn’t completely eliminated, but by just whacking a minus 20% power limit on it, I actually saw, you know, not much change in the FPS rate. I think it was a 7 FPS drop on average, and it obviously reduced the coil whine too.
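Just as a back-of-the-envelope check on that trade-off, here’s the maths. The wattage and FPS drop are from my testing above, but note the baseline FPS is an assumed round number, purely for illustration:

```python
# Rough trade-off from the -20% power limit test. board_power and
# fps_drop come from my testing; baseline_fps is an assumption
# (a round illustrative number, not a measured figure).
board_power = 300   # watts the card pulls under load (from HWiNFO)
limit = 0.20        # the -20% power limit set in the AMD software
fps_drop = 7        # average FPS lost in Guardians of the Galaxy
baseline_fps = 100  # assumed illustrative baseline

watts_saved = board_power * limit
print(f"~{watts_saved:.0f} W less heat and coil stress "
      f"for a ~{100 * fps_drop / baseline_fps:.0f}% FPS hit")
# ~60 W less heat and coil stress for a ~7% FPS hit
```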

And my hunch is that coil whine is often something you see more with brand new products, and then it gets a bit better over time. So I am hopeful that actually, you know, through playing with power limiting and maybe putting an FPS cap on and just waiting for a few weeks, I’m hoping that the coil whine situation will get a little bit better.

Next up, I wanted to talk about motherboards and more specifically PCI Express generations. I ran through quite a lot of tests here, running 3D Mark, Assassin’s Creed Mirage, Cyberpunk 2077, Guardians of the Galaxy, and finally Red Dead Redemption 2. And I may as well jump directly into the results.

So I tested everything at Gen 4 x16 via my B650 motherboard, and then I went into the BIOS and actually overrode things to Gen 3 and repeated the same tests. Then I tested everything at Gen 5 with my new X870 motherboard, and here are the average FPS scores.

There’s not much point in me saying much more really, because overall the average FPS rate is very similar. I’ll call out a few separate results in a minute, but other than Guardians of the Galaxy, you’re not going to see much difference, if any, between, you know, a Gen 3 motherboard and a Gen 5 motherboard, which is good to know.

The one with Guardians of the Galaxy—well, I think this was more of an outlier because in all my other tests, things were very, very similar. So I don’t know why there was such a big difference, but certainly this was one game where, yeah, Gen 5 seemed to do a little bit better.

If you’ve got a Gen 4 motherboard, for example, you don’t need to upgrade to Gen 5 unless that Gen 5 board has, you know, other features that you want. Gen 4 x16 will work perfectly fine, and Gen 3 x16 will too.

What doesn’t work as well is if you start going down to like x8 or x4 bandwidth levels. So this took quite a long time to test, to be honest. It was also kind of fun. So what I did in the BIOS was I went in and I actually changed the main x16 slot to like x8 or x4 speeds via the BIOS bifurcation option, essentially.

And then I reran all my tests, you know, Assassin’s Creed there at Gen 3 x4 and then Gen 3 x8.

I’ll dive into the results, then call some specific ones out in a minute. But at x8 bandwidth levels, there’s no real difference apart from Guardians of the Galaxy—that always seems to vary. But with everything else, it’s pretty similar.

But when we come to x4, we do start to see more performance drop-offs. It’s hard to show with average FPS, but I’ll show more specific results in a minute. But in general, we did start to see a little bit of drop-off with Gen 3 and Gen 4 at x4.

And as we can see here, at Gen 3 x4, the GPU is being starved of that bandwidth, and then that is causing the CPU to wait longer, and that’s affecting the minimum FPS rates. It’s a little bit better with Gen 4 x4, as you can see the minimum FPS rate is higher, but the CPU is still waiting. So this isn’t really ideal.

So at x4, you’re going to have some issues unless you use Gen 5, which had no issues at all. So the Gen 5 x4 results were comparable to the Gen 5 x8 and x16 results.
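And that actually makes sense if you look at the theoretical link bandwidth, because Gen 5 x4 has the same raw bandwidth as Gen 3 x16. Here’s a quick sketch using the standard PCIe per-lane data rates (ignoring protocol overhead):

```python
# Theoretical PCIe link bandwidth: per-lane data rate after 128b/130b
# encoding, ignoring protocol overhead. These are the standard
# published figures, just computed here for comparison.
GT_PER_SEC = {"Gen3": 8, "Gen4": 16, "Gen5": 32}  # giga-transfers/s/lane

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    # one bit per transfer per lane, 128b/130b encoding, 8 bits per byte
    return GT_PER_SEC[gen] * (128 / 130) / 8 * lanes

for gen in GT_PER_SEC:
    for lanes in (4, 8, 16):
        print(f"{gen} x{lanes:>2}: ~{link_bandwidth_gbs(gen, lanes):5.1f} GB/s")

# Gen3 x 4: ~ 3.9 GB/s  <- where the drop-offs showed up
# Gen5 x 4: ~15.8 GB/s  <- same as Gen3 x16, hence no issues
```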

I appreciate that that’s a lot of data to get through, but that was the purpose of this particular review: to give you loads of information. And with YouTube chapters, you can skip to the bits you want, so that, you know, if you’re debating whether to go with an air cooler or a liquid cooler, you can actually decide that. Or if you’re wondering what motherboard to go for, you can hopefully decide that too with this review.

But I think overall it’s fair to say that AMD have knocked it out of the park with this particular graphics card. I’m actually really impressed with what they’ve done with it. It’s a good 4K gaming graphics card, and yes, FSR 3 is rubbish in games like Cyberpunk, but, you know, you can actually play natively. That’s the brilliant thing with this card.

And I’m hoping in time FSR 4 support does get rolled out more, and then I’ll come back to that point in future videos. Equally, I’m aware I haven’t really gone into overclocking or undervolting much in this video. I did show undervolting a little bit, where I took 100 watts off things and only had like a 7 FPS average drop in Guardians of the Galaxy.

But I mainly didn’t cover that because, you know, the average user doesn’t actually want to play around with that too much. I will probably do a separate video on that in the future, though, and if you want that, please do let me know down in the comments.

But if you’ve got any questions or anything like that, please let me know down in the comments. This video took a long time to put together, so if you enjoyed it, please click the thumbs up button, and please subscribe to see more videos like this. Thanks for watching.