Who Needs A GPU? (/s) Gaming On My UPGRADED Ryzen 7600 PC
Who needs a GPU? After all, we’re in the age of APUs – CPUs with integrated graphics. Do these allow you to play modern PC games without needing to buy a really expensive graphics card?
Well, yes and no. Last year I tried this out, and had pretty bad gaming results on my AMD Ryzen 7600 6-core CPU. HOWEVER, since then I have upgraded the RAM, storage and cooling, and my CPU-only gaming performance is MUCH better – as I show off in this video when playing the following games:
- Skyrim
- The Witcher 2
- OG Crysis
- Rocket League
- FFX HD Remastered
The new specs are:
- Mobo: ASRock B650M Pro RS
- CPU: Ryzen 7600
- RAM: 32GB 6000MHz CL30 Corsair Vengeance
- Storage: WD_BLACK SN850X 4TB Gen 4 NVMe SSD
If you prefer text over video, please read on for the guide/transcript version of this video.
Video Transcript And Guide
Hey everyone, who needs a graphics card, right? All you need is a Ryzen 7600 CPU, no graphics card, and you can game on it all day long, right? Well, clearly not.
Last year, I did exactly that in my Homelab NAS system, and the gaming results weren’t very good unless you wanted to game at 720p, which you almost certainly didn’t want to do. But since then, I’ve given the system a bit of a glow-up, to use the cool kids’ phrase.
Making My ‘NAS’ More Powerful
I basically upgraded the case to have more cooling. I also changed the stock cooler to a really nice ID-COOLING one. With the RAM as well, I went from 8 gigs of really slow RAM to two sticks of CL30 6000MHz RAM, and that makes a big difference, especially with AM5. Finally, I moved everything from a SATA SSD to an NVMe SSD.
Will that lead to better gaming results? Well, yes, it almost certainly will, as we’ll look at in just one minute. But because I have extra thermal headroom as well, I thought, in this video, it would be cool to play around with overclocking and see whether I could get slightly more FPS. And finally, I invested £5 of my hard-earned money in Lossless Scaling. So, towards the end of this video, I’ll play around with that and see whether gaming performance can be improved even more on a Ryzen 7600 system with no GPU.
Old Results Recap
Before I dive into the new results, which I’ll do in just one minute, I wanted to very briefly recap the old results. Skyrim looked quite good at 4K, but we were averaging 8 FPS. It was more of a slideshow, not exactly very playable. And we’ve seen this throughout all of the games. OG Crysis, if we dial it down to 1080p, averaged around 23 FPS, which is almost playable, I guess. You still wouldn’t really want to rely on that. It does struggle.
While it’s pretty impressive that Crysis can run on an integrated graphics CPU and no GPU, it’s still not really great. If we look at all the results here, 4K and 1440p were never playable. 1080p was barely playable, and it was only really at 720p that things were half-decent. That’s why I wanted to do this test to upgrade the hardware because you don’t really want to game at 720p.
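The pattern in those old results lines up with raw pixel counts: an iGPU is mostly fill-rate limited, and each step up in resolution multiplies the pixels it has to shade every frame. A quick back-of-envelope sketch (only the resolutions are hard numbers here; the playability comments are the measurements above):

```python
# Pixels the iGPU has to shade per frame at each resolution.
resolutions = {
    "4K":    (3840, 2160),
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
    "720p":  (1280, 720),
}

pixels_720p = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / pixels_720p:.2f}x 720p)")
# 4K is 9x the pixels of 720p, which is why only 720p was half-decent
# on the old hardware.
```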
Enter The Only Fans Upgrade
Enter the “Only Fans Build” with some upgraded hardware like the RAM and the storage, but obviously extra cooling as well. Immediately, we saw an uplift. In The Witcher 2, previously, it was unplayable at 1080p, but now, we’re getting an average of around 29 FPS. Previously, that was just 12 FPS. That’s quite a significant improvement. Yes, it’s still not perfect, and sometimes we had screen tearing and things like that, but overall, it was a lot better, especially considering there’s no graphics card here.
I know this is an older game, but I think this is actually a fairly good result: getting two to three times the performance. But it’s not all sunshine and rainbows. I did sort of cherry-pick one of the better sets of results there, and it was quite a good improvement, to be fair.
Next, we look at Rocket League, which didn’t fare quite as well. At 1440p, we were averaging around 22 FPS, which at least is slightly better than the 19 FPS rates we were seeing before, but it still wasn’t very good. It’s just a little bit jittery overall, so you wouldn’t really want to play on this. 1080p, though, is a bit better, as you can see here, but actually no real improvement compared to the old hardware—or maybe losing 1 FPS. Overall, we didn’t see that much change apart from at 4K, where Rocket League didn’t run at all on the old system. It was just too laggy and wouldn’t boot up. If you want to game at 13 FPS then, goodness, you could do that, but you probably wouldn’t want to. Overall, no real change here.
We did see more difference in something like Skyrim, for example. On that, you can see here, everything is a lot better. At 1440p, we get an almost playable 27 FPS. The 1080p is 41 FPS average, which is actually really good. Remember, there’s no graphics card here—this is just on a CPU that’s not actually designed for gaming. So, I think this is actually a fairly impressive result. As we’ve seen earlier, we see quite a nice improvement in Witcher 2 as well.
720p is a dramatic improvement, doubling performance. 1080p is more than doubling performance, and we see that again with 1440p and 4K. Yeah, you’re not going to want to game at, you know, 20 FPS average at 1440p, but 29 FPS at 1080p is now something you might actually want to game on. Moving to two sticks of fast RAM is, I think, the main reason for the improvement.
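Why would two sticks of faster RAM matter this much? An iGPU has no dedicated VRAM, so system memory bandwidth is effectively its VRAM bandwidth, and a single stick means single-channel operation. Here's a rough sketch of the theoretical peak; I'm assuming DDR5-4800 for the old 8GB stick purely for illustration, since I don't have its exact spec to hand:

```python
def peak_bandwidth_gbs(mt_per_s: int, channels: int, bus_width_bytes: int = 8) -> float:
    """Theoretical peak DDR bandwidth in GB/s:
    transfers/sec * bytes per transfer * number of channels."""
    return mt_per_s * 1e6 * bus_width_bytes * channels / 1e9

# Old setup: one stick, so single-channel. DDR5-4800 is an assumption.
old = peak_bandwidth_gbs(4800, channels=1)
# New setup: two sticks of DDR5-6000 running in dual channel.
new = peak_bandwidth_gbs(6000, channels=2)
print(f"old ~{old:.1f} GB/s, new ~{new:.1f} GB/s ({new / old:.1f}x)")
```

Even if the old stick was faster than my guess, going from single to dual channel alone doubles the theoretical bandwidth feeding the iGPU, which tracks with the roughly 2x FPS gains.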
But next up: Can it run Crysis? Well, yes, yes, it can. Actually, amazingly, we see a lot better improvement at all resolution levels. Yeah, you wouldn’t want to play at 1440p again because it’s only 18 FPS average, but at 1080p, now you’ve got a playable 32 FPS. I know that’s lower than the 60 FPS average some people like, but I think if you’re going to game without a graphics card, maybe aiming for 30 FPS might be a better idea.
However, unfortunately, Final Fantasy X HD Remaster just seemed to struggle. It struggled last year when I did these tests, and it’s still struggling now. Yes, we did get a doubling of FPS rates at 4K, but obviously, 6 FPS still isn’t playable—that’s a slideshow. But what you will notice is the temperatures.
Overclocking FTW..?
Previously, the GPU reading was 54°C. Because this is integrated graphics, that’s effectively the CPU. Looking at that same GPU reading now, we’re down to 38°C, which means, of course, that we’ve got more thermal headroom on the system. It no longer runs as hot, which can only mean one thing: overclocking.
But be forewarned, I’m not very good at overclocking, and my CPU is probably drafting its resignation letter as we speak. So, here we go. Nothing to be concerned about. I went into the BIOS and opened the OC Tweaker section. I haven’t done this in a long time, but I read on Reddit that you can get around a 5.4 GHz overclock with an increase in the voltage. So, I tried that and booted up into Windows. Actually, it seemed stable—or at least Task Manager showed it running at 5.4 GHz. So, I thought I’d run Prime95 and actually test this out.
I’ll skip ahead a bit now, but although the temperatures were high, they weren’t substantially high, and things seemed to be okay. Wait, I achieved a 5.4 GHz overclock on my very first attempt? I’m a hacker! Or a cracker! Or I’m an overclocking god! We need to jump into a game and actually see the gaming performance.
Giving Up (Sort Of)
Yeah, so as you can tell from my sarcasm, previously we were seeing an FPS average of 15 at 4K. Now, it was back down to a slideshow of around 6 to 8 FPS, so it was dramatically lower. I think, even though I had run Prime95 for a whopping two or three minutes, something clearly was wrong. I probably needed to run it for longer, and we would have seen that something was actually not very stable with the overclock.
So, I gave it a bit more voltage, increased the frequency for some stupid reason—because, as I said, I’m not great at overclocking—and then I saw something that said “graphical clock.” So, I increased that too. What could go wrong, right? Well, we’re about to see, aren’t we?
So, this is back in the system, and Prime95 was running, but then we started getting errors. You can see there one of the cores failed, and then one of the other cores failed. So, Prime95 is detecting a hardware fault, and in the Task Manager, you can see there two out of the six cores have completely failed. That’s what we were probably seeing before—you know, my overclock wasn’t stable, and that was affecting FPS rates.
So, to be honest, at this point, I just thought, let’s go back to Auto—in other words, turn off the overclock—and see what else we can do. Obviously, the AMD software actually allows you to overclock as well. So, I booted back into the system, and in the AMD software, I turned on the overclock there, which gives you a 0.2 GHz increase, which actually, on this CPU, isn’t too bad, to be honest.
The system rebooted. You can see the CPU voltage there has gone up to around 1.4 volts, but actually, it achieved a stable overclock—hooray! And because this is through the AMD software, hopefully, it’s going to be a bit more stable than my own manual attempts, which weren’t exactly very good. Prime95 was stable for 10 minutes, so I launched into gaming, and I saw better gaming results.
So, whilst previously at 4K, we saw 15 FPS, this now was creeping up to around 16 or 17 FPS average. Not exactly dramatic, but it did actually get better results overall when I was playing, which is promising. So then, at 1440p, I tried that out as well. Previously, we were seeing an average of 27 FPS. This then went up to around 29 or 30 FPS. Again, clearly a bit better. I think this means that the overclocking approach I took with the AMD software is probably a little bit better than me manually trying to do it and completely messing it all up.
Because, yeah, this is actually fairly good. At 1440p, we’re now getting a stable 30 FPS average, which I think is actually fairly impressive for this CPU. Actually, I think that’s pretty good. I mean, I went with the 7600 CPU, not the 7600X. So, the fact that I did eke out a little bit of extra gigahertz, and that resulted in slightly more FPS, is actually something I’m fairly happy with.
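As a sanity check on those numbers, the FPS gain is in the same ballpark as the relative clock bump. This assumes a stock boost of roughly 5.1 GHz for the Ryzen 5 7600, which is the advertised figure; what any individual chip actually sustains will vary:

```python
stock_boost_ghz = 5.1   # advertised boost clock of the Ryzen 5 7600 (assumption)
oc_bump_ghz = 0.2       # what the AMD software's auto-overclock added

clock_gain = oc_bump_ghz / stock_boost_ghz
fps_gain = (29 - 27) / 27   # the 1440p Skyrim result: 27 FPS -> ~29 FPS

print(f"clock +{clock_gain:.1%}, FPS +{fps_gain:.1%}")
# Both gains land in the low single digits, which is roughly what you'd
# expect when the bottleneck scales with clock speed.
```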
Lossless Scaling
Next, we can look at Lossless Scaling and see whether we can push things further to get even better gaming results. So, I firstly wanted to try out upscaling without the frame generation. I put Crysis in windowed 1440p mode and then went over to Lossless Scaling. So, we got LS1 there, I turned off the frame generation, and I clicked on “scale.”
I went back to Crysis, and okay, at the top left, you can see the actual FPS rate is lower. Instead of getting like a 20 FPS average, it’s a little bit lower, because there is an overhead to the upscaling. But overall, this looks really good apart from the low FPS rates.
So, I then wanted to try out frame generation. I know you really need a base of 60 FPS to get good results, but I thought, let’s just try it out along with the upscaler. So, it’s upscaling there up to 4K, and immediately, we were in jelly mode, which, as we know now, is due to the frame generation and not the upscaler, because we weren’t getting that before. Clearly, this isn’t great. We’re down to like 9 or 10 real frames because of the overhead from both the frame gen and the upscaler. Even though frame generation brings it back up to a 20-odd FPS average, it’s just not very good. We can tell that.
So that’s not a good option.
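What happened there can be modelled very roughly: frame generation first costs some real rendering performance (on an iGPU it competes with the game for the same silicon), then doubles whatever real frames are left. The overhead figure below is made up for illustration, since Lossless Scaling doesn't publish an exact cost:

```python
def frame_gen_output_fps(base_fps: float, overhead_fraction: float,
                         multiplier: int = 2) -> float:
    """Presented FPS after frame generation: the real frame rate drops by the
    overhead, then each real frame gets (multiplier - 1) generated companions."""
    real_fps = base_fps * (1 - overhead_fraction)
    return real_fps * multiplier

# Illustrative only: a ~20 FPS base and a 50% combined hit from the upscaler
# plus frame gen roughly matches the "9 or 10 real frames, 20-odd presented"
# behaviour seen in Crysis above.
print(frame_gen_output_fps(20, overhead_fraction=0.5))
```

The takeaway is that the presented number can look unchanged while the real frame rate, and therefore responsiveness, has halved, which is exactly the jelly effect.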
One thing I did want to try out, though, before moving to another game, is upscaling from 1080p all the way to 4K via a scale factor of two. I turned off frame generation here, but I wanted to see how this would actually perform. Usually, this much of an upscale wouldn’t look very good, but actually, this is pretty damn good. I’m actually really happy with this. You know, the fact this is 1080p upscaled to 4K—you wouldn’t immediately know that. I mean, it all looks pretty good and sharp. You haven’t got a complete blurry mess.
The grass is the main giveaway: sometimes on the edges you can see some artifacting, and it looks a bit too blurry. You can see a little of that there, but I think that’s just one of the things you’ll see with a two-times upscaling factor. Overall, though, this was actually not too bad. I’d happily play like this, to be honest.
Scaling Skyrim To 4K
So, I decided to move on to another game, and I got Skyrim there at 1080p. I decided to upscale that to 4K, so again the two-times mode. Again, it looks pretty good considering there’s no GPU here. I think this is pretty good. Annoyingly, though, the mouse did sometimes glitch up. You can see there I’ve got the mouse and the cursor. I think Lossless Scaling just bugged up a little bit, and as a result, shooting was sometimes a little bit annoying. But overall, I actually like how this looks. I’m getting a 30 FPS average here, and it all looks and feels pretty good.
So next, I thought I’ll try it with frame generation, two-times, and I wasn’t expecting much. But this is upscaled to 4K, then with the two-times frame generation, and you can see it’s gone down to like a 20 FPS average base and then obviously goes back up to 40 frames due to the frame generation. It’s not great. You kind of got that jelly effect. We’ve also still got the mouse cursor there, so don’t worry, that’s not yours—that’s mine. But actually, it wasn’t too bad. There was less sort of jelliness than we had with Crysis, in my opinion.
Although it was a bit harder to actually shoot, you know, I wouldn’t actually play like this because I think the input latency or the mouse bug, one of the two, did mean that it just wasn’t really a stable way of doing things. But thankfully, Lydia’s there finishing the task because I can’t shoot for anything right now. But there we are—go, Lydia, go! You can do it.
Actually, this is pretty cool. I mean, not necessarily some of the bugs I’m seeing with frame generation, but actually being able to upscale like 1080p to 4K without a graphics card and getting fairly decent results is actually fairly impressive to me.
Now, if we look at the overall results, they are a lot better than what we had previously with the old system. 4K is still not playable, but 1440p starts to become playable in something like Skyrim. 1080p is then playable on pretty much every game other than Final Fantasy HD. Again, because we don’t have a graphics card, I’m looking at 30 FPS as being my goal here and not 60 FPS. But if you did want 60 FPS rates, then 720p would be achievable in a few of the games I looked at. So actually, fairly impressive.
Final Thoughts
I think we’ve pushed things as far as we can. Lossless Scaling is a good tool for upscaling—I mean, that looked really good. Previously, it was a blurry mess when I tried scaling from like 720p or 1080p onto this 4K monitor, but the tool actually helped quite a lot. Frame generation, not so much. However, that is to be expected, because the tool does say you need a really stable 60 FPS before you enable frame gen, precisely to avoid artifacts like that.
Also, the upscaler itself has a performance hit, so the moment you turn it on, you’re going to get lower FPS rates. That’s why they recommend having that 60 FPS base. As a result, it’s not really worth diving into frame generation at 1080p or 1440p, for example, because you just don’t have a high enough FPS base for frame generation to work well.
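The 60 FPS recommendation is largely about latency: interpolation-style frame generation has to hold back at least one real frame before it can present anything in between, so the delay you feel scales with the real frame time. A rough sketch (ignoring the generator's own processing cost):

```python
def frame_time_ms(fps: float) -> float:
    """Time to render one real frame, in milliseconds."""
    return 1000.0 / fps

# Frame gen interpolates between two real frames, so input latency grows by
# roughly one real frame time on top of normal rendering latency.
for base_fps in (60, 30, 20):
    print(f"{base_fps} FPS base -> ~{frame_time_ms(base_fps):.1f} ms of added latency")
```

At a 60 FPS base that penalty is under 17 ms and hard to notice; at the 20 FPS bases I was working with, it's 50 ms, which is why shooting felt so mushy.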
But actually, I’m overall pleasantly surprised at just how much extra gaming performance I actually had just from changing the two RAM sticks, upgrading the cooling, and moving to an NVMe SSD. My hunch is it’s probably the RAM that actually increased things the most.
But I hope you enjoyed this video. If you did, please click the thumbs-up button, and please subscribe to see more videos like this. Thanks for watching!