“CPU Virtualization” is one of those cool-sounding technologies that you just have to enable. Y’know, not because it sounds cool – but because, uhm, you’re about to run an application that really needs it. That’s totally why.
But is it potentially a bad thing to enable CPU Virtualization? After all, some people warn that it can have security implications.
Some motherboard manufacturers ship their BIOSes with virtualization disabled, thinking that most users won’t need it. But there certainly are some good uses of virtualization, as I cover in this video:
If you prefer text over video, please read on for the guide/transcript version of this video.
Video Transcript And Guide
Intro
Hey everyone, if you’ve ever looked at Task Manager you might have seen that “CPU Virtualization” is marked as “disabled”:
Does this mean that your CPU is a dud, or is there a genuine reason it’s disabled? Well, in general, if you only plan on gaming then you can PROBABLY leave it off – but there are a bunch of good reasons to enable it too.
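By the way, if you’d rather check this from a script than by opening Task Manager, here’s a rough Python sketch that runs Windows’ built-in systeminfo tool and fishes out the relevant line (this is just one illustrative approach – there are plenty of others):

```python
import subprocess

# Run Windows' built-in "systeminfo" tool and capture its report.
# When no hypervisor is active yet, the report ends with a
# "Hyper-V Requirements" section that includes a line like:
#   "Virtualization Enabled In Firmware: Yes"
report = subprocess.run(
    ["systeminfo"], capture_output=True, text=True
).stdout

for line in report.splitlines():
    if "Virtualization Enabled In Firmware" in line:
        print(line.strip())
        break
else:
    # If Hyper-V is already running, systeminfo instead reports that
    # a hypervisor has been detected, so the firmware line is absent.
    print("Firmware line not found (a hypervisor may already be running).")
```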
So what is CPU virtualization? Well, it’s a way of “splitting” your computer’s hardware up in such a way that it can be shared with other operating systems. So if you’re, say, a programmer, you can install Linux “within” your Windows install and THEN make use of Linux’s improved support for compiling and running various programming tools.
Or you can download VirtualBox on Windows, install an old OS and play loads of decades-old games.
Although to be honest, in THAT case, using something like DOSBox might be a better option (and DOSBox doesn’t usually require virtualization support, since it’s an emulator, not a virtualizer).
Which CPUs Support Virtualization?
Nearly all CPUs have virtualization support nowadays – especially if you have an Intel or AMD CPU from the last decade or so. All of AMD’s Ryzen processors support virtualization, as do many of the pre-Ryzen chips like the AMD Athlon X4 860K from a decade ago. I feel old.
Pretty much all of Intel’s CPUs support virtualization too, even their older Pentiums from AGES ago. Where you MIGHT start to see CPUs without virtualization support, though, is among some budget “mobile” CPUs from more than a decade ago – like the Pentium B980 from 12 years ago. That’s a LAPTOP chip and it just doesn’t support CPU virtualization. Most modern laptop CPUs DO support it, though.
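As a quick aside, if you happen to have a Linux machine handy, you can check whether YOUR chip supports hardware virtualization by looking at the CPU flags. Here’s a tiny Python sketch – note that this checks what the CPU is capable of, not whether the feature is actually switched on in the BIOS:

```python
# Read the CPU flags that Linux exposes in /proc/cpuinfo.
# Intel chips advertise "vmx" (VT-x); AMD chips advertise "svm" (AMD-V).
with open("/proc/cpuinfo") as f:
    cpuinfo = f.read()

if " vmx" in cpuinfo:
    print("Intel VT-x supported")
elif " svm" in cpuinfo:
    print("AMD-V (SVM) supported")
else:
    print("No hardware virtualization flag found")
```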
Possible Security Risks
So if pretty much all CPUs since the release of Iron Man 3 support virtualization, why is it sometimes DISABLED by default (like it was on my Ryzen 9 5900X build)? Well, in rare cases some applications can have compatibility issues when it’s enabled, and IN THEORY it can have some security implications too: for example, a badly managed virtual machine setup could be susceptible to certain malware that targets VMs, and “VM sprawl” can also become an issue.
I said “in theory” earlier, though, because in REALITY, simply having virtualization enabled in the BIOS isn’t REALLY a risk in my opinion. It’s a bit like viruses: they aren’t really a danger to Windows users UNLESS those users take unnecessary chances and download a bunch of dodgy files from dodgy sources:
So really, virtualization should arguably ALWAYS be enabled nowadays – especially because Microsoft has recently rolled out features like Core Isolation and Memory Integrity, which USE virtualization to REALLY help beef up Windows’ security:
But I guess that some motherboard makers just err on the side of caution and think that because the majority of users won’t ever USE virtualization, they may as well leave it disabled.
Should You Enable It?
That brings us to the main question: should YOU leave CPU Virtualization on or off? Well, “Virtualization” sounds really cool so that’s enough reason to enable it, right? Maybe not. In reality, if you’re purely planning on gaming and surfing the internet, you can easily leave Virtualization off.
Even heavier applications like video editing don’t require it. But if you’re likely to be building software or messing around with really old applications, then virtual machines can be really useful – so enabling Virtualization in your BIOS can help a lot.
How To Enable CPU Virtualization
Okay, with all that said, how do you ACTUALLY enable virtualization? Well, there are hundreds of tutorials out there showing this process, so I won’t reinvent the wheel and take 10 minutes to show you exactly how to do this. But I WILL say that you need to enable a setting within the BIOS: on Intel systems the option is called “Intel(R) Virtualization Technology” or simply “VT-x”, while on AMD systems it’s often called “AMD Virtualization Technology” or “SVM”.
So on my Asus B550M-Plus motherboard, I can enable virtualization by heading into the BIOS, clicking Advanced Mode, then Advanced at the top, and then CPU Configuration. I then find SVM Mode and, as you can see, it says “Enable CPU Virtualization”.
I can then exit out of the BIOS and restart.
Great, so that’s now enabled and my PC will soon be hit by a wave of virtual machine hackers. Nope. It’s all fine. All Quiet on the Western Front. So now that it’s enabled, I could download VirtualBox and install Windows Vista. Ah, Vista – how I loved your UAC prompts:
Ugh gosh, Vista was terrible. Moving on though – virtualization allows me to test out pretty much ANY operating system I want to, all from my Windows machine, which is pretty awesome. Yes, you can often dual-boot instead, BUT that’s a clunky approach that involves fiddling with disk partitions and boot loaders. The beauty of VMs is being able to spin up a new OS, test it out (or do something specific on it, like building and running some software project from GitHub), and then tear down the virtual machine once you’re done with it. There’s no mess; it’s just simple.
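Just to illustrate how quick that spin-up/tear-down loop can be: VirtualBox ships with a command-line tool called VBoxManage, and you can script it from Python. Here’s a rough sketch – the VM name and settings are placeholders I’ve picked for illustration, and you’d still need to attach a disk or ISO before actually booting anything:

```python
import subprocess

def vbox(*args):
    # Run a VBoxManage command, raising an error if it fails.
    subprocess.run(["VBoxManage", *args], check=True)

VM_NAME = "scratch-vm"  # throwaway name, pick anything you like

# Create and register a bare VM, then give it some memory and CPU cores.
vbox("createvm", "--name", VM_NAME, "--ostype", "Linux_64", "--register")
vbox("modifyvm", VM_NAME, "--memory", "4096", "--cpus", "2")

# ...attach a disk/ISO, boot it, test your project...

# Once you're done, tear the whole thing down, config files and all.
vbox("unregistervm", VM_NAME, "--delete")
```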
Another possible use of VMs was covered by Craft Computing, who demonstrated how you can use virtualization to effectively SHARE your graphics card between multiple virtual machines, turning your single PC into a multi-user gaming hub. This is a pretty cool idea, and while it can be a bit clunky to set up, it could work out a LOT cheaper than building two separate gaming PCs for your home.
What Is Hyper-V?
Before wrapping up, I wanted to answer one final question: what is “Hyper-V”, y’know, the thing that’s directly below “Virtualization” in Task Manager? Well, Hyper-V is Microsoft’s own hardware virtualization platform. In some ways it’s a rival to software like VirtualBox and VMware, although Hyper-V runs at a slightly different level: it’s a “type 1” hypervisor that runs directly on the hardware, whereas VirtualBox is a “type 2” hypervisor that runs on top of the host operating system.
Basically, Microsoft rolled out Hyper-V to increase virtualization support throughout the entire Windows ecosystem, and also to introduce “native” Linux support within Windows – something known as WSL (the Windows Subsystem for Linux). But you don’t necessarily NEED Hyper-V to be enabled: some versions of VirtualBox and VMware don’t work alongside Hyper-V, meaning that you then need to disable Hyper-V to use them (but naturally you still need CPU virtualization to be enabled in the BIOS).
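And if you’re ever unsure whether Hyper-V is currently switched on (say, before installing VirtualBox), one option is to query the feature state with Windows’ built-in DISM tool. Here’s a small Python sketch – note that DISM needs to be run from an administrator prompt:

```python
import subprocess

# Ask DISM for the state of the Hyper-V optional feature.
# DISM needs an elevated (administrator) prompt to query the online image.
result = subprocess.run(
    ["dism", "/online", "/Get-FeatureInfo", "/FeatureName:Microsoft-Hyper-V"],
    capture_output=True, text=True,
)

for line in result.stdout.splitlines():
    if line.strip().startswith("State"):
        print(line.strip())  # e.g. "State : Enabled" or "State : Disabled"
        break
```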
That wraps up today’s video. I hope you found it useful. If you did, please click the thumbs up button. Please also subscribe to my channel if you haven’t already, and thank you for watching!
If you have any questions, feedback or suggestions about this article, please leave a comment below. Please note that all comments go into a moderation queue (to prevent blog spam). Your comment will be manually reviewed and approved by Tristan in less than a week. Thanks!