In the world of PC gaming, a single problem can have a solution as simple as reseating a loose cable or as serious as a faulty or damaged component, with a whole spectrum in between. For someone not in touch with their geeky side, diagnosing the problem and finding a working solution can be time-consuming and frustrating. If your OS fails to detect a component as expensive as a GPU, it can be alarming. Thankfully, if you're still getting a display out of your PC, you have an integrated GPU to rely on, so not all is lost. Here are a few fixes to try before you take a shovel to your backyard and bury your old graphics card for good.
What causes your GPU to go undetected?
The GPU is a sensitive piece of hardware with lots of smaller components inside it that can malfunction. If it’s a hardware fault, it can be difficult to pin down exactly what’s causing the problem without sending it back to your GPU manufacturer or your nearest repair shop. For any sort of hardware problem, we suggest that you don’t open your GPU in hopes of repairing it yourself. Doing this can void the warranty on your GPU if you still have one, or you could end up making matters worse by further damaging the graphics card.
If the GPU isn't faulty, the most probable culprit is the drivers. GPU drivers can get corrupted, and occasionally even updating to a newer driver version triggers this issue. In some scenarios, your OS automatically switches from the dedicated GPU to the integrated one and disables the former. The problem can also be a GPU sitting loosely in its PCIe slot, or a PCIe power connector that isn't fully seated in the card. The fixes for these issues are pretty straightforward, so we'll start with the most common causes and work toward the rarer ones. Hopefully, the problem with your GPU is fixed early on, and you don't have to send it in for repairs.
Enable your GPU from Device Manager
If Windows disabled your GPU, then simply re-enabling it might fix your problem. Here’s how you do it:
- Open the Start Menu and search for Device Manager. Open it when it shows up.
- Double-click on the Display adapters section to make it expand. If your GPU was indeed disabled, you’ll see it here.
- Right-click on your GPU and select Enable device to get it running again.
The Device Manager is a good place to start diagnosing the problem with your GPU. If you can't find your GPU under Display adapters, chances are it's a driver issue. In such cases, it can sometimes show up when you turn on Show hidden devices from the View menu of the Device Manager. However, if your GPU does show up but is grayed out, that could be an early hint of a more sinister problem, such as hardware failure. This happens when the GPU drivers are installed correctly but are unable to recognize the card due to an underlying hardware fault.
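When a device is disabled, missing a driver, or faulty, Device Manager attaches a numbered problem code to it (visible in the device's Properties window). The mapping below covers a few codes Microsoft documents that line up with the situations described in this article; the suggested next steps are this guide's own troubleshooting advice, sketched as a small Python helper rather than anything official.

```python
# Common Device Manager problem codes and a hint at the likely fix.
# The code meanings follow Microsoft's documentation; the "next step"
# text is just this article's troubleshooting advice.
PROBLEM_CODES = {
    22: ("Device is disabled", "Right-click the GPU and choose Enable device"),
    28: ("Drivers are not installed", "Install the Nvidia/AMD driver package"),
    31: ("Windows cannot load the driver", "Reinstall or update the GPU driver"),
    43: ("Device reported a problem", "Often a hardware fault; re-seat the card or use the warranty"),
}

def diagnose(code: int) -> str:
    """Return a short human-readable hint for a Device Manager problem code."""
    meaning, action = PROBLEM_CODES.get(
        code, ("Unknown code", "Look the code up in Microsoft's documentation")
    )
    return f"Code {code}: {meaning}. Next step: {action}."

print(diagnose(22))
print(diagnose(43))
```

A grayed-out GPU showing Code 22, for instance, points at the simple enable fix above, while Code 43 is the one that tends to signal the hardware trouble this section warns about.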
Update your GPU drivers
For your GPU to function properly, you need a working set of drivers, preferably the newest ones. Updating your drivers is easy for both Nvidia and AMD GPUs, but it requires their respective applications: Nvidia users should have GeForce Experience, while AMD Radeon Software is the way to go for owners of AMD graphics cards. The process is essentially the same in both applications, but we'll use Nvidia as the example. Here's how you can update your GPU drivers:
- Open GeForce Experience if you already have it installed. If you don’t, it’s best to install it as it can help automatically update GPU drivers.
- Up top, you’ll see the Drivers section. Click on it.
- Here, you should see a new update for your Nvidia GPU. If you don't, click on the Check for Updates button; if you're running an older driver version, a new one should show up. Click on the Download button and wait for it to finish.
- Select either Custom Installation if you want to do a clean install or Express Installation if you want to install the update as soon as possible. Follow the instructions and wait until the update is installed, and you should be good to go.
Your screen might flash a couple of times during the installation, which is normal and nothing to worry about. After the installation is complete, restart your PC and check whether your GPU is now detected.
Re-slot your GPU and check the PCIe power connector
If the problem isn't fixed by enabling your GPU, updating the drivers, and restarting your PC, then you'll have to grab a screwdriver and probe the internals of your PC. Shut down the PC and unplug it from the wall before opening the case. The first order of business is to take your GPU out of the PCIe slot. While it's out, feel free to clean it a bit if it's excessively dusty, then put it back into the PCIe slot. Make sure it's fully inserted and you hear the retention clip click. Reconnect the PCIe power connector and ensure it's fully seated in the GPU. Restart your PC, and the card should be detected again.
If you’ve had heating problems with your GPU, then reapplying thermal paste to your GPU might be worth considering. This can bring down temperatures and prevent your GPU from overheating and shutting down.
What to do if nothing works?
If your GPU has a hardware issue, there isn't much you can do about it yourself. Hopefully, your card is still under warranty, in which case you can have it repaired at no cost. Otherwise, you can send the card to its manufacturer (Asus, MSI, EVGA, etc.) and pay to have it repaired, or try your local or nearest computer shop if it provides that service.
Sometimes, a faulty graphics card can be a good thing as it can push you to get a newer, more powerful GPU. You can get one of the best GPUs as a replacement, and let your trusty old card finally rest in peace, or go for the budget GPU picks if you’re strapped for cash.