Have you ever faced really low FPS, say 5 or 10, even though you have a dedicated graphics card installed?
If so, your PC, or the application you are running, might be using the built-in graphics, the integrated GPU, instead of your dedicated graphics card.
In order to fully utilize the dedicated graphics card, which you probably paid a lot for, you might need to disable the integrated GPU.
How to Check What GPU Is Currently in Use?
By default, when we connect the monitor to the motherboard, the system uses the integrated GPU. However, if you want your PC to use the dedicated GPU, you need to connect the monitor to the graphics card.
Alternatively, you can use Windows settings to determine which GPU is currently in use. Please follow these steps to check the in-use GPU.
Press Windows + I to open Settings.
Go to System > Display > Advanced Display.
Under Display information, you can see the GPU that a monitor is currently using.
Here, depending on the number of monitors connected to your PC, you may see one or multiple displays.
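If you prefer the command line, you can also list the adapters Windows knows about with the legacy `wmic` tool and parse its output. Here is a minimal Python sketch; the parsing accepts pre-captured text so you can see the logic without running on Windows, and `wmic` is assumed to still be present on your system:

```python
import subprocess
from typing import Optional

def list_gpus(raw: Optional[str] = None) -> list:
    """Return the GPU names Windows reports.

    Pass `raw` to parse pre-captured output; otherwise shell out to
    `wmic path win32_VideoController get name` (Windows only).
    """
    if raw is None:
        raw = subprocess.check_output(
            ["wmic", "path", "win32_VideoController", "get", "name"],
            text=True,
        )
    lines = [line.strip() for line in raw.splitlines()]
    # Drop the "Name" header row and blank lines.
    return [l for l in lines if l and l.lower() != "name"]

# Example with captured output from a PC that has both GPUs:
sample = "Name\nIntel(R) UHD Graphics 630\nNVIDIA GeForce RTX 3060\n"
print(list_gpus(sample))  # ['Intel(R) UHD Graphics 630', 'NVIDIA GeForce RTX 3060']
```

If only one name appears in the list, your PC is exposing a single GPU to Windows.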
How to Differentiate Integrated and Dedicated GPU?
Using the Task Manager, you can get the details about your GPU. This includes details about your integrated and dedicated graphics card.
Press Ctrl + Alt + Delete simultaneously.
Select Task Manager to open it.
Click on More details if the Task Manager only shows the list of applications running.
Now, go to the Performance tab.
Here, if you see both GPU 0 and GPU 1, your PC has an integrated and a dedicated GPU. GPU 0 is usually the integrated one.
Now, check the Dedicated GPU Memory and Shared GPU Memory values. A dedicated graphics card reports a large Dedicated GPU Memory figure (its onboard VRAM), while an integrated GPU relies mostly on Shared GPU Memory borrowed from system RAM.
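Telling the two apart from the adapter name is mostly a matter of pattern matching, the same way you would eyeball the Task Manager entries. The keyword lists below are illustrative assumptions covering common Intel, NVIDIA, and AMD names, not an official API:

```python
# Heuristic sketch: classify a GPU as integrated or dedicated from its name.
# The hint lists are examples only; adjust them for your own hardware.
INTEGRATED_HINTS = ("intel(r) uhd", "intel(r) hd", "iris", "vega", "radeon graphics")
DEDICATED_HINTS = ("geforce", "rtx", "gtx", "radeon rx", "quadro")

def classify_gpu(adapter_name: str) -> str:
    """Return 'integrated', 'dedicated', or 'unknown' from common name patterns."""
    name = adapter_name.lower()
    if any(hint in name for hint in DEDICATED_HINTS):
        return "dedicated"
    if any(hint in name for hint in INTEGRATED_HINTS):
        return "integrated"
    return "unknown"

print(classify_gpu("NVIDIA GeForce RTX 3060"))    # dedicated
print(classify_gpu("Intel(R) UHD Graphics 630"))  # integrated
```

Dedicated hints are checked first so that a name like "AMD Radeon RX 6600" is not misread as integrated by the generic "radeon graphics" pattern.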
How to Disable Integrated Graphics
There are two ways to disable the integrated graphics card: from Device Manager or from the BIOS. Now, let us discuss each method and see which one is recommended.
From Device Manager
Since the Device Manager has the list of all drivers installed on your computer, disabling the integrated graphics driver will disable the integrated GPU as well.
Please follow these steps to disable integrated graphics from Device Manager.
Press Windows + X.
Click on Device Manager and expand Display adapters.
Here, right-click on the integrated graphics driver.
Then click on Disable device.
In case your dedicated graphics card fails and you have disabled your integrated graphics card, your monitor may not display anything even when you connect the monitor to the motherboard.
As for laptops, when you disable both the integrated and dedicated GPUs, the OS will automatically fall back to Microsoft’s Basic Display driver.
From BIOS
To disable the integrated graphics card in the BIOS, you need to set the dedicated GPU as the primary graphics adapter.
Enter the BIOS by pressing the Delete or F2 key during boot, depending on the motherboard.
The setting that controls the primary graphics adapter goes by different names depending on the BIOS.
Set the primary graphics adapter to PCI / PCIe instead of Auto or IGFX, and set VGA priority to Offboard.
If you cannot find these settings, please refer to the motherboard’s user manual to navigate the BIOS.
Should You Disable Integrated Graphics Card?
A desktop PC with a graphics card will use the dedicated GPU and ignore the integrated one, as long as the monitor is connected to the graphics card rather than the motherboard. But this is not always the case, as laptops use both the integrated and dedicated GPU.
So, should you or should you not disable integrated GPU? Well, the answer to this question depends on whether you are a desktop or a laptop user.
If you are a desktop user and your monitor is connected to the dedicated GPU, you can disable the iGPU, as the PC will automatically use the dedicated card for display.
However, if your graphics card dies, and you have disabled iGPU from the BIOS, your screen may go blank. You may need to reset the BIOS to resolve this issue.
On laptops, tasks are switched between the integrated and dedicated GPUs. The graphics card handles graphics-intensive tasks such as video rendering and gaming, whereas the integrated GPU handles lightweight applications such as Discord or a web browser.
Therefore, it is not recommended to disable integrated graphics on laptops, as they rely on both GPUs to operate smoothly. If you do disable the integrated graphics, the OS will switch to Microsoft’s Basic Display driver and software-based video processing will take over.
How Can You Run an Application With the Dedicated Graphics Card?
If you are on a laptop, you may face issues with the two GPUs, integrated and dedicated. Since laptops switch between the two depending on the workload, some graphics-heavy applications may, for whatever reason, end up running on the integrated GPU.
In that case, you will need to set the application’s graphics preference to High performance so that it uses the dedicated graphics card.
Please follow these steps to run an application with a dedicated graphics card.
Press Windows + I to open Settings.
Navigate to System > Display > Graphics.
Select the application you want to run at high performance and click Options.
Here, check High performance, then click Save.
Now, whenever the application opens, it will use your dedicated graphics card.
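For reference, Windows stores this per-app choice as a string value in the registry, under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, named after the executable’s path. The sketch below only builds the value string ("GpuPreference=2;" corresponds to High performance); the registry location is stated as I understand it, so verify it on your own system before scripting against it:

```python
# Mapping of GPU preference modes to the numeric codes Windows uses
# in the UserGpuPreferences registry key (assumed values: 0 = let
# Windows decide, 1 = Power saving, 2 = High performance).
PREFERENCES = {"auto": 0, "power_saving": 1, "high_performance": 2}

def gpu_preference_value(mode: str) -> str:
    """Build the registry data string for a given GPU preference mode."""
    return f"GpuPreference={PREFERENCES[mode]};"

print(gpu_preference_value("high_performance"))  # GpuPreference=2;
```

Actually writing the value would require the `winreg` module on Windows; the Settings app does the same thing for you, which is why the GUI route above is the safer option.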
Does Disabling Integrated GPU Improve Performance?
If your computer runs solely on an integrated GPU, disabling it will not improve performance. And if you have a dedicated graphics card, the PC already uses it for graphics-intensive tasks, so disabling the integrated GPU brings little to no gain either.
However, if you have connected the monitor to the motherboard, the PC will always render through the integrated GPU, which does lower performance. So, if you have a separate graphics card, connect the monitor to the card itself.
What Happens When We Disable All Display Drivers At the Same Time?
When you are on a laptop, the integrated and dedicated GPUs work together. So, when you disable both graphics drivers, the OS falls back to Microsoft’s Basic Display driver.
However, if you are on a desktop PC, it is not a good idea to disable the dedicated graphics driver: you will not get anything displayed on your monitor.
If that happens, you will need to connect the monitor to your motherboard video output port.