graphics-card interview questions

Top frequently asked graphics-card interview questions

How much power do video cards use after Windows turns your display off?

Modern video cards seem to use 150-200 Watts at idle. Does this mean that this is the minimum power the video card will ever draw while your computer is on? It's clear that if you are sitting at the Windows desktop and not doing much, this is the power you'll be drawing. If you're playing a game, you'll draw more power. But what about when your computer is idle long enough to trigger Windows' "Turn off the display" event? Will the video card use negligible power during this time, or still use idle power?

To be clear, I am not talking about the entire computer entering sleep or standby mode. I'm also not talking about simply pushing the power button on the monitor. I'm talking about the computer being on, visible on the network, possibly performing background or server tasks, but with the display off as a result of Windows power settings.
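One practical way to answer this for your own machine is to log the driver-reported board power before and after the display turns off. On NVIDIA cards the standard tool is nvidia-smi (the query flags below are real options); the `parse_power_draw` helper is a hypothetical sketch for turning its CSV output into numbers:

```python
import subprocess

def read_power_draw() -> list:
    """Query nvidia-smi for the current board power draw in watts,
    one value per GPU. Requires an NVIDIA GPU and driver."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_power_draw(out)

def parse_power_draw(csv_output: str) -> list:
    """Parse the noheader/nounits CSV output into floats."""
    return [float(line) for line in csv_output.split() if line]

# Illustrative sample output only -- two GPUs, values in watts:
print(parse_power_draw("23.41\n15.02\n"))  # -> [23.41, 15.02]
```

Logging this every few seconds across a "Turn off the display" event would show directly whether the card drops below its desktop-idle draw.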

Source: (StackOverflow)

How can I determine and set my primary graphics card?

I have a Lenovo W520 laptop with two graphics cards:

Device Manager showing "Intel(R) HD Graphics Family" and "NVIDIA Quadro 1000M"

I think Windows 7 (64-bit) is using my Intel graphics card (which I think is integrated), because I have a low graphics rating in the Windows Experience Index. Also, the Intel card has 750 MB of RAM while the NVIDIA has 2 GB.

  1. How do I know for certain which card Windows 7 is really using?
  2. How do I change it?
  3. Since this is a laptop and the display is built in, how would changing the graphics card affect the built in display?

Source: (StackOverflow)

Why did my graphics card explode?

How did my graphics card "explode" like it did in the picture I have here? And if I was just plugging into my on-board graphics, why wouldn't the PC boot up like it normally would? It wouldn't come up until I opened the cover and removed the card; that's when I found it like this.

[image of the damaged card]

EDIT: If the card was trying to still operate like this for a while, did it run the risk of damaging the PCI-E slot?

Source: (StackOverflow)

Is it possible to connect an external GPU via Ethernet?

I have a laptop with a working Ethernet port, but I always use WiFi. I am wondering whether it is possible to run and use a graphics card (with an external power supply) connected to the Ethernet port, with some kind of emulation layer presenting the Ethernet-attached GPU to the system as a PCI device.

A Cat6 cable can do 10 Gbps, which should be enough for a GPU to run and play games.

Could this be possible?
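Setting aside the protocol problem (an Ethernet NIC does not speak PCIe transactions, so that "emulation" layer is the hard part), a quick comparison of the nominal link rates shows how much narrower the pipe would be. The figures below are the standard published rates, not measurements:

```python
# Nominal link rates (standard published figures).
ethernet_gbps = 10.0                    # 10GBASE-T over Cat6
pcie3_lane_gbps = 8.0 * 128 / 130       # PCIe 3.0: 8 GT/s, 128b/130b encoding
pcie3_x16_gbps = 16 * pcie3_lane_gbps   # a typical desktop GPU slot

print(round(pcie3_x16_gbps, 1))                  # usable Gbit/s of an x16 slot
print(round(pcie3_x16_gbps / ethernet_gbps, 1))  # how much wider the slot is
```

So even before latency and protocol overhead are considered, the slot a GPU expects is roughly an order of magnitude faster than the cable.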

Source: (StackOverflow)

How does the CPU and GPU interact in displaying computer graphics?

Here you can see a screenshot of a small C++ program called Triangle.exe with a rotating triangle based on the OpenGL API.

[screenshot of the rotating triangle]

Admittedly a very basic example, but I think it's applicable to other graphics-card operations.

I was just curious and wanted to know the whole process, from double-clicking Triangle.exe under Windows XP until I can see the triangle rotating on the monitor. What happens, and how do the CPU (which first handles the .exe) and the GPU (which finally outputs the triangle on the screen) interact?

I guess the following hardware and software, among others, are involved in displaying this rotating triangle:

Hardware:

  • HDD
  • System memory (RAM)
  • CPU
  • Video memory
  • GPU
  • LCD display

Software:

  • Operating system
  • DirectX/OpenGL API
  • Nvidia driver

Can anyone explain the process, maybe with some sort of flow chart for illustration?

It need not be a complex explanation that covers every single step (I guess that would go beyond the scope), but one that an intermediate IT guy can follow.

I'm pretty sure a lot of people that would even call themselves IT professionals could not describe this process correctly.
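One small, concrete piece of that chain is the per-frame vertex transform: the application (on the CPU, or a vertex shader on the GPU) multiplies each triangle vertex by a rotation matrix before the GPU rasterizes it. A sketch of just that step, in Python rather than C++/OpenGL for brevity:

```python
import math

def rotate_z(x: float, y: float, angle_rad: float) -> tuple:
    """Rotate a vertex about the Z axis -- the math behind the
    spinning triangle, redone every frame with a new angle."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y)

# The triangle's three vertices, rotated half a turn:
triangle = [(0.0, 1.0), (-1.0, -1.0), (1.0, -1.0)]
rotated = [rotate_z(x, y, math.pi) for x, y in triangle]
print([(round(x, 6), round(y, 6)) for x, y in rotated])
```

In the real pipeline the driver uploads these vertices to video memory once, and only the angle (a tiny uniform value) changes per frame; the GPU then applies the matrix to every vertex in parallel.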

Source: (StackOverflow)

What is the use for built-in graphic card on a "gaming" motherboard?

Many motherboards marketed as "gaming" have integrated Intel graphics. Examples are the ASUS B150I PRO GAMING/WIFI/AURA and the Gigabyte GA-Z170N-Gaming 5, but these are just a couple of many. Note the "Gaming" in their respective names.

Now I understand that if you want to build a gaming PC, you would most likely opt for an Nvidia or AMD card, because integrated graphics don't stand a chance against higher-end Nvidia/AMD offerings. Correct me if I'm wrong.

I understand that putting integrated graphics on a motherboard increases its cost, so there must be a reason why manufacturers do this. It looks to me like putting an integrated GPU on a gaming motherboard is the rule rather than the exception.

However, I cannot figure out what this integrated graphics is good for. Could you please explain what it can be used for (I'm guessing the intended use, but any other possible uses too), given that for a gaming PC one is most likely to use a discrete GPU?

If you think any of my assumptions are wrong, please point that out; since the whole thing does not make a lot of sense to me, it is quite likely that my assumptions are wrong somewhere.

Source: (StackOverflow)

How can I detect the model of my graphics card?

I know of many ways to get a rough idea of my graphics card model. Here are two examples (instructions for Windows 7):

Method 1

1) Click start

2) Type dxdiag and press enter

3) Go to the Display tab and check the Name property.

Method 2

1) Click start

2) Right click on Computer and select Properties

3) Click on Device Manager

4) Expand the Display adapters to get a list of video cards


Unfortunately both these methods suffer from problems:

1) This is a very inaccurate measure. For example, if I have an ATI Radeon 4830, both methods will show that I have an ATI Radeon 4800 series i.e. there is no way to distinguish between different models within the 4800 series or any other series for that matter.

2) This is dependent on having the correct driver installed. If I have an incorrect driver installed, there is no way for me to find out what the correct driver should be.


Is there any way for me to determine the exact model of a graphics card without relying on having the correct driver installed? I realise there are ways to do this, such as checking the documentation that comes with the computer or perhaps opening it up, but I am interested in seeing if there is a way of doing this with software.

Edit: Please note the requirements carefully. If the method relies on reading from the driver then it is ineligible.

If there is no program that can do this, is there a manual method? Some kind of website database etc?
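One driver-independent identity does exist: the PCI vendor/device ID pair, which is burned into the card and which Windows records in the adapter's hardware ID (visible in Device Manager's Details tab, or via `wmic path Win32_VideoController get PNPDeviceID`) even when only a generic driver is installed. A sketch of extracting the pair (`parse_pci_ids` is an illustrative helper; the ID string format itself is standard):

```python
import re

def parse_pci_ids(pnp_device_id: str):
    """Extract the (vendor, device) ID pair from a Windows PNP
    device ID string, or return None if no PCI IDs are present."""
    m = re.search(r"VEN_([0-9A-F]{4})&DEV_([0-9A-F]{4})",
                  pnp_device_id.upper())
    return (m.group(1), m.group(2)) if m else None

# 1002 is ATI/AMD's PCI vendor ID; the device ID pins down the exact chip.
# (The example hardware ID below is illustrative.)
print(parse_pci_ids(r"PCI\VEN_1002&DEV_944C&SUBSYS_0502174B&REV_00"))
```

The resulting pair can then be looked up in the public PCI ID repository (the pci.ids database) to get the exact chip name, with no correct driver required, which also covers the "website database" manual method you ask about.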


Source: (StackOverflow)

Converting DisplayPort and/or HDMI to DVI-D?

Newer Radeon video cards come with four ports standard:

  • DVI (x2)
  • HDMI
  • DisplayPort

[photo of the card's rear port layout]

If I want to run three 24" monitors, all of which are DVI only, from this video card -- is it possible to convert either the HDMI or DisplayPort to DVI? If so, how? And which one is easier/cheaper to convert?

I did a little research and it looks like there isn't a simple "dongle" method. I found this DisplayPort to DVI-D Dual Link Adapter but it's $120; almost cheaper to buy a new monitor that supports HDMI or DisplayPort inputs at that point!

There's also an HDMI to DVI-D adapter at Monoprice, but I'm not sure it will work either.

AnandTech seems to imply that you do need the DisplayPort-to-DVI adapter:

The only catch to this specific port layout is that the card still only has enough TMDS transmitters for two ports. So you can use 2x DVI or 1x DVI + HDMI, but not 2x DVI + HDMI. For 3 DVI-derived ports, you will need an active DisplayPort-to-DVI adapter.

Source: (StackOverflow)

How can I enable onboard graphics AND dedicated card simultaneously?

My PC (Compaq Presario) has an onboard Intel 3100, which is pretty lame but would be useful for testing or for a 3rd monitor. I've also got an Nvidia PCIe card installed. I can't seem to find a way to turn both on at once... is it likely this is a BIOS limitation?

Running Windows 7.

The official page suggests I can't do this, but I wondered if there is a way?

Source: (StackOverflow)

ATI CrossFire instability and horizontal bands?

I recently added another ATI 5870 card to my system to experiment with ATI Crossfire (dual GPU) performance increases.

However, I've had a lot of intermittent stability problems, most seriously a set of oscillating horizontal bands which appear during gameplay and become quite severe, to the point that you can barely see the screen to exit the game!

It looks a little like this:

ATI crossfire banding

My system has an overclocked Sandy Bridge CPU that has been rock stable with a single 5870, but adding the second video card and enabling CrossFire seems to be problematic. The cards are both installed fine, fully seated with plenty of space between them, have both PCIe 6-pin power connectors connected, and my 850 W power supply should be ample.

The Catalyst hardware properties look fine:

Primary Adapter     
Graphics Card Manufacturer  Powered by AMD  
Graphics Chipset    ATI Radeon HD 5800 Series   
Device ID   6898    
Vendor  1002    

Subsystem ID    2289    
Subsystem Vendor ID 1787    

Graphics Bus Capability PCI Express 2.0 
Maximum Bus Setting PCI Express 2.0 x8  

BIOS Version 
BIOS Part Number    113-C00801-XXX  
BIOS Date   2010/02/08  

Memory Size 1024 MB 
Memory Type GDDR5   

Core Clock in MHz   875 MHz 
Memory Clock in MHz 1225 MHz    
Total Memory Bandwidth in GByte/s   156.8 GByte/s   

Linked Adapter      
Graphics Card Manufacturer  Powered by AMD  
Graphics Chipset    ATI Radeon HD 5800 Series   
Device ID   6898    
Vendor  1002    

Subsystem ID    2289    
Subsystem Vendor ID 1787    

Graphics Bus Capability PCI Express 2.0 
Maximum Bus Setting PCI Express 2.0 x8  

BIOS Version 
BIOS Part Number    113-C00801-100  
BIOS Date   2010/03/31  

Memory Size 1024 MB 
Memory Type GDDR5   

Core Clock in MHz   850 MHz 
Memory Clock in MHz 1200 MHz    
Total Memory Bandwidth in GByte/s   153.6 GByte/s
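As a sanity check on those readouts: GDDR5 transfers four bits per pin per memory-clock cycle, and the HD 5870 has a 256-bit memory bus, so the bandwidth figures follow directly from the listed memory clocks:

```python
def gddr5_bandwidth_gb_s(memory_clock_mhz: float,
                         bus_width_bits: int = 256) -> float:
    """Peak bandwidth in GB/s: clock * 4 (quad data rate) * bus width."""
    return memory_clock_mhz * 1e6 * 4 * bus_width_bits / 8 / 1e9

print(gddr5_bandwidth_gb_s(1225))  # primary adapter -> matches 156.8 GByte/s
print(gddr5_bandwidth_gb_s(1200))  # linked adapter  -> matches 153.6 GByte/s
```

Both numbers match the Catalyst readout, which confirms the cards are being detected at full width; it also highlights that the two cards are running different clocks (875/1225 vs. 850/1200) and different BIOS revisions, a mismatch that may be worth eliminating while chasing CrossFire instability.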

I've tried the following:

All to no avail!

Source: (StackOverflow)

Why do workstation graphics cards cost far more than equivalent consumer graphics cards?

An Nvidia GeForce GTX 780 Ti costs $700, while a Quadro K6000 costs $4000—yet they use the same underlying GK110 core!

The same can be said for other workstation GPUs from both Nvidia and AMD.

What exactly does this price difference pay for with a workstation GPU? It is my understanding that they have specially-tuned drivers for CAD and other intensive business applications, sacrificing speed in gaming applications for greater accuracy and performance in such business software, but this by itself can't explain the cost difference. They may have more memory, and often of the ECC type, but that still can't explain a nearly sixfold difference.

Would hardware validation explain the difference? I suspect it goes like this: among the GPU chips that test as usable, 30% go into a high-end consumer card, and 68% go into a slightly cheaper consumer card; the other 2% go through even deeper validation, and the few that pass get put into a workstation card. Could this be the case, and is this why they're so expensive?

Source: (StackOverflow)

Why do lots of games have DirectX 9 and 11 options, but NOT DX10?

I don't really know much about DirectX, other than that it is responsible for enabling better graphics options in games: for example, tessellation and ambient occlusion in DX11.

But my question is: why do some games (most recent games I've played, at least) have the option of choosing DX9 (default) or DX11 (with advanced options, and obviously with compatible video cards), but no option for DX10?

Is DX10 a version that never got released? Was it defective? Or what happened with it? Why don't those games show an option to use DX10 alongside DX9 and DX11?

Are there ANY games that show all three options, or do they just 'jump' from DX9 directly to 11? Why?


Source: (StackOverflow)

How can I test my GPU memory/RAM? [duplicate]

I run MemTest86 a lot at work on customers' machines, and it's great for troubleshooting memory issues. My question is, how can I test whether a GPU is starting to go?

I know of programs like 3DMark that push the graphics card to its limits, but what about the video memory? Is it worth testing? Is there a stress tool actually able to catch issues in the video card's memory, perhaps using CUDA/OpenCL?
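Tools in this space (for example Stanford's MemtestG80 and MemtestCL, which run via CUDA/OpenCL) all work on the same principle as MemTest86: fill the memory with known bit patterns, read it back, and flag mismatches. A minimal sketch of that principle against an ordinary CPU-side buffer (a real VRAM test would allocate the buffer through CUDA/OpenCL instead):

```python
def pattern_test(buf: bytearray,
                 patterns=(0x00, 0xFF, 0xAA, 0x55)) -> list:
    """Write each bit pattern across the whole buffer, read it back,
    and return (offset, expected, actual) for every mismatch."""
    errors = []
    for p in patterns:
        for i in range(len(buf)):
            buf[i] = p              # write phase
        for i, value in enumerate(buf):
            if value != p:          # read-back/verify phase
                errors.append((i, p, value))
    return errors

print(pattern_test(bytearray(4096)))  # healthy memory -> []
```

The 0xAA/0x55 alternating patterns are chosen to toggle adjacent bit lines, which tends to expose marginal cells that a uniform fill would miss.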

Source: (StackOverflow)

What exactly is VGA, and what is the difference between it and a video card?

Operating-system development tutorials describe reaching screen data by writing directly to VGA, EGA, or Super VGA memory, but what I don't get is: what is the real difference between writing to a fixed address for display and writing directly to a video card, either onboard or removable? I just want basic clarification of my confusion on this issue.

And since it's not such a simple case, with variables in cards, connection interfaces, buses, architectures, systems-on-a-chip, embedded systems, etc., I find it hard to understand the idea behind this 100%. Would the fixed addresses differ from a high-end GPU to a low-end onboard one? Why or why not?

It is one of my goals in programming to host a kernel and make an operating system, a far-fetched dream indeed. Failing to understand the terminology not only hinders me in some areas, but makes me seem foolish on the subject of hardware.

EXTRA: Some of the current answers speak of using the processor's maximum addressable memory, specifically in 16-bit mode. The problem is some of these other arising issues:

1. What about the card's own memory? That would not need system RAM for screen data itself.

2. What about higher-bit modes? And can't you bypass the BIOS in real mode (x86) and still address memory through AL?

3. How would the concept of writing to a fixed address remain unchanged on a GPU with multitudes of registers and performance at or above that of the actual microprocessor?
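To make the "fixed address" idea concrete: in the classic 80x25 colour text mode (mode 0x03), the framebuffer is a flat array of (character, attribute) byte pairs starting at physical address 0xB8000, and putting a character on screen is nothing but address arithmetic. A sketch of that arithmetic (in Python for illustration; a real-mode kernel would do the same poke in assembly or C):

```python
VGA_TEXT_BASE = 0xB8000   # start of colour text-mode memory
COLS = 80                 # cells per row in mode 0x03

def cell_address(row: int, col: int) -> int:
    """Physical address of a cell's character byte (2 bytes per cell)."""
    return VGA_TEXT_BASE + (row * COLS + col) * 2

def cell_bytes(char: str, fg: int = 0x07, bg: int = 0x00) -> bytes:
    """The two bytes a kernel pokes into a cell: the ASCII code, then
    the attribute byte (low nibble foreground, high nibble background)."""
    return bytes([ord(char), (bg << 4) | fg])

print(hex(cell_address(0, 0)))   # top-left cell
print(hex(cell_address(1, 0)))   # one 160-byte row down
print(cell_bytes("A").hex())     # white-on-black 'A'
```

The fixed address depends on the mode, not on how fancy the card is: monochrome text lives at 0xB0000 and VGA graphics mode 0x13 at 0xA0000, which is why tutorials set the mode first. A modern GPU still decodes these legacy ranges for compatibility while its real framebuffer sits behind a PCI BAR elsewhere in the address space.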

Source: (StackOverflow)

Broken mouse cursor on main monitor, Windows 7 64 Bit, ATI Radeon HD 7870

UPDATE: Please check out the new answer I've posted to this problem. It might be that a solution to this frustrating problem exists now. Scroll down to see it.

Quite a while ago my graphics card died and I had to buy a new one. I decided on an ASUS Radeon HD 7870.

While I love the power of the graphics card and have no problems while playing games, I'm experiencing an annoying problem when I'm just in Windows with my dual-monitor setup. Sometimes my mouse cursor gets corrupted on my main monitor and simply looks like this:


This seems to happen in random situations, and also sometimes when I move the mouse from one monitor to the other. I can always use a "workaround" to "fix" the problem: if I just move the mouse from one monitor to the other often enough, it becomes normal again at some point. But I don't want to do this all the time, so I'm searching for a solution.

I did a lot of Google research (try typing "ATI brok" into Google and it will already suggest a lot of searches for a broken cursor), but the results were mostly not helpful at all. Often they are old (from 2009 and before) and deal with mouse problems while playing games, which is not my problem. I'm missing up-to-date results from someone who maybe has the same graphics card and can help me.

What I've read a few times is that deactivating Windows Aero should "fix" the problem, but to be honest I enjoy Windows Aero a lot and would prefer something different (I don't want to sound arrogant). Likewise, some people say it would help to activate mouse trails, but the look & feel (like lagging) then bothers me even more. I also tried to disallow themes from changing the mouse cursor, but this didn't change anything.

Here, for example, is a big thread where people are talking about a similar (same?) problem. Some also state that deactivating Catalyst AI would solve it for them, but I can't find this option in my up-to-date Catalyst Control Center anymore (maybe it's possible in a file somewhere in the CCC directory?).

Well, what's left to say is that I always keep my system up to date and have often installed new graphics card drivers (sometimes even trying beta versions). But the problem never disappeared.

Can someone here help me, have some ideas, or has experienced the same? I would be glad to hear from you! I'm also curious whether this could mean my graphics card is broken (although somehow that's hard for me to imagine).

Thanks a lot for every thought you're sharing with me.

Edit: Today it has happened again with the new ATI drivers.


Source: (StackOverflow)