HDMI interview questions

Top HDMI frequently asked interview questions

Is Displayport preferable to DVI for monitor connections? [closed]

I'm currently running a single Dell 24" (2408WFP) screen, but am considering adding a second. The problem I've got is that I'm currently using the DVI connector to the on board graphics, so will need to purchase a new graphics card.

The 2408WFP also has HDMI and DisplayPort inputs, so I was wondering if DisplayPort is worth considering at present over DVI? On the same subject, does anyone actually manufacture reasonably priced DisplayPort cards? All I've managed to find seem to be very expensive workstation cards aimed at financial/design/simulation markets.

Source: (StackOverflow)

Does HDMI cable "quality" actually affect transmission?

I really don't want to pay a ridiculous price for a "name brand" HDMI cable if it doesn't really do anything for me. I'm just curious: now that most transmission is digital (packetized), is there such a thing as a "quality" cable?

I suspect that if the cable works at all, I'm safe saying I have a quality connection. I just want to double check. Some of these reviewers complain that generic cables "create noise, lack bandwidth, can't handle X, etc". I'm skeptical of these reviews.

If the logic for HDMI cables and quality can be applied to cables in general, please elaborate on that as well.
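
The "digital cliff" argument behind this can be made concrete with a little arithmetic. Below is a minimal sketch, assuming the standard CEA-861 timing for 1080p60 and HDMI's 8b/10b TMDS encoding (both part of the HDMI specification); it only computes the raw serial rates a cable must carry:

```python
# HDMI carries video as TMDS: each 8-bit color component is encoded
# into 10 bits and sent serially on one of three data channels.
PIXEL_CLOCK_1080P60 = 148.5e6   # Hz, standard CEA-861 timing for 1920x1080@60
BITS_PER_COMPONENT = 8
TMDS_ENCODING_RATIO = 10 / 8    # 8b/10b encoding overhead

# Serial bit rate on each TMDS channel, and across all three (R, G, B):
bit_rate_per_channel = PIXEL_CLOCK_1080P60 * BITS_PER_COMPONENT * TMDS_ENCODING_RATIO
total_bit_rate = bit_rate_per_channel * 3

print(f"per channel: {bit_rate_per_channel / 1e9:.3f} Gbit/s")  # 1.485 Gbit/s
print(f"total video: {total_bit_rate / 1e9:.3f} Gbit/s")        # 4.455 Gbit/s
```

Because the receiver either recovers those bits or it doesn't, a cable that works at all delivers a bit-perfect image; a marginal cable shows up as sparkles or dropouts, not as gradual "noise" or "lack of bandwidth."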

Source: (StackOverflow)

Send only video out with HDMI cable

My laptop has an HDMI out port and my new TV has HDMI in ports. Hooking them up so that I can use the laptop to play movies with sound on the TV works fine. However, I haven't been able to find a way to send out just video without sound. Sometimes I want to keep the audio coming through my laptop speakers, so I can put headphones on without everyone around me telling me to turn the sound down. Is there a way to do that?

Source: (StackOverflow)

What kind of cable looks like HDMI but has only one slanted edge?

What kind of cable/connector is this? It looks like an HDMI connector, with very similar pins, but it has a slanted edge on only one side and the other side is square.

Source: (StackOverflow)

How do I connect an Apple Thunderbolt display to a PC?

I have this Apple Thunderbolt display.

It works fine with my MacBook Pro. I also want to connect it to my Windows laptop via HDMI. I bought this converter, which converts the HDMI signal to DisplayPort. Everything plugs in just fine, but the display just does not turn on. I tried a variety of resolution settings.

Note: this display does not have a power button. It just turns on when I plug it into a Mac. There is no activity when I plug it into the DisplayPort converter.

Source: (StackOverflow)

Two identical external monitors, one through HDMI, the other through VGA. Text on VGA looks blurry

I've tried a lot of things but I can't seem to get it working correctly. Below is my setup.

Laptop is a Dell Latitude E6535, which has Video chip/card NVidia NVS 5200M.

I have two external AOC i2367Fh 23-inch IPS LED monitors. One is connected to the laptop through HDMI and the other through VGA.

Problem is, the text and overall image look perfect and sharp on whichever of the two I connect through HDMI, while the VGA one just does not seem right: the text is not crisp, and is just blurry enough that you can't quite tell it's blurry, but you feel there's something wrong with it. I asked my wife to look at both monitors without telling her there was an issue, and she said the same thing.

Now, I have made sure it is not the cable because, like I said, if I connect the monitor that looks odd through HDMI instead of VGA (I switch which monitor uses which cable), then it looks fine and the other looks bad. I've also tried two VGA cables.

When I go to the NVIDIA settings, only one of the monitors can use the NVS 5200M chip, while the other one uses the Intel HD Graphics 4000 adapter. I think that depends on which one I make my main display, but making the VGA monitor my main display (even though it would then use the NVS 5200M) does not fix the issue; it still looks bad.

The resolution that I'm using is the native one for the monitors, 1920 x 1080.

I already tried tuning ClearType, but that did not fix it either.

Any ideas are welcome. Thanks


Thank you everyone for the responses/suggestions. It seems in order to get it working I will need either a docking station or an adapter. I'm considering the following:


The USB to HDMI adapter did NOT work for me at all. I just went with the docking station and DP to HDMI adapter and it worked flawlessly. Crisp image and text on both external monitors.

Source: (StackOverflow)

How can I switch an application to a different playback device on Windows 10

Googling for solutions to this problem seems to lead to third-party solutions like this, sometimes with scripting like this. I'm no stranger to either, but it seems crazy to me that there isn't a better solution.

I'm on Windows 10. I use HDMI audio to my monitor from my AMD video card, and I use a set of headphones plugged into the back of my motherboard. I use both often, but right now I have to tediously switch the default playback device to switch between them (sometimes, for reasons I don't yet know how to replicate, I even need to fully disable the device currently making sounds).

This is what I see in the volume mixer after starting my computer up with AMD HDMI as the default, and opening Chrome to watch a video:

AMD HDMI is playing System Sounds and Chrome's audio.

When I switch to the Speakers device

About to select Speakers device.

I see that it is not assigned any applications, and I see no way to reassign applications:

Speakers device is lonely.

I can usually hammer all of the applications over to the Speakers device by switching the default playback device, and usually vice versa. But sometimes that only switches some applications, and sometimes it doesn't do anything. But that's beside the point...

How can I assign an application to a specific audio playback device? The Volume Mixer looks like the right place, but it doesn't seem to do it unless I'm missing something.

Source: (StackOverflow)

How do I fix monitor detection in Windows 7?

I'm using Windows 7 + Windows Media Center for my HTPC. It works great except for one annoying issue. Whenever I turn off my TV while listening to music, the music stops for a second or so while Windows 7 tries to figure out what monitor is attached. After that it settles on a default 800x640. While not a big deal, it is annoying, as I don't want to have the TV on while playing music.

Is there any way to fix the monitor / disable monitor auto-detection on Windows 7 so it does not start recalibrating everything when I turn off my TV?

Source: (StackOverflow)

Is there any DVI to HDMI converter with audio? [closed]

I have a laptop with VGA and DVI outputs, but no HDMI. I want to connect it to an HDMI TV.

Is there any DVI→HDMI converter that also grabs audio? Obviously, the DVI port does not carry an audio signal, so the converter must grab the audio from somewhere else (either USB or the headphone jack).

This is what I want: {DVI + audio from laptop} → {HDMI to the TV}


  • It is a laptop, I can't change the video card. (and no, I don't want to buy another laptop now)
  • Yes, I could use the VGA input on the TV. In fact, I'm already using it, but since the signal is analog, the image quality is not as good.

Source: (StackOverflow)

Green flickering pixels that move with black images

Strange question... Occasionally, on my LCD screen, pixels that should be black flicker rapidly and constantly between black and green, about 4 flickers a second.

The crazy part is, unlike dead/stuck pixels, they are relative to content on the screen and move with it.

For example, I might be looking at a web page with a picture that has lots of black. There might be a couple of green flashing pixels in that black that shouldn't be there. I scroll the page, and the green flickering pixels move with the image. It seems that every physical pixel is fine, but somehow something interprets part of the image in a way that causes flickering green...

It's not just in a web browser. My first thought was to blame a trolling blogger cunningly uploading an animated GIF that simulates a failing pixel... but it happens in a wide range of applications. It seems to occur randomly, other than that it only occurs in areas of pure black, and it's always pure 100% green.

It happens rarely enough that it's not a big deal, but it's such a strange problem it bugs me. I can't find any info on anything like this. I'm not even sure if it's hardware or software.

Any ideas? (Windows 7 laptop connected to the LCD by a DVI-to-HDMI cable)

Source: (StackOverflow)

Why is HDMI->DVI image sharper than VGA?

I have a Dell U2312HM monitor connected to a Dell Latitude E7440 laptop. When I connect them via laptop -> HDMI cable -> HDMI-DVI adaptor -> monitor (the monitor doesn't have an HDMI socket), the image is much sharper than with laptop -> miniDisplayPort-VGA adaptor -> VGA cable -> monitor. The difference is difficult to capture with a camera, but see my attempt at it below. I tried playing with the brightness, contrast and sharpness settings, but I can't get the same image quality. The resolution is 1920x1080; I'm using Ubuntu 14.04.





Why is the quality different? Is it intrinsic to these standards or should I suspect a faulty VGA cable or mDP-VGA adaptor?
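
A big part of the answer is that over VGA the monitor's ADC has to re-sample an analog waveform at exactly the source's pixel clock and phase. A minimal sketch of the numbers involved, assuming the standard CEA-861 timing for 1920x1080@60 (a 2200x1125 total raster including blanking):

```python
# Standard CEA-861 timing for 1920x1080@60: the total raster, including
# horizontal and vertical blanking, is 2200 x 1125, scanned 60 times per second.
H_TOTAL, V_TOTAL, REFRESH = 2200, 1125, 60

pixel_clock = H_TOTAL * V_TOTAL * REFRESH   # Hz
pixel_period_ns = 1e9 / pixel_clock         # time window for one pixel

print(f"pixel clock:  {pixel_clock / 1e6:.1f} MHz")   # 148.5 MHz
print(f"pixel period: {pixel_period_ns:.2f} ns")      # 6.73 ns
```

Over VGA, the monitor must sample each of those ~6.7 ns windows at exactly the right instant; any phase error or jitter blends adjacent pixels, which gives exactly the soft, hard-to-pin-down blur described here. Over HDMI/DVI the pixel values arrive digitally and no resampling happens, so the image is pixel-perfect by construction.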

Source: (StackOverflow)

I don't have a monitor; how to play copy-protected content?

Let me preface this with the fact that I am totally blind, so I do not have a monitor hooked up to my computer.

I have a CableCARD tuner that I would like to use to record and play TV shows. It appears that I can record anything I want, but I cannot play it back if it has the copy-protection flag set, since I don't have a monitor hooked up to my computer.

What can I do that will allow me to play back copy-protected content?

Source: (StackOverflow)

Connecting a 2560x1440 display to a laptop?

Having read Jeff Atwood's blog post on Korean 27" IPS LCDs, I've been wondering to what extent these are useful in a notebook + large display situation.

I own a Lenovo ThinkPad Edge E320 with 2nd-gen integrated Intel graphics. According to the spec from Intel, this should support HDMI version 1.4 and, using DisplayPort, resolutions up to 2560x1600. HDMI 1.4 supports resolutions up to 4096x2160; however, according to c't (German), the HDMI interface used with Intel chips only supports 1920x1200. The same goes for the DVI output: dual-link DVI-D, apparently, is not supported by Intel.

It would appear that my laptop cannot digitally drive this kind of resolution. Now what about other laptops?

According to the article in c't above, AMD's integrated graphics chips have the same limitation as Intel's.

NVIDIA graphics cards, apparently, only offer resolutions up to 1900x1200 over HDMI out of the box, but it's possible, at least when using Linux, to trick the driver into enabling higher resolutions. Is this still true? What's the situation on Windows and OS X?

I found no information on whether discrete AMD chips support ultra-high resolutions over HDMI.

Owners of laptops with (Mini) DisplayPort / Thunderbolt won't have any issues with displays this large, but if you're planning to go for a display with dual-link DVI-D input only (like the Korean ones), you're going to need an adapter, which will set you back something like €70-€100 (since the protocols are incompatible).

The big question mark in this equation is VGA: a lot of laptops have it, and I see no reason to think this resolution is not supported by the hardware (an oft-quoted figure is 2048x1536@75Hz, so 2560x1440@60Hz should be possible, right?), but are the drivers likely to cause problems?

Perhaps more critically, you'd need a VGA to dual-link DVI-D adapter that converts analog to digital signals. Do these exist? How good are they? How expensive are they? Is there a performance penalty involved?

Please correct me if I'm wrong on any points.

In summary, what are the requirements on a laptop to drive an external LCD at 2560x1440, in particular one that supports dual-link DVI-D only, and what tools and adapters can be used to lower the bar?
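
The bandwidth arithmetic behind "dual-link DVI-D only" is worth spelling out. A minimal sketch, assuming single-link DVI's 165 MHz TMDS pixel-clock ceiling (from the DVI 1.0 spec); the blanking-free figure computed below is a lower bound, since real timings (CVT, CVT-RB) add a few percent on top:

```python
SINGLE_LINK_DVI_MAX = 165e6   # Hz, maximum TMDS pixel clock on one DVI link

def min_pixel_clock(width, height, refresh_hz):
    """Lower bound on the required pixel clock: active pixels only,
    no blanking. Real timings need slightly more."""
    return width * height * refresh_hz

for mode in [(1920, 1200, 60), (2560, 1440, 60), (2560, 1600, 60)]:
    clk = min_pixel_clock(*mode)
    links = 1 if clk <= SINGLE_LINK_DVI_MAX else 2
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz: >= {clk / 1e6:.1f} MHz -> "
          f"{'single' if links == 1 else 'dual'}-link DVI")
```

Even with zero blanking, 2560x1440@60 needs over 221 MHz of pixel clock, which is why these panels require dual-link DVI (or DisplayPort, or a later HDMI revision with a higher TMDS clock), and why no passive single-link adapter can ever work.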

Source: (StackOverflow)

How to stop windows resizing when the monitor display channel is turned off / switched to different source

I have a new AMD Radeon 6870 adapter, with its drivers set to 1080p 60Hz, hooked up to a 2008 47" high-end Samsung HDMI-based TV.

However, when the TV is switched to a different HDMI input, Windows somehow decides (by the time I come back into it) to resize all the open apps to a lower resolution, including some of the side-docked hidden pop-outs. When it resizes them, it sticks the pop-outs in the middle of the screen and puts all the resized application windows in the top-left corner, all of them stacked on top of each other and resized to the smaller resolution.

The things that seem to be ok after returning are the icons on the desktop, the taskbar, and the sidebar.

Does anyone know 1) how this happens, 2) why it happens, and 3) how to stop it from resizing the applications and some of the docked pop-outs? (They are not really resized after returning; they are just stuck in the middle of the screen, approximately where the right or bottom sidebar would be if the screen were resized to that lower resolution.)

My hypothesis is that upon losing the HDMI signal, Windows is told by something (the driver, or Windows itself) what the resolution should be while no signal is present (noting that HDMI signals and handshakes are two-way on HDMI devices: if the signal is lost, or the TV is switched to another device, the display adapter must figure that out and either tell Windows or decide on its own to change the display size).

Any and all help is most appreciated. I asked AMD/ATI, but they said they don't know why or how this is happening. I was hoping that maybe this is THE place that the super users truly go to: those that develop display adapter drivers, or that dive deeply into these areas of Windows. If there are better or competing sites, please advise, noting that I have already written to AMD/ATI.

Response / Additions 4/7/2011

It is really nice to get your reply, Shinrai. (BTW, is it proper etiquette on these forums to have a discussion?) Yet there is only one issue: I am using a single display in this case, so Windows isn't moving application windows to another desktop. Windows (or something) decides to shrink the desktop it currently has and resize all windows to the maximum size of that desktop. I would be glad if Windows would just keep the current size of the one desktop that is in operation.

I also know that this does NOT happen on monitors connected with DVI. I have had one- and two-monitor setups there, and it doesn't resize those screens at all when disconnecting monitors, turning them off, whatever... they stay solid, everything in place, to such an extent that if you forget the other monitor is off, you will have trouble finding some windows without using one of the window-control utilities.

So if I could just get the HDMI handling by Windows (or the display driver) to behave like DVI, life would be golden (for this aspect, anyway). That leaves two questions: 1) which is doing this anyway, the display driver or Windows? And 2) where is that other resolution (1024x768) coming from? It's not the smallest and it's not the largest.

** Found others with the same problem in this thread:

Source: (StackOverflow)

Converting DisplayPort and/or HDMI to DVI-D?

Newer Radeon video cards come with four ports standard:

  • DVI (x2)
  • HDMI
  • DisplayPort


If I want to run three 24" monitors, all of which are DVI only, from this video card -- is it possible to convert either the HDMI or DisplayPort to DVI? If so, how? And which one is easier/cheaper to convert?

I did a little research and it looks like there isn't a simple "dongle" method. I found this DisplayPort to DVI-D Dual Link Adapter, but it's $120; at that point it's almost cheaper to buy a new monitor that supports HDMI or DisplayPort inputs!

There's also an HDMI to DVI-D adapter at Monoprice, but I'm not sure it will work either.

AnandTech seems to imply that you do need the DisplayPort-to-DVI adapter:

The only catch to this specific port layout is that the card still only has enough TMDS transmitters for two ports. So you can use 2x DVI or 1x DVI + HDMI, but not 2x DVI + HDMI. For 3 DVI-derived ports, you will need an active DisplayPort-to-DVI adapter.
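
That constraint can be modeled directly: DVI and HDMI outputs each consume one of the card's TMDS transmitters, DisplayPort uses its own signaling, and an *active* DP-to-DVI adapter regenerates TMDS itself. A hypothetical sketch (the port names are illustrative; the two-transmitter budget comes from the AnandTech quote above):

```python
TMDS_TRANSMITTERS = 2  # the card's budget, per the AnandTech quote

# Which outputs draw on the card's TMDS transmitters. DisplayPort does not,
# and an active DP->DVI adapter converts the signal itself.
NEEDS_TMDS = {
    "DVI-1": True,
    "DVI-2": True,
    "HDMI": True,
    "DP (active DVI adapter)": False,
}

def drivable(ports):
    """True if this combination of outputs fits the TMDS budget."""
    return sum(NEEDS_TMDS[p] for p in ports) <= TMDS_TRANSMITTERS

print(drivable(["DVI-1", "DVI-2"]))                             # True
print(drivable(["DVI-1", "DVI-2", "HDMI"]))                     # False
print(drivable(["DVI-1", "DVI-2", "DP (active DVI adapter)"]))  # True
```

So for three DVI monitors on this card, the third one has to hang off the DisplayPort via an active adapter; converting the HDMI port instead would still draw from the same two-transmitter pool.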

Source: (StackOverflow)