Frequently asked questions about display resolution
I know the original VGA standard was meant to output 640x480, and that later standards (SVGA, XGA, etc.) were developed to output higher resolutions over the same VGA connector. But I was wondering: is there a specific limit to the resolution that the VGA connector can carry?
Furthermore, are DVI and HDMI limited in resolution, and if so, how?
I know there's a previous question about this, but it doesn't have any real answers despite having been viewed 12,400 times, and it has since been closed. With that in mind...
Why in the world is 1366x768 resolution a real thing? It has an aspect ratio of 683:384, which is the weirdest thing I've ever heard of while living in a 16:9 world.
All screens and resolutions I've been familiar with have been 16:9 aspect ratio. My screen, 1920x1080, is 16:9. The 720p that I'm familiar with is 1280x720, also 16:9. 4K that I'm familiar with, 3840x2160, is also 16:9. Yet 1366x768 is 683:384, a seemingly wild break from the standard.
I know there are plenty of other resolutions all over the place, but 1366x768 seems to dominate most of the mid priced laptop world and also seems unique to the laptop world. Why don't laptops use 1280x720 or something else as a standard?
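The odd 683:384 ratio falls straight out of the arithmetic. A quick sketch that reduces a resolution to its simplest aspect ratio:

```python
from math import gcd

def aspect_ratio(w, h):
    """Reduce a resolution to its simplest whole-number aspect ratio."""
    g = gcd(w, h)
    return (w // g, h // g)

print(aspect_ratio(1920, 1080))  # (16, 9)
print(aspect_ratio(1280, 720))   # (16, 9)
print(aspect_ratio(1366, 768))   # (683, 384)
```

Note that 1366/768 ≈ 1.7786 while 16/9 ≈ 1.7778, so 1366x768 is very nearly, but not exactly, 16:9.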
Nowadays it seems that Full HD isn’t enough anymore and the terms “4K,” “QHD” and “UHD” are thrown around interchangeably.
At the same time, there is not just one “4K” resolution in the catalogs. I have seen resolutions such as 2560 x 1600, 3440 x 1440, 3840 x 2160, and 4096 x 2160 being advertised as 4K. But they can’t all be 4K, right?
Is it that 4K is not precisely defined, did the technology outgrow the naming conventions, or do advertisers simply refuse to burden customers with correct information?
Also, on a sidenote, if 4K means 4xFullHD (2 x 1920 by 2 x 1080 => 3840 x 2160), shouldn't FullHD be called 2K?
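A quick comparison of the pixel counts of the resolutions mentioned (the marketing names in parentheses are the ones commonly used for those modes, not vendor-guaranteed labels):

```python
candidates = {
    "2560 x 1600 (WQXGA)":  (2560, 1600),
    "3440 x 1440 (UWQHD)":  (3440, 1440),
    "3840 x 2160 (UHD)":    (3840, 2160),
    "4096 x 2160 (DCI 4K)": (4096, 2160),
}
full_hd = 1920 * 1080

for name, (w, h) in candidates.items():
    print(f"{name}: {w * h / full_hd:.2f}x the pixels of Full HD")
```

Only 3840x2160 is exactly four times Full HD; 4096x2160 is the digital-cinema (DCI) 4K standard, and the other two are not 4K by either definition.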
I'm running the enterprise evaluation (Build 9200) of Win8 and VirtualBox 4.2.4 r81684 and my actual display is 1920x1200. When I use the host-F key to enter full screen mode, the best I can configure in Windows is 1600x1200 which is fine, but I'd rather get the whole screen in play rather than see a letter boxed OS.
First I tried running Install Guest Additions, but Windows didn't run any installer that I could see, as described in the VirtualBox documentation.
I have allocated the maximum amount of RAM (256MB) to the Display Video Memory and don't see any way to load drivers after searching the VirtualBox documentation. I can enable or disable 3D and 2D Acceleration, and these settings do not affect the outcome. I've set the monitor count to 1 and have not enabled the Remote Display server. Since special things happen in each corner, Fitts's law makes it a pain to hit the corner targets to explore the UI, whether I'm running the OS in a window or full screen.
Am I missing a setting somewhere in Windows or VirtualBox to fill in my true display resolution since it's not sensing it correctly? I'm open to hacking a driver file or other steps if needed to get the correct resolution set.
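One avenue worth trying, assuming Guest Additions are actually installed and substituting your real VM name for the "Win8" placeholder, is to register the missing mode from the host side:

```shell
# Register a custom 1920x1200 video mode for the guest
# ("Win8" is a placeholder -- use your VM's name)
VBoxManage setextradata "Win8" CustomVideoMode1 1920x1200x32

# Or, with the VM running, send the guest a resize hint directly
VBoxManage controlvm "Win8" setvideomodehint 1920 1200 32
```

After the `setextradata` variant, the new mode should appear in the guest's resolution list on the next boot; the `setvideomodehint` variant only works once Guest Additions are functioning in the guest.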
I noticed this happen right around the same time I noticed a stuck red pixel in the middle-lower section of the panel. I don't think the two things are related but they may be since it was about the same time. To be more specific, it seems to be a stuck red sub-pixel: when displaying white, it isn't visible.
Windows used to know the name of the monitor, and displayed it in the "screen resolution" screen, but now it calls it a "Generic Non-PnP Monitor".
What is strange is that it now gets detected with a strange resolution of 1919x1200. The monitor hardware itself appears to be mysteriously reporting that one of its vertical scanlines is gone. I am very very glad indeed that it has chosen to treat whatever failure it's encountered in this graceful manner rather than simply stop working, but I am ever curious about just what it is that actually happened.
I don't know how I can test this without physically counting pixels, and there are just too many lines for that. Since Windows (and almost every game I've played on this computer since) recognizes the screen as having a horizontal resolution of 1919, I feel pretty certain that the mode information being sent by the monitor really is 1919x1200.
Has this ever happened to anybody else? What could have gone wrong in the hardware to cause this?
Update: I've been trying to install Ubuntu 12.04 and I had this monitor connected. The liveDVD image (on USB) kept hanging during loading, and it wasn't showing any errors that meant anything to me.
Then I gave up on Ubuntu and tried loading up Linux Mint 13 64-bit. This time, before it hung, it displayed some helpful info that said something to the effect of "EDID invalid", which makes a lot of sense. So I plugged in a different display and it loaded up just fine.
I guess that means it's an EDID problem.
The question is still not solved! How can I fix the EDID? Is this stored in some ROM chip on the display, in which case I'm screwed? It is impossible to install Linux when this monitor is plugged in with DVI and I have tried for a long time to get it working above 640x480 (where the Nvidia control panel window does not even fit the screen) with no success. I will be relegated to using the monitor only with VGA it seems.
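On Linux there is a documented escape hatch: you can tell the kernel to ignore the monitor's (corrupt) EDID and load a known-good one from a file instead. A sketch, assuming the connector is `DVI-I-1` (yours may differ; check `/sys/class/drm/`) and that you have a good EDID dump, e.g. captured from the same model over VGA:

```shell
# Inspect what the monitor is actually reporting
# (edid-decode may need to be installed separately)
cat /sys/class/drm/card0-DVI-I-1/edid | edid-decode

# Place a known-good EDID binary where the kernel can find it
sudo mkdir -p /lib/firmware/edid
sudo cp good-monitor.bin /lib/firmware/edid/

# Then add a kernel boot parameter (older kernels use
# drm_kms_helper.edid_firmware instead of drm.edid_firmware):
#   drm.edid_firmware=DVI-I-1:edid/good-monitor.bin
```

If the EDID really is stored in a failing EEPROM on the display, this works around it per-machine rather than fixing the monitor itself.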
I am using Microsoft's Live Mesh program to remotely access a PC running Windows 7, running on 2 screens: one with 1920x1080 and the other 1920x1200 resolution.
I am accessing them via an old laptop with 1024x1078 resolution. The result is such tiny icons/commands that it is difficult to change the screen resolution from the 2 high-res displays to the single low-res display.
It would be great if there was a command line way of doing this. Or perhaps there is way through live mesh to do it. Any ideas?
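There are small third-party tools that switch display modes from the command line; I haven't verified either against Live Mesh specifically, so treat this as a sketch (run these on the remote PC, e.g. via a startup script or scheduled task):

```shell
:: QRes (third-party utility) switches the primary display mode
QRes.exe /x:1024 /y:768

:: NirCmd (third-party utility) can do the same, with bit depth
nircmd.exe setdisplay 1024 768 32
```

Both are single-executable tools, so a pair of batch files ("low-res" before connecting, "restore" after) would avoid the tiny-icon fumbling entirely.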
I have been shopping around for monitors, and it seems that everything from 19" to 24" has a max resolution of 1920x1080. Is there some technical reason for this, or is that just how it happens to be?
I have a 23" monitor at this resolution, and it seems to me that if I had anything larger, this resolution would feel a little low.
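That intuition can be quantified as pixel density (pixels per inch), which drops as the panel grows at a fixed resolution:

```python
from math import hypot

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return hypot(w_px, h_px) / diagonal_in

for size in (19, 21.5, 23, 24, 27):
    print(f'{size}" at 1920x1080: {ppi(1920, 1080, size):.0f} PPI')
```

At 23" the density is about 96 PPI, close to the traditional Windows assumption; at 27" it falls to roughly 82 PPI, which is where 1920x1080 starts to look coarse.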
I'm trying to run Windows 8 in VirtualBox. My laptop's display is exactly 1366x768. Windows 8 disables some of its features if the resolution is less than 1366x768, so I need to run the guest OS fullscreen.
The problem is, VirtualBox refuses to run the guest at 1366x768. When VirtualBox is "fullscreen", the guest is only 1360x768 -- six pixels too narrow. So there's a three-pixel black bar at the left and right sides of the display.
This user had the same problem, but the accepted answer is "install the Guest Additions", which I've already done; that got me to 1360, but not to 1366.
According to the VirtualBox ticket tracker, there used to be a bug where the guest's screen width would be rounded down to the nearest multiple of 8, but they claim to have fixed the bug in version 3.2.12. I'm using version 4.1.18 and seeing the same problem they claim to have fixed, so either they broke it again, they were wrong about ever having fixed it, or my problem is something else entirely.
This answer suggested giving the VM 128MB of video memory, and claimed no problems getting 1366x768 afterward. When I created the VM, its display memory was already defaulted to 128 MB. I tried increasing it to 256MB, but with no effect: the guest is still six pixels too narrow.
My host OS is Windows 7 64-bit, and I'm running VirtualBox 4.1.18.
How can I get VirtualBox to run my guest OS fullscreen at my display's native resolution of 1366x768?
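For what it's worth, the numbers line up exactly with the old rounding bug: if a width is snapped down to the nearest multiple of 8, 1366 lands on precisely the 1360 being observed.

```python
# Widths snapped down to a multiple of 8 -- 1366 is the odd one out
for width in (1280, 1360, 1366, 1920):
    print(width, "->", (width // 8) * 8)
# 1366 // 8 * 8 == 1360, i.e. six pixels short, matching the symptom
```

So whatever the changelog says, the behavior is indistinguishable from the multiple-of-8 bug still being present.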
I'm wondering if it's possible to resize the desktop of an RDP session on the fly.
I realize you can do it before you connect, but I'm looking to resize it on the fly similar to how vmware works. If I have it in a window that's 800x600 I'd like the remote desktop to be resized to 800x600... but if I maximize my local window or go full screen, I'd like the remote desktop to assume the resolution of the local PC, or the window dimensions.
VMware does this exactly how I want with an option called "use host settings for monitors": as I scale the window, the desktop on the guest OS scales with it. Is there a way to do this with an RDP session?
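As a partial workaround, the classic client can at least be told the size at connect time (it cannot renegotiate the remote resolution mid-session); `remote-pc` below is a placeholder hostname:

```shell
:: Connect at a specific remote desktop size
mstsc /v:remote-pc /w:800 /h:600

:: Or connect full-screen at the local display's resolution
mstsc /v:remote-pc /f
```

This gets the right size for a given window, but a true on-the-fly resize like VMware's requires disconnecting and reconnecting with new dimensions.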
I've got a 22-inch, 1680 x 1050 monitor. Brand X2gen, model MW22U. Connected by DVI cable to my NVIDIA GeForce 9600 GT graphics card.
Several days ago my monitor stopped working. It displayed nothing after the boot process.
I loaded Safe Mode which forced it to 1024 x 768, which did display. I installed updated drivers for my graphics card (NVIDIA GeForce ION Driver 185.85) which forced it to 1024 x 768 when I then returned to Normal Mode.
But I cannot set it to a resolution higher than 1024 x 768!
I then tested it with another monitor--a 20-inch, 1680 x 1050 Dell--which was detected successfully and did run at its native resolution. So, the problem is the monitor, not the computer or video card.
I switched back to the problem 22-inch monitor, which was still stuck at 1024 x 768. I noticed that the monitor was listed as "Generic Non-PnP"; I think it was "Generic PnP" before. I changed the monitor driver to "Generic PnP", but that didn't help.
I've installed and reinstalled NVIDIA GeForce ION Driver 185.85, but that doesn't fix it.
I've tried to add "Custom Resolutions" in the NVIDIA Control Panel. That errors with: "Custom mode test failed."
How might I be able to force Windows to use this monitor's native resolution?
I just started experimenting with Hyper-V on Windows 8 and installed Ubuntu on it. However, when I launch it, my screen resolution is small.
How can I scale Ubuntu to the resolution of my screen?
My monitor is 1920 x 1200 and the computer is running Windows 7.
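One commonly cited fix is to pass the desired mode to the Hyper-V framebuffer driver via the guest's kernel command line; a sketch, assuming a stock Ubuntu guest using GRUB:

```shell
# In the Ubuntu guest: edit /etc/default/grub so the kernel
# command line includes the Hyper-V framebuffer video mode, e.g.
#   GRUB_CMDLINE_LINUX_DEFAULT="quiet splash video=hyperv_fb:1920x1200"
sudo nano /etc/default/grub

# Regenerate the GRUB configuration and reboot the guest
sudo update-grub
sudo reboot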
Using VirtualBox and Ubuntu 10.04, the Ubuntu screen can only be set to 800 x 600 or 640 x 480. Is there a way to change that?
I resized the VirtualBox window, and Ubuntu still thinks the max is 800 x 600.
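The usual cause is missing Guest Additions in the guest. A sketch of installing them inside the Ubuntu guest, assuming you've chosen Devices > Install Guest Additions in the VirtualBox menu so the CD image is attached (the installer filename can vary slightly between VirtualBox versions):

```shell
# Build prerequisites for the guest kernel modules
sudo apt-get install build-essential linux-headers-$(uname -r)

# Mount the Guest Additions CD image and run the installer
sudo mount /dev/cdrom /mnt
sudo sh /mnt/VBoxLinuxAdditions.run
sudo reboot
```

After the reboot, the guest display should track the VirtualBox window size, and higher resolutions should appear in Ubuntu's monitor settings.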
I have a 19" monitor with a native resolution of 1600x900. This provides crisp, clear text, but it is too small; my vision is not as good as it once was. In order to see easily, I have reduced the resolution to 1280x720 and have chosen LARGE fonts. The text is larger but not as clear, and it makes me scroll my screen horizontally because it doesn't "fit" the screen.
If I graduated to a 23" monitor with a native resolution of 1920x1080 and a normal font, would that fix my woes? How would the text size on my 19" monitor at 1280x720 with LARGE fonts compare to the text size on a 23" monitor at 1920x1080 with a regular font?
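The comparison can be worked out from pixel density. A sketch, assuming "LARGE fonts" means the common 125% scaling setting:

```python
from math import hypot

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return hypot(w_px, h_px) / diagonal_in

current = ppi(1280, 720, 19)    # 19" panel run at 1280x720
proposed = ppi(1920, 1080, 23)  # 23" panel at native 1920x1080

print(f'19" @ 1280x720:  {current:.0f} PPI')
print(f'23" @ 1920x1080: {proposed:.0f} PPI')

# Physical text height scales as (font scale / PPI). Assuming LARGE
# fonts = 1.25x scaling, compare inches of text per nominal point:
print("current setup :", 1.25 / current)
print("proposed setup:", 1.00 / proposed)
```

The 23" panel at native resolution is noticeably denser (roughly 96 PPI vs about 77 PPI), so a regular font there would render physically smaller than your current LARGE-font setup; you would likely still want to enlarge fonts, just without the blur of running below native resolution.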
Using Remote Desktop from a device with a hi-res screen (say, a Surface Pro) is decidedly tricky - as everything displays 1:1 scale and so looks tiny.
If the machine you are remoting into runs Server 2008 R2 or later, you can change the dpi zooming setting (see here).
But for older hosts, that doesn't work.
Using normal Remote Desktop, you can connect with a lower resolution, say 1280x768, and turn on smart-sizing. However smart-sizing can scale down (to display a huge desktop in a small area) but does not seem to scale up (to display a small desktop in a big area).
Using the Windows 8 Remote Desktop App, you can zoom - but you cannot set the default resolution of the host.
What I want is a lower resolution in the host, scaled up to fit my screen.
So both of those are close to what I want, but don't quite work. So the question is:
Does the Remote Desktop App allow screen resolution to be set somehow?
Is there some other Remote Desktop client that can handle zooming better?
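For reference, the classic client's behavior can at least be pinned down in a saved .rdp file; these are standard .rdp settings (though, as noted above, smart sizing only scales down, not up):

```
screen mode id:i:1
desktopwidth:i:1280
desktopheight:i:768
smart sizing:i:1
```

Saving this as, say, `lowres.rdp` and double-clicking it gives a windowed 1280x768 session with smart sizing enabled, which is the closest the classic client gets.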
Can you please tell me how I can find out the screen resolution and DPI of the screen on my MacBook Pro? (I got it last year, 2009.)
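On macOS, a built-in command reports this from the terminal (output format varies somewhat between OS versions):

```shell
# Lists each attached display with its resolution and graphics details
system_profiler SPDisplaysDataType
```

The native resolution shown there, combined with the panel's physical size from the model's spec sheet, gives the DPI (diagonal pixels divided by diagonal inches).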