
display interview questions

Top display frequently asked interview questions

Why does 1366x768 resolution exist? [duplicate]

I know there's a previous question about this, but it doesn't have any real answers despite having been viewed 12,400 times (and having since been closed). With that in mind...

Why in the world is 1366x768 resolution a real thing? It has an aspect ratio of 683:384, which is the weirdest thing I've ever heard of while living in a 16:9 world.

All screens and resolutions I've been familiar with have been 16:9 aspect ratio. My screen, 1920x1080, is 16:9. The 720p that I'm familiar with is 1280x720, also 16:9. 4K that I'm familiar with, 3840x2160, is also 16:9. Yet 1366x768 is 683:384, a seemingly wild break from the standard.

I know there are plenty of other resolutions all over the place, but 1366x768 seems to dominate most of the mid priced laptop world and also seems unique to the laptop world. Why don't laptops use 1280x720 or something else as a standard?
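The arithmetic behind the odd ratio is easy to check. The commonly cited explanation (an assumption here, not something the question states) is that panel makers kept the 768-line height of 1024x768 XGA and widened it toward 16:9, which gives a fractional column count that gets rounded up:

```python
from math import gcd

# 1366x768 reduces to 683:384, not 16:9
w, h = 1366, 768
g = gcd(w, h)
print(w // g, h // g)  # 683 384

# A true 16:9 panel with 768 lines would need a fractional number of columns:
print(768 * 16 / 9)  # 1365.33... -> rounded up to the even value 1366
```

So 1366x768 is "almost 16:9" (off by two-thirds of a column), which is why it looks like such a strange ratio when reduced exactly.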


Source: (StackOverflow)

How to disable the screen orientation hotkeys in Windows (XP)?

I'm using Eclipse in Windows XP. One of my favorite shortcuts is CTRL+ALT+DOWN (or CTRL+ALT+UP) to duplicate a line.

I just found that on this machine (an XP ThinkPad with an NVIDIA graphics driver), this instead has the effect of flipping the screen upside down -- a feature I will rarely use.

How can I disable this? Or, if that is difficult, is there a way to give Eclipse hotkeys precedence over any other hotkeys? I'm not sure if this is an OS 'feature' or a background utility, or a function of the graphics driver, but any suggestions that would help me track this down and eliminate it would be appreciated.

I've looked through the choices on the NVIDIA control panel, and I don't find any hotkey options.


Source: (StackOverflow)

What's the correct monitor height for large monitors?

Conventional ergonomics guides suggest aligning the top of the monitor with the operator's straight-ahead line of sight. It seems doubtful that this still applies to today's 24", 30", and larger displays.

What was the reasoning behind that rule? What's the correct way to position a huge display according to current research?

Would anyone have a reference to research confirming the "2/3 up" rule?


Source: (StackOverflow)

Why are laptop screens sized the way they are?

We've been discussing this in the Comms Room on Serverfault, and thought it might make a good question on SuperUser...especially if there's a clear answer. The hope is that it is a Good Subjective question.

Why do laptop screen sizes come in the fractional sizes they do instead of 11/12/13/14/15"? The sizes I most frequently see advertised are 11.6", 12.5", 13.3", 14", and 15.6". What's the reasoning behind them? Keyboard size? Ergonomics? Resolution requirements? Most are LCD screens just like TVs, and yet TVs are advertised in whole numbers (19", 26", 46", etc.).

Looking at actual LxWxD dimensions on laptops doesn't really help since screen bezels vary in size.

For instance:

example 11.6" laptop dimensions = 11.55" x 8.50" x 1.27" -- this is due to a rather large bezel.

Whereas my X1 Carbon Touch has a 14" diagonal screen, but its dimensions (WQHD Touch) are 13.03" x 8.94" x 0.55" (front) to 0.79" (rear) -- again, the bezel. If the screen could be edge to edge that would be different; "normal math" would insist the actual "monitor size" was about 15.5", which it is if you include the bezel.

SO:

Are there actual equations/ratios/mathematical factors in determining screen sizes on a laptop that make certain sizes more common than others? Note I stated screen size (like the common 11.6", 13.3", 15.6", etc.) and not actual dimensions of the monitor lid itself.

TO HELP CLARIFY THE QUESTION:

I'm asking why those particular fractional sizes are so common. Look at HP, Lenovo, and Dell: they all tend to go with those screen sizes. Is it because it's what consumers are used to seeing and using? Is it dictated by resolution requirements (meaning 11.6" works out resolution-wise, but 11.7" doesn't)? Or is it something else? If you want to home in on one: something somehow determined that 11.6" was a good common screen size, and I'm curious what that was.
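There's no magic formula in the question itself; the usual industry explanation (an assumption on my part) is that diagonals fall out of how many panels can be cut economically from standard mother-glass sheets. The diagonal-to-dimensions math, at least, is straightforward for a 16:9 panel:

```python
import math

def panel_dims(diagonal_in, ratio_w=16, ratio_h=9):
    """Return (width, height) in inches for a panel of the given
    diagonal and aspect ratio, via similar triangles."""
    k = diagonal_in / math.hypot(ratio_w, ratio_h)
    return ratio_w * k, ratio_h * k

for d in (11.6, 13.3, 15.6):
    w, h = panel_dims(d)
    print(f'{d}" -> {w:.2f}" x {h:.2f}"')
```

This shows why the advertised diagonal alone can't be reverse-engineered from laptop body dimensions: the bezel is added on top of these panel dimensions and varies by model.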


Source: (StackOverflow)

Why don't websites immediately display their text these days?

I've noticed that recently many websites are slow to display their text. Usually the background, images, and so on load, but no text. After some time the text starts appearing here and there (not always all of it at once).

It's basically the opposite of how things used to work, when the text displayed first and the images and the rest loaded afterwards. What new technology is causing this? Any ideas?

Note that I'm on a slow connection, which probably accentuates the problem.
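One likely culprit (an assumption, since the pages in question aren't identified) is web fonts: by default many browsers hide text until the custom font has downloaded, the so-called "flash of invisible text", which a slow connection makes much more noticeable. On the site's side, a hypothetical @font-face rule can opt out of that behavior:

```css
/* Hypothetical example: "font-display: swap" tells the browser to render
   text immediately in a fallback font and swap in the web font once it
   loads, instead of hiding the text while the font downloads. */
@font-face {
  font-family: "ExampleWebFont";
  src: url("examplewebfont.woff2") format("woff2");
  font-display: swap;
}
```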

In the example screenshot (not reproduced here), everything has loaded, but it takes a few more seconds before the text is finally displayed.


Source: (StackOverflow)

Connecting 2 External Monitors to a Laptop?

(For three displays total, or two if the laptop's own display cannot be used.)

I work at home on two large monitors, but at the office on a laptop with a single large monitor. Is it possible to attach two (or more) external monitors to a laptop without having them clone each other's display?


Source: (StackOverflow)

Turn off display in Windows on command

Is there a way to turn off the display in Windows (7), preferably without using additional software?

A PowerShell script works fine, but it leaves a command-line window open after the display turns back on.
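One script-only approach, sketched here in Python via ctypes (an assumption -- the question doesn't require any particular language), is to broadcast the documented WM_SYSCOMMAND / SC_MONITORPOWER message; launching it with pythonw.exe avoids leaving a console window behind:

```python
import ctypes
import sys

# Documented Windows API message constants (Windows-only)
HWND_BROADCAST = 0xFFFF
WM_SYSCOMMAND = 0x0112
SC_MONITORPOWER = 0xF170
MONITOR_OFF = 2  # lParam: 2 = power off, 1 = low power, -1 = power on

def monitor_off():
    """Broadcast SC_MONITORPOWER to switch the display off."""
    ctypes.windll.user32.SendMessageW(
        HWND_BROADCAST, WM_SYSCOMMAND, SC_MONITORPOWER, MONITOR_OFF)

if __name__ == "__main__" and sys.platform == "win32":
    monitor_off()
```

The display wakes again on the next mouse or keyboard input, so no companion "turn on" call is normally needed.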


Source: (StackOverflow)

Displaying XML in the Chrome browser

I love the Chrome browser, but I use XML quite a lot in my development work and when I view it in Chrome I just get the rendered text.

I know that the source view is slightly better, but I'd really like to see the layout and functionality that Internet Explorer adds to XML, namely:

  • Highlighting
  • Open/close nodes

Any ideas how I can get this on Chrome?


UPDATE:

The XMLTree Extension is available on Google Chrome Extension Beta Site.


Source: (StackOverflow)

Why does text look so horrible on my HD monitor?

I just bought a 1080p 22" Samsung Syncmaster 2333HD (connected via HDMI) and the picture and video quality is great but the text quality is absolutely horrible. This monitor has a built in HD TV tuner.

Even as I type now, all the text in this text box, as well as in the browser toolbar, Start menu, etc., looks weird - like it all has a white outline around it that makes it jagged and hard to read. It hurts my eyes just to look at it.

I am running my PC in the suggested native resolution of 1920x1080, so what's the problem?

Is this one of the unavoidable downsides of using a HD monitor? Is there a solution to the problem?


Source: (StackOverflow)

Generating usage logs that prove my Internet connection is flaky

I need a way to generate reports or logs that prove that my Internet connection is flaky. My Comcast connection is very flaky but if I ask their support to send someone over it will probably work fine while the guy is here.

I found and tried "Connection Monitor" from CSGWare Corp but it does not create the kind of reports or graphs I'd need to be able to convince my ISP that their link is intermittent.

What I need to be able to do is have the software monitor my connection and produce a record of when the connection dies or when, for example, ping time climbs dramatically.

Can I get Connection Monitor to do this or is there another program that does?
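If no off-the-shelf tool fits, a minimal logger is easy to script. This Python sketch (the host, interval, and latency threshold are arbitrary choices, not anything the question specifies) pings periodically and appends a timestamped status line you can graph or hand to the ISP later:

```python
import subprocess
import sys
import time
from datetime import datetime

THRESHOLD_MS = 500  # arbitrary cutoff for calling the link "degraded"

def classify(latency_ms):
    """Map one ping result to a log status."""
    if latency_ms is None:
        return "DOWN"
    return "SLOW" if latency_ms > THRESHOLD_MS else "OK"

def ping_once(host="8.8.8.8"):
    """Return the round-trip time in ms, or None if the ping failed."""
    count_flag = "-n" if sys.platform == "win32" else "-c"
    start = time.monotonic()
    result = subprocess.run(["ping", count_flag, "1", host],
                            capture_output=True)
    if result.returncode != 0:
        return None
    return (time.monotonic() - start) * 1000.0

def monitor(logfile="connection_log.csv", interval_s=30):
    """Append a timestamped status row every interval_s seconds."""
    with open(logfile, "a") as log:
        while True:
            log.write(f"{datetime.now().isoformat()},{classify(ping_once())}\n")
            log.flush()
            time.sleep(interval_s)
```

The resulting CSV of timestamps plus OK/SLOW/DOWN states is exactly the kind of record that shows when the connection dies or ping times climb dramatically.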


Source: (StackOverflow)

Triple (3) Monitors under Linux

I have a 3 monitor setup (each 1680x1050) via an Nvidia NVS440 (2 GPUs, 2 outputs per GPU, totalling 4 outputs); this works fine under Windows XP and 7, but caused considerable headaches under Linux (Ubuntu 9.04).

I had previously used an XFX 9600GT plus the onboard 9300GS to produce the same result, but that card was noisy and power hungry, and I was hoping there was some magical switch in the NVS440 that got rid of this annoying problem -- it turns out the NVS440 is just two cards on one physical PCB :-p (I searched the net high and low for people using this card under Linux but found nothing; if anything, the NVS440 uses less power and is fanless, so I stood to benefit from it either way.)

Anyway, using either set up there were 5 solutions available:

  • Have 3 separate X instances, all unjoined
  • Have 3 separate X instances, adjoined by Xinerama
  • Have 2 separate X instances - One using twin-view, both adjoined by Xinerama
  • Have 2 separate X instances - One using twin-view but no Xinerama
  • Have a single Twin-view setup and leave the 3rd screen unplugged :-p

The fourth option, using 2 separate X instances and TwinView (but no Xinerama), was the best balance in terms of performance and usability, but caused a few really annoying issues:

  • You couldn't control (without altering shortcuts) which screen an application opened on - and once it was open, you couldn't move it to another screen without opening a terminal and forcing it to move
  • Nvidia's overriding/faking of Xinerama breaks down: the 2 screens joined by TwinView behave like a single huge screen, so popups open in the middle of both screens and maximising a window stretches it across the first 2 screens
  • Firefox can only run one instance as the same user, so having multiple Firefox windows requires at least 2 users

The second option "feels" like the right one, but OpenGL is basically disabled, and playing any sort of game or even running anything graphical causes a huge performance drop and instability - even a basic GBA or Gens emulator makes the system fall over. It works just well enough to stare at your desktop and do nothing, but as soon as you start doing some work - opening windows, dragging things around, running multiple copies of Firefox - it just feels really slow.

The last option, only going dual screen, works perfectly and everything performs as required: full GPU acceleration, two logical screen spaces - perfect. Just make it work across GPUs like Windows does! :-p

Anyway, I know RandR was supposed to pick up the slack when it introduced GPU objects of sorts, allowing multiple GPUs to be stitched together into one huge desktop at a much deeper layer than Xinerama. I was wondering whether this has now been fixed (I noticed X server 1.7 is out) and whether anyone has got it running successfully?

Again, my requirements are:

  • One huge desktop to drag any window across
  • Maximising of windows to each screen (as XP does)
  • Running fullscreen apps on the primary screen (with the mouse prevented from moving onto the others), or stretched across all 3

Finally, as a side note: I am aware of the Matrox triple (and dual) head splitters, but even the price they go for on eBay is more than I can afford at the moment. My argument: I shouldn't have to buy extra hardware to get something working on Linux that has existed in the Windows world for a long time (can you tell I don't get on with X? :-p). If I had the cash I'd have bought the latest version of this box already (the new version finally supports large resolutions, as my displays are 1680x1050 each).
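For reference, the second option above (separate X screens joined by Xinerama) boils down to a layout like this hypothetical xorg.conf fragment -- the identifiers are made up, and as noted, enabling Xinerama typically disables OpenGL acceleration:

```
# Hypothetical xorg.conf fragment: multiple Screen sections (one per
# GPU output) joined into a single desktop with Xinerama.
Section "ServerLayout"
    Identifier "Triple"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
    Screen 2 "Screen2" RightOf "Screen1"
    Option "Xinerama" "1"
EndSection
```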


Source: (StackOverflow)

Is it possible to power on/off a monitor using the computer?

I was wondering if it was possible to power a display on/off using a computer connected via HDMI. Let me explain:

I want my computer to power off my monitors (not standby mode) when I don't use it (no keyboard/mouse input) for more than 15 minutes, and power them back on when such input is received. My monitors are connected over HDMI, so I was wondering if it is possible to use the CEC functionality from a computer. If it is possible, is there a hardware requirement?

My point is that I often take a break from my computer but forget to turn off the screens, and I would prefer to shut the screens down completely instead of putting them into standby mode.
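CEC is the right mechanism, but note that most PC graphics cards do not wire CEC onto their HDMI ports, so a separate USB-CEC adapter is usually the hardware requirement. With libCEC installed, its cec-client tool can drive the display; a small Python wrapper (the helper names here are hypothetical) might look like:

```python
import subprocess

def build_cec_input(command, device=0):
    """Build the stdin line cec-client expects, e.g. 'standby 0'.
    Logical address 0 is the TV/display."""
    return f"{command} {device}\n"

def send_cec(command):
    """Send a one-shot CEC command via libCEC's cec-client.
    Assumes cec-client is installed and a CEC-capable adapter is attached."""
    subprocess.run(["cec-client", "-s", "-d", "1"],
                   input=build_cec_input(command).encode(), check=True)

# send_cec("standby")  # put the display into standby/off
# send_cec("on")       # wake it back up
```

The 15-minute idle trigger would then be a matter of hooking these calls to an idle timer; whether "standby" fully powers the display off (rather than entering standby) depends on the monitor.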

Thanks a lot


Source: (StackOverflow)

What are the advantages of 10-bit monitors?

To support 10-bit color the following are needed:

  • A monitor supporting it.
  • A GPU supporting it (only AMD FirePro and NVIDIA Quadro support this?).
  • Compatible software. Unless I am mistaken there are very few programs out there supporting 10-bit color. Photoshop is a notable example.

The questions are about how 10-bit monitors perform in comparison with 8-bit monitors:

  • In which situations would a 10-bit monitor give a noticeable advantage over an 8-bit monitor (say, for professional photography)?
  • Have 10-bit monitors been compared against 8-bit monitors based on subjective or objective tests? What were the results?
  • Human eyes can distinguish only around 10 million colors, so would using a monitor capable of about 1 billion colors make a difference?
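The figures in the last bullet are just exponent arithmetic: three channels of 8 bits each give 2^24 colors, while three channels of 10 bits give 2^30:

```python
# Colors representable at a given bit depth per RGB channel
for bits in (8, 10):
    print(f"{bits}-bit per channel: {2 ** (3 * bits):,} colors")
# 8-bit per channel: 16,777,216 colors (~16.7 million)
# 10-bit per channel: 1,073,741,824 colors (~1.07 billion)
```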

Source: (StackOverflow)

How could I safely fix my walking "dead pixel" bug?

I need suggestions.

I've had a live little bug inside my MacBook Pro screen for 2 days now.

I've tried to film it using my iPhone, but it ain't that good. :(

Should I try to open it? o_O

C'mon, looking for ideas here! :)

edit: Here's a similar video.

It's not moving anymore for now... I hope it isn't dead! And right when I'd found a possible solution, along with many other ideas, at that link: a suction cup; monitor off and a lamp on to attract it out; scratching the screen (which made it move a little). I also learned there's no warranty for this "feature" (also known, in jargon, as bad design).

edit2: It's been "fixed" on its own. Just check the answer.


Source: (StackOverflow)

What did computers use for output before monitors?

How did the early industrial computers, such as the UNIVAC, ENIAC, Mark I, etc., display output before monitors existed?

Did the first personal computers, like the Altair 8800 or the Simon use monitors, or did they use some alternate output as well?


Source: (StackOverflow)