# HDMI scaling problem



## CraigNZ (Apr 22, 2010)

System configuration:

Motherboard: Gigabyte GA-H55M-UD2H
Processor: Intel core i3 540
O/S: Windows 7 x64
TV: Samsung Series 7 63" Plasma 3D
Connection: HDMI

The Intel video chipset recognizes the TV as 1920 x 1080, 32-bit, 60Hz. The TV sees an incoming signal on the HDMI of 1920 x 1080 @ 60Hz. But for some reason the desktop is larger than the TV display area, both horizontally and vertically. To fix this I can adjust the H and V scaling options to 70%, or I can switch the TV display from 16:9 (1920 x 1080) to "Fit to Screen"; either of these reduces the desktop to fit the display area of the TV.

But I don't understand. If HDMI is digital, then it should be sending pixel information of 1920 x 1080, and this would match the native resolution of the display, so everything should be displayed perfectly. But it isn't. So it would appear that either:

a) the native resolution of the TV is not 1920 x 1080 but less, hence the oversized desktop display;

b) the desktop under Windows is not 1920 x 1080 but larger; or

c) the signal is not digital but analogue, and thus the voltage levels could be larger than what the TV expects.

And I'm not sure what the scaling adjustment does on the video chipset .. does it reduce the number of pixels, or does it reduce the analog voltage (which doesn't make sense, because the HDMI interface is supposed to be digital)?

Anyone got some ideas what is wrong here?


----------



## nholmes1 (Oct 7, 2010)

Almost all TVs have overscan built in; this is where closed caption info and various other data is carried. The practice started with CRT TVs and has stayed to this day. Even TVs that can do 1:1 pixel mapping don't always show the full 1920x1080 in the visible frame.
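As a rough sketch of what that means for a 1080p source: assuming ~5% total overscan per dimension (an assumed figure here; real sets vary, typically a few percent per edge), only the central part of the frame reaches the visible panel:

```python
# Rough overscan arithmetic. OVERSCAN is an assumed total crop
# fraction per dimension; real TVs vary (roughly 2-5% per edge).
WIDTH, HEIGHT = 1920, 1080
OVERSCAN = 0.05

def visible_region(width, height, overscan):
    """Size of the central part of the source frame that actually
    reaches the visible panel when the TV overscans by `overscan`."""
    return round(width * (1 - overscan)), round(height * (1 - overscan))

print(visible_region(WIDTH, HEIGHT, OVERSCAN))  # → (1824, 1026)
```

So roughly 100 columns and 50 rows of the desktop, the edges where Windows puts the taskbar and close buttons, would fall outside the visible area.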


----------



## CraigNZ (Apr 22, 2010)

Interesting. So if I understand correctly, when the TV is configured in 16:9 mode and receives a 1920x1080 image (which is 16:9), it expands the image to more pixels? I mean, eventually it has to map the information to native pixels, and since part of the desktop is not visible, the 1920 pixels must have been upscaled to, say, 2200 pixels wide. But if the video image had subtitles, wouldn't they now get chopped off? I think I actually noticed this on a video recently.
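Working backward from that 2200-pixel guess (a hypothetical figure, not a measurement), the implied crop can be sketched like this:

```python
# If the TV upscales the 1920-wide frame to ~2200 columns (assumed
# figure from the post above) and the panel still shows only its
# native 1920 columns, this is how much of the source is hidden:
native = 1920
upscaled = 2200  # hypothetical guess, not measured

crop_fraction = 1 - native / upscaled   # fraction of the frame hidden
per_side = crop_fraction / 2            # split between left and right

print(f"{crop_fraction:.1%} of the frame hidden")  # → 12.7%
print(f"~{round(per_side * native)} source pixels lost per side")
```

Subtitles rendered close to the bottom edge would land inside that cropped band, which would match what was noticed on the video.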


----------



## nholmes1 (Oct 7, 2010)

It's more likely that the entirety of the 1920x1080 is not visible on screen. The pixels don't get chopped off, just masked out of the active viewing window.


----------



## CraigNZ (Apr 22, 2010)

So I guess the best solution is to configure the TV to "Fit To Screen" mode. This would have the TV's processor do the image scaling, reducing the incoming 1920 x 1080 to whatever it needs. Do you agree?


----------



## nholmes1 (Oct 7, 2010)

Well, if you have to see the full screen and your TV doesn't have an underscan option, then you would need to make adjustments in the video drivers. If you are not using it exclusively as a desktop and are mostly watching video/movies, I would set the video card drivers to use display scaling and live with the small overscan.


----------



## CraigNZ (Apr 22, 2010)

The video driver has H and V scaling settings, in %. The TV's scaling option is simply "Fit to Screen", so I am guessing it is dynamic and can auto-adjust based on the incoming source (e.g., desktop vs. a DVD vs. a Blu-ray). Also, the TV would do all the processing, reducing any possible load on the PC, and the TV's scaling may be better optimized for its electronic design.

But a question: when scaling, what is actually going on? Is the software (in the TV or video card) actually remapping all of the pixels down to some other size? For example, mapping the V pixels from 1080 down to, say, 1024? And if so, doesn't this introduce distortion into the resulting image? Maybe it is better to do no processing of the image and somehow get Windows to resize the desktop.
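To illustrate the remapping question, here is a minimal nearest-neighbour sketch of mapping 1080 source rows onto 1024 panel rows. Real TV scalers use multi-tap interpolation filters rather than picking one row, but the resampling idea is the same, and it shows where the distortion comes from:

```python
# Nearest-neighbour vertical downscale: map each of 1024 destination
# rows back to the closest of 1080 source rows. Real scalers blend
# neighbouring rows with a filter kernel instead, but either way the
# pixel grid is resampled.
SRC, DST = 1080, 1024

mapping = [round(d * (SRC - 1) / (DST - 1)) for d in range(DST)]
dropped = sorted(set(range(SRC)) - set(mapping))

print(f"{len(dropped)} of {SRC} source rows never reach the panel")
```

With a blending filter those rows aren't dropped outright but smeared across their neighbours, which is why a scaled desktop looks softer than a 1:1 pixel-mapped one.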


----------



## Infrasonic (Sep 28, 2010)

I don't have much experience with plasma TVs, but for all of the LCDs that I've connected to PCs, the "16x9" or "Pixel by Pixel" modes did not overscan (this includes Samsung LCDs). Maybe someone with plasma experience would have better input, but I would assume they both work the same way because they are displaying the same resolution.

Do you have another PC/Laptop/Netbook you could connect to isolate the problem to the TV?

You may also want to consider a firmware update.


----------



## CraigNZ (Apr 22, 2010)

I have temporarily resolved the problem by first setting the desktop resolution to 1680 x 1050. This best matched the display area on the plasma screen, and I can see everything out to the edges perfectly. For video watching I then configured Windows Media Center to a resolution of 1080p 60Hz, and it displays the videos perfectly. When I exit MCE it returns the TV to the desktop resolution. I can see this happening because on each resolution change the TV displays the incoming signal's new resolution, so I can confirm the changes.

The only problem is that MCE now has its edges off the screen, which cuts off only the close-application control and a couple of others .. but since I am using a remote control it does not matter, as the remote initiates all of those actions.

So for now this is the best solution, and it seems to work very well. The video is now sharper, and there is no software scaling changing resolutions. What saved me is that MCE can be configured to run at a different resolution than the desktop, so it changes the video card mode dynamically.


----------



## Nissan-SR20-Man (Feb 4, 2011)

If using ATI video, check the Catalyst software version. When I changed my HTPC over to HDMI, my software did the same thing. I had to upgrade it.


----------

