Okay, this is something I have done.
S-Video tops out around 640x480 (in other words, not good), so to actually use the HD part of your TV you'll need to connect via DVI or component (three separate cables, colored red, green, and blue). Both can run at high-definition resolutions. Once you have the right cables in place (DVI is the easiest; component has to run through a converter), set your resolution to 1920x1080 (interlaced) for the highest resolution, or 1280x720 for a slightly lower but progressive-scan picture. On my display, 720p (1280x720) looks better overall, but 1080i (1920x1080) looks great for gaming.
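If it helps to see why 720p can look better despite the lower pixel count, here's a rough back-of-the-envelope sketch (just the standard 1080i/720p numbers, nothing specific to your card): an interlaced mode only draws half its lines on each refresh, so 1080i paints 540 lines per pass while 720p paints all 720 every time.

```python
# Rough comparison of 1080i vs 720p: total pixels per frame vs
# lines actually drawn on each refresh (interlaced modes draw
# one field -- half the lines -- per pass).

modes = {
    "1080i": {"width": 1920, "height": 1080, "interlaced": True},
    "720p":  {"width": 1280, "height": 720,  "interlaced": False},
}

for name, m in modes.items():
    pixels = m["width"] * m["height"]
    lines_per_refresh = m["height"] // 2 if m["interlaced"] else m["height"]
    print(f"{name}: {pixels:,} pixels/frame, "
          f"{lines_per_refresh} lines drawn per refresh")
```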
About the edges of the screen not appearing: that's just how TVs are built. Roughly 3-7% of the picture at the edges is lost to what's called "overscan". TVs are built that way so you never see a border around the edge of the screen; older sets couldn't quite handle the precision necessary, so instead of undershooting the edge of the screen, they overshot it. NVIDIA's latest driver has some special HDTV modes (you have to dig for them) that basically set your screen to a slightly odd resolution that hopefully has less overscan than the standard ones. PowerStrip, in my experience, did more harm than good, so be a little wary of it; it can't do much on modern HDTVs anyway, since they're typically fixed-frequency at the high resolutions and you can't adjust the timings enough to correct for the overscan.
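To put that 3-7% in perspective, here's a quick sketch of how much picture overscan can hide. It assumes the percentage is trimmed from each edge (your set will vary, so treat these as ballpark numbers, not measurements):

```python
# Rough estimate of the visible picture area after overscan,
# assuming the stated 3-7% is trimmed from each edge.

width, height = 1920, 1080

for overscan in (0.03, 0.05, 0.07):
    # each edge loses `overscan`, so each axis loses 2x that
    visible_w = int(width * (1 - 2 * overscan))
    visible_h = int(height * (1 - 2 * overscan))
    print(f"{overscan:.0%} per edge: ~{visible_w}x{visible_h} "
          f"visible out of {width}x{height}")
```

At 5% per edge, for example, you'd only see about 1728x972 of a 1920x1080 signal, which is why desktop edges and taskbars end up off-screen.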
PM me if you need any more help.