Widescreen Gaming Forum

A web community dedicated to ensuring PC games run properly on your tablet, netbook, personal computer, HDTV and multi-monitor gaming rig.
PostPosted: 17 Apr 2010, 23:56 

Joined: 30 Mar 2006, 09:16
Posts: 156
Okay, let me get this out of the way first: I know native resolution is superior.

However, between all the technical things I've been thinking about and my lack of time to figure it out myself, can someone explain, and ideally SHOW, why native is better when both approaches use the same number of screen pixels?

The problem is easy to state.

Say I run a game at 1440x900 on my 30" (2560x1600) with Nvidia's (or ATI's) scaling enabled. The game is thus using all 2560x1600 pixels.

Versus running the game at its in-menu choice of 2560x1600, which also uses 2560x1600 pixels.

Yet the latter looks better... so why exactly, again?

Same goes for TVs with a 360 or PS3. Whether you use 480, 720 or 1080, they all fill the screen and use all the display's pixels... but obviously the higher resolutions are superior.


I'd make an image myself comparing low res vs high res at the same screen size/area, but maybe someone has one lying around already?


This all comes back to DOSBox in the end, and me figuring out the different settings in the config file.


PostPosted: 18 Apr 2010, 00:06 

Joined: 22 Mar 2006, 03:09
Posts: 1296
The gist of why native resolution looks better than a scaled resolution is this.

I'll use simple numbers to make it easier: say you have a 300x300 screen.

If you run at 300x300, your graphics card uses the game's textures to produce 300x300 pixels of image, so each pixel is calculated individually from the detail of the in-game textures.

If you run at 150x150, only that many pixels are rendered from the game's textures. Then, when the graphics card outputs to the monitor, it has to make the image 300x300, so it upscales it. Since each pixel of the 150x150 image has to supply data for 4 pixels of the 300x300 output, the graphics card has to guess what colors those 4 pixels should be.

So let's say a 150x150 pixel is a shade of grey: the upscaler may make two of the output pixels black and two white, or two light grey and two dark grey, based on the surrounding pixels. In 300x300 mode the actual textures tell the computer exactly what colors to use, whereas upscaling runs an algorithm to guess what the 4 pixels should be from that 1 pixel.
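
To see the guessing in action, here's a rough sketch in Python with Pillow (just my illustration; the exact library doesn't matter). It upscales the same small image with nearest-neighbour (plain pixel doubling) and with bilinear interpolation; every output pixel where the two results disagree is a pixel the scaler invented rather than rendered.

Code:
# Rough sketch (Python + Pillow; assumes "pip install Pillow").
# Shows that an upscaler invents pixel values the renderer never produced.
from PIL import Image

# Pretend this 150x150 greyscale image is a frame rendered at low resolution.
low = Image.new("L", (150, 150))
low.putdata([(x * y) % 256 for y in range(150) for x in range(150)])

# Upscale to the 300x300 "monitor" two different ways.
nearest = low.resize((300, 300), Image.NEAREST)    # each source pixel copied into a 2x2 block
bilinear = low.resize((300, 300), Image.BILINEAR)  # in-between pixels interpolated (guessed)

# Every output pixel where the two methods disagree was guessed, not rendered.
diff = sum(1 for a, b in zip(nearest.getdata(), bilinear.getdata()) if a != b)
print(f"{diff} of {300 * 300} output pixels differ between the two scalers")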


PostPosted: 18 Apr 2010, 00:34 

Joined: 30 Mar 2006, 09:16
Posts: 156
are you basically saying that scaling is guesswork, while native resolution is based on actual information?

i think this probably really needs some pictures, hey. haha. for comparison images, the lower-resolution captures have to be zoomed in so that they're the same size as the higher-resolution ones, right?


PostPosted: 18 Apr 2010, 05:50 
Insiders

Joined: 07 Nov 2005, 04:16
Posts: 3010
If a game uses vector graphics, as all 3D games do, the vectors contain essentially infinite information. The higher the resolution, the more pixels are used to render the vectors, and every single pixel captures a piece of that infinite information. The more pixels you have available when rendering, the more information you pull out of the vectors.

With GPU scaling of a lower resolution, the game is rendered at that lower resolution, and the scaler doesn't care that the image was rendered from vectors. It just sees a grid of pixels, which is finite information, and it can only guess at what would have gone in between the pixels if there had been more.

However, if you're talking about DOSBox, that's a completely different matter. Take something like The Secret of Monkey Island: the graphics are rendered at 320x200, and there's nothing you can do to override that. If you increase DOSBox's resolution to 1920x1200, you aren't rendering at 1920x1200. You're simply making the scaling happen at the software level instead of the GPU level (and your GPU is probably executing the scaling routines anyway).
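
Here's a rough sketch of that difference, again Python with Pillow (just an illustration; the resolutions are arbitrary examples): the same "vector" description of a circle is rasterised once directly at 1600x1600, and once at 400x400 and then upscaled to 1600x1600. The direct render samples the vector with sixteen times as many pixels; the upscale can only stretch the 400x400 grid it was given.

Code:
# Rough sketch (Python + Pillow): rendering a vector at high resolution
# vs rendering it low and upscaling.
from PIL import Image, ImageDraw

def render_circle(size):
    # Rasterise one "vector" description (a circle) at the given resolution.
    img = Image.new("L", (size, size), 255)
    margin = size // 10
    ImageDraw.Draw(img).ellipse(
        (margin, margin, size - margin, size - margin),
        outline=0, width=max(1, size // 100))
    return img

native = render_circle(1600)                                        # vector sampled by 1600x1600 pixels
upscaled = render_circle(400).resize((1600, 1600), Image.BILINEAR)  # only 400x400 samples, then stretched

native.save("native_1600.png")      # smooth edge: every pixel taken from the vector
upscaled.save("upscaled_1600.png")  # soft, stepped edge: 15 of every 16 pixels guessed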


PostPosted: 18 Apr 2010, 06:19 
Insiders

Joined: 07 Nov 2005, 04:16
Posts: 3010
Here are some pictures:

Rendered at 720p:
[image]

Rendered at 720p, scaled up to 1080p:
(contains no more information than the original 720p image does)
[image]

Rendered at 1080p:
(contains more information than the original 720p image does)
[image]

Here are some closeups:
[image]

PostPosted: 18 Apr 2010, 07:58 

Joined: 14 Nov 2006, 15:48
Posts: 2356
I should note that what cranky is doing in his image editor ends up producing a better picture than GPU scaling does. GPU scaling algorithms really aren't that great compared to decent image editors.


PostPosted: 18 Apr 2010, 09:40 

Joined: 30 Mar 2006, 09:16
Posts: 156
hmmm.

but is GPU scaling still the best way to enjoy a game whose maximum resolution is less than the display's?

say you're playing Starcraft, for example. you're either going to play it in a small native-res box in the centre of the widescreen, or scale it up with the GPU.

or are there other solutions that scale up the game better than the GPU does?



also:
in DOSBox I can either set the fullscreen resolution to something really big and not use Nvidia scaling, OR I can leave it small, say 800x600 (I think that's the driver's minimum), and turn on scaling to fill the screen?

what's the difference in practice? in image quality? in performance? is one software scaling (done on the CPU) and the other GPU scaling?


PostPosted: 18 Apr 2010, 10:18 

Joined: 14 Nov 2006, 15:48
Posts: 2356
Quote:
or are there other solutions that scale up the game better than the GPU does?

Monitors with built-in scaling circuitry usually do the best job.


PostPosted: 19 Apr 2010, 03:32 
Insiders

Joined: 07 Nov 2005, 04:16
Posts: 3010
Quote:
I should note that what cranky is doing in his image editor ends up producing a better picture than GPU scaling does. GPU scaling algorithms really aren't that great compared to decent image editors.

I don't think that's true, really. There's only so much "great" you can get out of an algorithm that turns a low resolution into a high resolution. Nvidia scaling doubles every other scanline and interpolates. How much more can you improve on that? Plus, I'm using MS Paint. I'll eat my hat if MS Paint produces a better picture than my GPU does.

Quote:
say you're playing Starcraft, for example. you're either going to play it in a small native-res box in the centre of the widescreen, or scale it up with the GPU.
or are there other solutions that scale up the game better than the GPU does?

Hack solutions aside (since trying them with Battle.Net is a bad idea), scaling it up with the GPU is the best option.

Quote:
in DOSBox I can either set the fullscreen resolution to something really big and not use Nvidia scaling, OR I can leave it small, say 800x600, and turn on scaling to fill the screen?
what's the difference in practice? in image quality? in performance?

If you set DOSBox to use your native resolution, the software will scale the 320x200 game up to your native resolution. However, the software may well accomplish this through GPU scaling routines anyway, so the end result would be the same; the GPU scaling is simply controlled by DOSBox instead of by your drivers.

Personally, I just let DOSBox use its default behavior, which doubles the lines, taking games from 320x200 to 640x400. That 640x400 is then scaled to my native res through GPU scaling.
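
For reference, these are the dosbox.conf knobs involved. The values below are just a sketch of the two approaches being compared here (my example numbers, not a recommendation):

Code:
[sdl]
fullscreen=true
# Approach 1: have DOSBox fill the panel's native resolution itself
# (its output path may still use GPU scaling routines under the hood)
fullresolution=2560x1600
output=opengl

[render]
# Approach 2: keep the default normal2x scaler, which line-doubles
# 320x200 to 640x400, then let the driver's GPU scaling fill the screen
scaler=normal2x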

Quote:
Monitors with built-in scaling circuitry usually do the best job.

No they don't. Most monitors do a horrible job of scaling and make the image look awful at anything other than native resolution. Even ones with decent scaling, such as mine, still aren't as good as GPU scalers. Consoles are the only reason to use monitor scaling.


PostPosted: 19 Apr 2010, 10:38 

Joined: 14 Nov 2006, 15:48
Posts: 2356
Quote:
Quote:
Monitors with built-in scaling circuitry usually do the best job.

No they don't. Most monitors do a horrible job of scaling and make the image look awful at anything other than native resolution. Even ones with decent scaling, such as mine, still aren't as good as GPU scalers. Consoles are the only reason to use monitor scaling.

Eh, my 2007WFP does a much better job than my 8800GTX. But yeah, I'm sure it depends on the monitor (and the card... http://hardforum.com/showthread.php?t=1509360 ).
