Guest
Let me tell you a simple fact about antialiasing...
72/96 dpi displays are the cause of antialiasing.
There are three options:
- Ignore it
- Supersampling/Multisampling
- 300+ dpi display
The third is the best option: the sampling methods (option two) smooth
the "jaggies" by rendering the graphics at a much higher resolution than is
actually displayed, so detail is lost when the image is scaled back down.
And since the GPU is calculating more pixels, games and graphics slow down.
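To make that concrete, here is a minimal Python sketch of 4x supersampling
with a simple box filter. Everything here (the frame-as-nested-list
representation, the function name) is illustrative, not how a real GPU
pipeline works:

# 4x (2x2) supersampling, boiled down: average each 2x2 block
# of the high-resolution frame into one displayed pixel.
def downsample_2x(hi_res):
    out = []
    for y in range(0, len(hi_res), 2):
        row = []
        for x in range(0, len(hi_res[0]), 2):
            total = (hi_res[y][x] + hi_res[y][x + 1] +
                     hi_res[y + 1][x] + hi_res[y + 1][x + 1])
            row.append(total / 4)  # 4 rendered samples -> 1 visible pixel
        out.append(row)
    return out

# A hard edge rendered at 2x resolution...
hi = [[0, 0, 1, 1],
      [0, 0, 1, 1],
      [0, 1, 1, 1],
      [0, 1, 1, 1]]
# ...comes out smoothed: 16 samples were computed, only 4 are shown.
print(downsample_2x(hi))  # [[0.0, 1.0], [0.5, 1.0]]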
The human eye cannot easily resolve detail finer than about 300 dpi at
normal viewing distances. As a result, a 300+ dpi display can do effectively
perfect antialiasing without losing resolution, at no performance cost.
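To put a number on that, here is a back-of-the-envelope check in Python. The
2560 x 2048 resolution is taken from the tables below; the rest is just
geometry:

# How large can a 2560 x 2048 panel be and still reach 300 dpi?
import math

w_px, h_px = 2560, 2048
diag_px = math.hypot(w_px, h_px)   # ~3278 pixels along the diagonal
diag_in = diag_px / 300            # ~10.9 inch diagonal at exactly 300 dpi
print(f"{diag_px:.0f} px diagonal -> {diag_in:.1f} in at 300 dpi")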
For a more visual explanation:
Supersampling
-------------
Apparent Resolution: 1280 x 1024
Actual Resolution: 2560 x 2048 (4x Supersampling)
Resolution Loss: 3,932,160 pixels (75% of rendered pixels lost)
Performance Penalty as per apparent resolution: Yes
300+ dpi display
----------------
Apparent Resolution: 2560 x 2048
Actual Resolution: 2560 x 2048
Resolution Loss: 0 pixels
Performance Penalty as per apparent resolution: No
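For the skeptical, here is the "Resolution Loss" arithmetic checked in
Python, with the numbers taken straight from the tables above:

# Pixels rendered vs. pixels shown for 4x supersampling at 1280 x 1024.
apparent = 1280 * 1024        # 1,310,720 pixels actually displayed
actual = 2560 * 2048          # 5,242,880 pixels the GPU renders
lost = actual - apparent      # 3,932,160 pixels averaged away in the downscale
print(lost, f"({lost / actual:.0%} of rendered pixels lost)")  # 3932160 (75%)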
Results: With the supersampling method, the GPU renders more pixels than
you can actually see. With the 300+ dpi display method, the GPU renders
exactly as many pixels as you can actually see.
Statement: 300+ dpi display "antialiasing" - no performance penalty
Conclusion: If you have a 300+ dpi display, you can turn off all antialiasing
features and devote more GPU power to rendering more complex visuals - the
display will do the antialiasing for you, without the GPU knowing.
*** Supersampling: the GPU renders more pixels than are displayed so that
when the extra pixels are scaled back down to the apparent resolution,
jaggies are minimized. Although the jaggies are reduced, performance suffers.