Yes, I know, not another BioShock thread, but this one isn't about widescreen!
Here's a fix for enabling AA in BioShock on both Nvidia and ATI cards:
http://forums.tweakguides.com/showthread.php?p=71963#post71963
General
- Antialiasing is not supported natively by BioShock; that is, there is no way of enabling it within the game itself at the moment. This may change if 2K brings out a patch allowing it, especially in DX10 mode.
- Antialiasing is not possible in any way using Windows Vista under DX10 mode. The reason is unclear, as Unreal Engine 3 should support AA under DX10, according to Tim Sweeney.
- I've experimented with the UseSoftwareAntiAliasing=True and EdgeBlurSize=0.500000 variables in the Bioshock.ini, and they seem to have absolutely no impact on image quality whether disabled, enabled or altered, in either DX9 or DX10 mode. Nor does forcing AA through the graphics card control panel work in Vista when UseSoftwareAntiAliasing=False, so it's not that setting which is preventing it from working.
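If you want to flip those ini variables in a repeatable way while testing, rather than hand-editing the file every time, here's a rough Python sketch. It treats the ini as plain text, since Unreal-style ini files don't always parse cleanly with standard INI libraries; the Bioshock.ini path is a placeholder, so point it at wherever your copy actually lives:

```python
# Sketch: toggle the software AA variables in Bioshock.ini by plain
# text substitution. The path below is a placeholder - adjust it to
# your own install.
from pathlib import Path

def set_ini_value(text: str, key: str, value: str) -> str:
    """Replace every 'key=...' line with 'key=value', case-insensitively."""
    out = []
    for line in text.splitlines():
        if line.strip().lower().startswith(key.lower() + "="):
            out.append(f"{key}={value}")
        else:
            out.append(line)
    return "\n".join(out)

ini = Path("Bioshock.ini")  # placeholder path
if ini.exists():
    text = ini.read_text()
    text = set_ini_value(text, "UseSoftwareAntiAliasing", "True")
    text = set_ini_value(text, "EdgeBlurSize", "0.500000")
    ini.write_text(text)
```

It rewrites every matching key line, which is blunt but fine for quick A/B testing; back up your ini first.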
Nvidia Cards:
Windows XP - Antialiasing is definitely possible on Nvidia GeForce 8 cards under Windows XP DX9 when forced in the Forceware Control Panel. The restriction to GeForce 8 cards appears to be because BioShock uses HDR, and of all Nvidia cards only the GeForce 8 series can do HDR+AA. Evidence is posted on the previous pages in numerous screenshots; it seems to be genuine MSAA, not supersampling.
Windows Vista - Antialiasing is not normally possible in DX9 or DX10 mode. There is, however, a way of forcing antialiasing: rename the main Bioshock.exe file to R6Vegas_Game.exe and add -dx9 to the game's shortcut. That is, the shortcut should look like this:
"E:Program Files2K GamesBioShock DemoBuildsReleaseR6Vegas_Game.exe" -dx9
When you then launch BioShock and check under the Graphics Options, the DX10 option will be totally greyed out and can't be changed, meaning the game is running under "pure" DX9 mode. You can now force AA through the Nvidia control panel as normal, and it will definitely work. Here are two screenshots I've taken in Vista: DX10 no AA on the left, DX9 "pure" 16x AA on the right (zoom in on the left side of the lighthouse):
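If you'd rather script the rename trick than do it by hand, here's a minimal sketch of the idea. It runs against a throwaway temp directory with a dummy file so it's safe to try anywhere; in practice you'd point game_dir at your actual BioShock Builds\Release folder:

```python
# Demo of the rename workaround using a temp dir and a dummy exe.
# In practice, set game_dir to your BioShock Builds\Release folder.
import shutil
import tempfile
from pathlib import Path

game_dir = Path(tempfile.mkdtemp())          # placeholder for the install dir
(game_dir / "Bioshock.exe").touch()          # stand-in for the real exe

# Copy rather than rename, so the original exe is left untouched.
shutil.copy2(game_dir / "Bioshock.exe", game_dir / "R6Vegas_Game.exe")

print(sorted(p.name for p in game_dir.iterdir()))
# Then point the game's shortcut at R6Vegas_Game.exe and append -dx9.
```

Copying instead of renaming means you can still launch the game normally, and a later patch won't trip over a missing Bioshock.exe.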
ATI Cards:
Windows XP, Vista - It is not normally possible to use Antialiasing in Bioshock on any ATI card, whether in DX9 or DX10, even though X1X00 cards and newer can do HDR+AA. This is likely an ATI driver issue.
However, ATI X1X00 and HD2X00 users can apparently force the game to use AA in DX9 by renaming the Bioshock.exe file to Oblivion.exe and making sure Catalyst AI is enabled (as that's what detects the game-specific optimizations/capabilities); you should then be able to force AA in your Catalyst Control Center. You probably need to add -dx9 to the shortcut in Vista as per the example further above. I can't personally test this, though others have confirmed the method works. The shortcut should look something like this:
"E:Program Files2K GamesBioShock DemoBuildsReleaseOblivion.exe" -dx9
There may be texture glitches and other oddities when Antialiasing is forced on Bioshock, and I suspect this is why the drivers have been specifically designed to prevent forcing AA on Bioshock, especially in Vista. I'm still not sure why DX10 mode doesn't support forced AA, but whether due to glitches or not, hopefully 2K and/or the graphics companies will rectify this in the near future, preferably by giving us an in-game AA option at least for DX10 mode.
Obviously I'll look into all this much further when I can finally test properly with the full version of the game in around 48 hours' time. But it seems there's a lot of confusion over AA in BioShock in all the forums I've checked, and aside from the misconceptions flying around, the whole topic is probably distracting people a bit too much.