I hardly see any 1680x1050 panels advertised any more. They're all either 1366x768 or 1920x1080, with the odd 1600x900... this appears to be primarily because the adverts have everyone obsessed with the idea that 16:9 is better.
The point I was trying to make was more that the high-end cards now only really stretch their legs when they're pushed - and that means high resolutions. ;) In many cases, you don't start seeing serious bottlenecks on the high-end cards until you push upwards of seven million pixels; even 2560x1600 can't show up a bottleneck the way 6064x1200 can.
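For scale, here's a quick back-of-the-envelope calc of those pixel counts (I'm assuming 6064x1200 is something like a bezel-corrected triple-1920x1200 surround setup - the exact width depends on the bezel compensation):

```python
# Rough megapixel counts for the resolutions mentioned above.
resolutions = {
    "1680x1050": (1680, 1050),
    "1920x1080": (1920, 1080),
    "2560x1600": (2560, 1600),
    "6064x1200": (6064, 1200),  # assumed: bezel-corrected triple-monitor surround
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")
```

So 2560x1600 is only about 4.1 megapixels, while the surround setup is around 7.3 - well past the seven-million-pixel mark where the big cards start to sweat.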
Anyone who buys one of the top-end cards and runs a single 1680x1050 monitor isn't really bothered about the details; they obviously want the fastest - whether for bragging rights or whatever - and don't care that there are better solutions which cost a lot less. :)
I bought a GeForce 9800GX2 when it came out to game at 1600x1200, and it handled that easily except in one area: Crysis, lol. My 9800GX2 died on me a while back, so now I run a GTX 275 and game at 1366x768 - which means that (1) I can max out almost anything I want, and (2) I don't have to upgrade for a while. I have a buddy who does the same thing: he plays at 1440x900, bought a 7800GTX on launch day, then an 8800GTX for Crysis, and just bought a GTX 570 - but his monitor is still the same. Some of us buy the high end so we can forgo upgrades for a few years; if someone had bought an 8600GTS instead, it wouldn't have lasted even at 1366x768 for more than a year or two.