EVGA's Precision software is a very nice tool for overclocking. All three clocks can be manually adjusted, as can fan speed, and up to ten custom user profiles can be saved for quick setting changes. The Precision software can be set to launch automatically on boot, load a previously saved profile, and minimize itself to the taskbar, which makes overclocking a "set it and forget it" kind of deal once you've determined the card's capabilities. The Precision utility also shows off another feature of the GTX 200 series cards: their ability to reduce clock speeds and power draw when idle. A short log of each setting's history is graphed, and as the image above shows, the GTX steps down the core/memory speeds as low as 300/100 when not under heavy use.
The details of this power saving feature can be further explored by using NiBiTor to examine the GTX 260 BIOS. Clock speeds of 300/100 are specified for 2D mode and 400/300 for 3D mode, with the regular settings of 576/999 specified as Extra. nVidia doesn't offer an explanation of what triggers the card to change modes; however, some articles claim that relatively low-power 3D applications such as DVD or HDMI playback use the 3D mode, whereas most games will boost the card to full speed. I did observe some apparent glitches in the mode switching, as occasionally frame rates would drop off during a benchmark run, and exiting out to the desktop would reveal that speeds had throttled back to the lower setting. In these instances a reboot was required to get the clock rates functioning properly again. These issues only occurred when overclocking and operating on the upper threshold of stability, so this may be a deliberate reaction by the card, throttling down when errors are detected.
The voltage tab of NiBiTor shows that nVidia is throttling the power on the GTX as well, which undoubtedly helps contribute to the power savings observed when compared to previous generation cards. Only 1.03v is used in 2D mode and 1.06v in 3D, with the reference clocks requiring a relatively low 1.12v, which puts the card somewhere between an 8800GT and a GTS 512. Of particular note, however, is the inclusion of a fourth voltage setting of 1.18v that is not used in the stock BIOS. This highest setting is likely reserved for certain overclocked edition cards and, like the well-known 8800GT BIOS vmod, offers those owners brave enough to tamper with their BIOS the ability to increase voltage and obtain higher core frequencies without physically altering the card.
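Pulling the figures from the last two paragraphs together, the BIOS performance levels can be summarized in a short sketch. The level names are my own informal labels; the clock and voltage values are as reported by NiBiTor:

```python
# GTX 260 BIOS performance levels as read with NiBiTor.
# Level names are informal labels; clocks in MHz, voltages in volts.
bios_levels = {
    "2D":            {"core": 300, "memory": 100, "voltage": 1.03},
    "Low-power 3D":  {"core": 400, "memory": 300, "voltage": 1.06},
    "Extra (full)":  {"core": 576, "memory": 999, "voltage": 1.12},
}
unused_voltage = 1.18  # present in the BIOS but not assigned to any level

for name, level in bios_levels.items():
    print(f"{name}: {level['core']}/{level['memory']} MHz at {level['voltage']}v")
```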
Using the EVGA Precision utility I was able to overclock the GTX 260 to a benchmark-stable 700/1250 core/memory speed. This is a 124MHz increase on the core and 251MHz on the memory, or roughly a 20-25% gain over stock. Unlinking the core and shader speeds can sometimes give you a bit more on either one individually; however, in my experience they usually max out fairly close together. Using higher clocks on either core or memory produced unpredictable results, often resulting in artifacts and lower benchmark scores, and occasionally the aforementioned clock throttling issue.
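As a quick sanity check on the percentages quoted above, using the stock 576/999 and overclocked 700/1250 figures:

```python
# Overclock gain relative to stock clocks (MHz).
stock_core, stock_mem = 576, 999
oc_core, oc_mem = 700, 1250

core_gain = (oc_core - stock_core) / stock_core * 100  # ~21.5%
mem_gain = (oc_mem - stock_mem) / stock_mem * 100      # ~25.1%

print(f"Core:   +{oc_core - stock_core} MHz ({core_gain:.1f}%)")
print(f"Memory: +{oc_mem - stock_mem} MHz ({mem_gain:.1f}%)")
```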
I also changed the core voltage to the 1.18v setting and flashed the altered BIOS to the card. This allowed me to reach nearly 740 on the core, an increase of 36MHz over the previous high. Load temperatures only went up a few degrees, but because the factory fan speed throttling was designed around stock voltage, I did not feel comfortable leaving the Auto setting enabled and opted to manually adjust the fan speed higher, which also increases noise. An aftermarket cooling solution would be ideal for this type of modification; unfortunately, few exist yet for the GTX 200 series. Given the difficulty of removing the stock cooler and the possibility of voiding the warranty, most users will probably stay with the stock cooler. For this reason I elected to use the default voltage and the lower 700/1250 clocks as the "overclocked" setting in all benchmarks and game tests.
How does the performance of the GTX 260 stack up? Find out next.