Page 6 - Overclocking & Test bed
Overclocking & Tweaking
As most of you know, with most videocards you can apply a simple series of tricks to boost the overall performance a little. You can do this at two levels: tweaking by enabling registry or BIOS hacks, or, very simply, by tampering with image quality. And then there is overclocking, which will by far give you the best possible results.
What do we need? One of the best tools for overclocking NVIDIA and ATI videocards is our own RivaTuner, which you can download here. If you own an ATI or NVIDIA graphics card, the manufacturer actually has very nice built-in options for you that can be found in the display driver properties.
Where should we go?
Overclocking: By increasing the frequency of the videocard's memory and GPU, we can make the videocard increase its calculation clock cycles per second. It sounds hard, but it really can be done in less than a few minutes. I always tend to recommend that novice users and beginners not increase the core and memory clock frequency by more than 5%. Example: if your card runs at 500 MHz (which is pretty common these days), then I suggest you don't increase the frequency any higher than 25 to 50 MHz.
More advanced users often push the frequency way higher. Usually, when your 3D graphics start to show artifacts such as white dots ("snow"), you should back down 10-15 MHz and leave it at that. When you are overclocking too hard, the card will start to show artifacts or empty polygons, or it will even freeze. Carefully find that limit and then back down at least 20 MHz from the point where you first notice an artifact. Look carefully and observe well. I really wouldn't know why you'd need to overclock today's tested cards anyway, but we'll still show it ;)
All in all... do it at your own risk.
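For those who like to see the procedure spelled out, the step-and-back-off approach described above looks roughly like the sketch below. This is purely illustrative Python; apply_clocks() and shows_artifacts() are hypothetical placeholders, since in reality you nudge the sliders by hand in RivaTuner or the driver panel and eyeball the screen for artifacts.

```python
# Illustrative sketch only: apply_clocks() and shows_artifacts() are hypothetical
# placeholders -- in practice you raise clocks manually in RivaTuner or the driver
# panel and watch a 3D scene for snow, empty polygons or freezes.

STEP_MHZ = 5        # raise the clock in small steps
BACKOFF_MHZ = 20    # back off at least this far once artifacts appear

def find_stable_clock(base_mhz, apply_clocks, shows_artifacts, max_increase_pct=5):
    """Step a clock upward until artifacts appear, then back off.

    max_increase_pct defaults to the 5% guideline for novices; advanced users
    raise it and rely purely on the artifact check.
    """
    clock = base_mhz
    ceiling = base_mhz * (1 + max_increase_pct / 100)
    while clock + STEP_MHZ <= ceiling:
        clock += STEP_MHZ
        apply_clocks(clock)
        if shows_artifacts():
            return clock - BACKOFF_MHZ  # back down from the first artifact point
    return clock
```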
- Any generic 8800 GT at standard clocks runs at 600 / 1500 / 1800 (core / shaders / memory).
- The standard OC for this card is 600 / 1512 / 2000 (core / shaders / memory).
- We overclocked it to 700 / 1750 / 2400 (core / shaders / memory).
That is quite an excessive overclock already, yet the memory could have been pushed heaps further; see the quick calculation below.
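To put those numbers in perspective, here is a quick back-of-the-envelope calculation (a small Python sketch using only the clocks listed above) of how big the jump actually is, both against a generic 8800 GT and against this card's factory clocks.

```python
# Clock comparison for this 8800 GT (MHz): generic reference, this card's factory OC,
# and our manual overclock, with the percentage gain of our OC over each.
clocks = {
    "reference":  {"core": 600, "shaders": 1500, "memory": 1800},
    "factory OC": {"core": 600, "shaders": 1512, "memory": 2000},
}
ours = {"core": 700, "shaders": 1750, "memory": 2400}

for label, base in clocks.items():
    gains = ", ".join(
        f"{domain} +{(ours[domain] / base[domain] - 1) * 100:.1f}%" for domain in ours
    )
    print(f"vs {label}: {gains}")
# vs reference:  core +16.7%, shaders +16.7%, memory +33.3%
# vs factory OC: core +16.7%, shaders +15.7%, memory +20.0%
```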
[Chart: Call of Duty 4 frame rates, reference clocks vs. overclocked]
As you can see, the result is a notably faster-performing card. The game you are looking at is Call of Duty 4.
Image quality settings:
- 4x Anti Aliasing
- 16x anisotropic filtering
- All settings maxed out
Hardware and Software Used
Now we begin the benchmark portion of this article, but first let me show you our test system plus the software we used.
Mainboard: nVIDIA nForce 680i SLI (eVGA)
Processor: Core 2 Duo X6800 Extreme (Conroe)
Graphics Cards: various GeForce Series 8 cards
Memory: 2048 MB (2x 1024 MB) Corsair Dominator DDR2 CAS4 @ 1142 MHz
Power Supply Unit: Enermax Galaxy 1000 Watt
Monitor: Dell 3007WFP - up to 2560x1600
OS related software:
- Windows Vista
- DirectX 9/10 End User Runtime
- NVIDIA ForceWare 169.09
- NVIDIA nForce 590/680i platform driver 9.53
Software benchmark suite:
- Call of Duty 4
- Crysis
- World in Conflict
- Ghost Recon: Advanced Warfighter 2
- S.T.A.L.K.E.R.
- War Front: Turning Point
- F.E.A.R.
- Prey
- 3DMark05
- 3DMark06
A word about "FPS"
What are we looking for in gaming, performance-wise? First off, obviously Guru3D tends to think that all games should be played at the best possible image quality (IQ). There's a dilemma, though: IQ often interferes with the performance of a graphics card. We measure this in FPS, the number of frames a graphics card can render per second; the higher it is, the more fluid your game will be.
A game's frames per second (FPS) is a measured average over a series of tests. That test is often a timedemo, a recorded part of the game which is a 1:1 representation of the actual game and its gameplay experience. After forcing the same image quality settings, this timedemo is then used for all graphics cards so that the measurements are as objective as possible.
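For the curious, the arithmetic behind an average FPS figure is nothing more than frames rendered divided by time taken; a minimal sketch, with made-up frame times purely for illustration:

```python
# Average FPS over a timedemo run: total frames rendered divided by total time.
# The per-frame render times below are invented, purely for illustration.
frame_times_ms = [14.2, 15.1, 13.8, 22.5, 16.0, 15.4]

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds
print(f"average: {average_fps:.1f} FPS")
```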
Frames per second | Gameplay
<30 FPS | very limited gameplay
30-40 FPS | average yet very playable
40-60 FPS | good gameplay
>60 FPS | best possible gameplay
- So if a graphics card barely manages less than 30 FPS, the game is not really playable; we want to avoid that at all costs.
- From 30 FPS up to roughly 40 FPS you'll be perfectly able to play the game, with perhaps a tiny stutter at certain graphically intensive parts. Overall a very enjoyable experience. Match the best possible resolution to this result and you'll get the best balance of rendering quality versus resolution; you want both of them to be as high as possible.
- When a graphics card is doing 60 FPS on average or higher, you can rest assured that the game will likely play extremely smoothly at every point, even with every possible in-game IQ setting turned on.
- Over 100 FPS? You either have a MONSTER of a graphics card or a very old game.
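If you ever want to apply the table above programmatically, say to sort your own benchmark logs, it maps straight onto a small helper like this sketch (the thresholds come from the table; the function itself is just ours for illustration):

```python
def gameplay_rating(avg_fps: float) -> str:
    """Map an average FPS figure to the gameplay categories from the table above."""
    if avg_fps < 30:
        return "very limited gameplay"
    if avg_fps < 40:
        return "average yet very playable"
    if avg_fps < 60:
        return "good gameplay"
    return "best possible gameplay"

print(gameplay_rating(45))  # -> good gameplay
```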