|February 25th, 2007, 02:54 PM||#1|
Old Jedi master
Join Date: Sep 2006
Location: Manchester, UK
Nvidia 8800GTX overclocking: does the card overclock the way you think it does?
Like everyone else, I thought you could clock Nvidia cards 1MHz at a time; it turns out it doesn't actually work like that at all.
To start, remember there are three "zones" to overclock on these cards. Keeping it simple, they are the GPU core, the shader frequency and the memory frequency.
Getting really specific, there are actually three memory "zones", but all the 97.xx drivers have a bug that locks down two of them and only allows you to overclock one... for the sake of this article we will just call that one zone the memory.
I usually use ATITool or SysTool by W1zzard over at TechPowerup for my video card overclocking adventures. The issue is they appear to clock the cards up 1MHz at a time, but I found some weirdness with the overclocks, so after some brainstorming with friends it was decided to take a look at RivaTuner.
Now, RivaTuner has a nice graphical interface that shows hardware monitoring; it also shows the actual frequency of the three zones we are trying to overclock.
As you can see from the graph, the core clock, shader domain and memory are all shown. What I found weird was watching these clocks while overclocking 1MHz at a time with SysTool or RivaTuner itself.
Stock card clocks on my OCZ 8800GTX cards are:
1350 Shader domain
In theory, if I set 568 for the core it should go up 1MHz... in fact it doesn't.
Here is what I found.
554 thru 571 = 567
572 thru 584 = 576
585 thru 603 = 594
604 thru 616 = 612
617 thru 634 = 621
635 thru 661 = 648
662 thru XXX = 675
I stopped here but the trend continues.
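The measured buckets above can be encoded as a simple lookup, so you can see what a requested core clock actually produces. A minimal sketch based only on the numbers in the table (the upper bound of the 662+ bucket is unknown, so it is left open-ended):

```python
# Measured on an OCZ 8800GTX with 97.xx drivers: requested core clock
# ranges (inclusive) and the actual clock RivaTuner reports for them.
CORE_STEPS = [
    (554, 571, 567),
    (572, 584, 576),
    (585, 603, 594),
    (604, 616, 612),
    (617, 634, 621),
    (635, 661, 648),
    (662, 9999, 675),  # upper bound unknown; the trend continues
]

def actual_core_clock(requested):
    """Return the actual core clock (MHz) for a requested value."""
    for lo, hi, actual in CORE_STEPS:
        if lo <= requested <= hi:
            return actual
    raise ValueError("requested clock outside the measured range")

print(actual_core_clock(568))  # 567, not 568: the 1MHz bump does nothing
print(actual_core_clock(650))  # 648
```

So a request anywhere inside a bucket lands on the same real clock; only crossing a bucket boundary changes anything.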
Now we need to add the shader clock into the mix:
577 thru 588 = 1350
589 = 1404... aha!
After a little detective work, the shader clock rises every 23MHz, or close to it.
Here is where I found the shader clocks changed.
Again, I stopped here but the trend continues.
So we have magic core clock points and magic shader clock points. When clocking your card up, remember you have to set the clock range the GPU can work at, but be mindful that the shader clock may be able to work higher, because its reference clock change points usually fall within a GPU clock range; i.e. they don't change at exactly the same reference point if you take the requested GPU frequency as that reference.
An example: setting 635 on the core gives you an actual GPU core frequency of 648MHz. There are two shader clock reference points within the 648 range: 1512MHz up to a requested 656MHz on the GPU core clock (although the core is actually still at 648MHz), and then 1566MHz from a requested 657MHz onwards. Remember that from a requested 662 the GPU core clock jumps to 675MHz, so you need to be careful that the core can clock that high.
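The 635-661 example above can be sketched the same way. This uses only the two shader breakpoints I measured in that bucket (1512MHz up to a requested 656, 1566MHz from 657), so treat it as illustrative rather than a full map:

```python
def clocks_in_648_bucket(requested):
    """Actual (core, shader) MHz for a requested core clock of 635-661.

    Measured behaviour: the core snaps to 648 across the whole bucket,
    but the shader clock steps up partway through it.
    """
    if not 635 <= requested <= 661:
        raise ValueError("outside the 648MHz core bucket")
    shader = 1512 if requested <= 656 else 1566
    return 648, shader

print(clocks_in_648_bucket(650))  # (648, 1512)
print(clocks_in_648_bucket(658))  # (648, 1566) - same core, faster shaders
```

This is why requesting 657-661 is the sweet spot in that bucket: you pick up the higher shader clock without forcing the risky jump to 675 on the core.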
Now we need to bring the memory into the mix also:
897 thru 908 = 900
909 thru 927 = 918
928 thru 940 = 936
941 thru 958 = 945
959 thru 985 = 972
986 thru 1003 = 999
1004 thru 1016 = 1008
1017 thru 1035 = 1026
1036 thru 1048 = 1044
1049 thru 1066 = 1053
1067 thru 1101 = 1080
1102 thru 1110 = 1107
I stopped here but you see the trend of how the memory clocks.
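The memory behaves the same way, so the same lookup sketch works. Buckets are taken straight from the table above (the two boundaries around 1008/1026 look transposed in my raw notes, so 1016/1017 is assumed there):

```python
# Measured memory clock buckets: (requested low, requested high, actual MHz).
MEM_STEPS = [
    (897, 908, 900), (909, 927, 918), (928, 940, 936),
    (941, 958, 945), (959, 985, 972), (986, 1003, 999),
    (1004, 1016, 1008), (1017, 1035, 1026), (1036, 1048, 1044),
    (1049, 1066, 1053), (1067, 1101, 1080), (1102, 1110, 1107),
]

def actual_mem_clock(requested):
    """Return the actual memory clock (MHz) for a requested value."""
    for lo, hi, actual in MEM_STEPS:
        if lo <= requested <= hi:
            return actual
    raise ValueError("requested clock outside the measured range")

print(actual_mem_clock(1100))  # 1080 - a "1100MHz" overclock isn't one
```

Note how wide some buckets are: anything from 1067 to 1101 is really just 1080.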
What does this mean overall? Well, for my AMD 6000+ article I quoted video card clocks of 660/1100; it now seems I was actually running 648/1080 with a shader clock of 1566MHz. I did test 662 on the core and it would freeze in most benches, but around 660/661 was fine. Now I know why: 662 = 675, which was too much of a jump for the unmodified card.
Of course, this could be RivaTuner reading the frequencies wrong, or it could be SysTool/RivaTuner setting the frequencies wrong, but I doubt they are.
What does this mean for all the reviews out there showing cards clocked at 650MHz core or 1090 on the memory, etc.? Well, I think they may have their review data wrong.
Investigate for yourselves and see what you all find.
Grab RivaTuner over here
Grab SysTool here
Thanks for the help last night