Wrong GPU Memory Clock Shown

Zackptg5

New Member
Took the update to v6.32 and found that the wrong GPU memory clock is shown. I thought it was some kind of bug with the latest NVIDIA driver, but MSI Afterburner is still showing the correct frequency. When I overclock it (in this case, +350 MHz), it shows a very different value (Afterburner shows the correct one in the screenshot). v6.30 is fine, so it's something with the update.
My GPU is the MSI Gaming X 2070 Super, and it's a pretty garbage card for overclocking.
Changed the mem overclock from +350 to 0 during the debug, hopefully that'll help :)
 

Attachments

  • Bug.png
    481.3 KB · Views: 5
  • Bug.HTM
    250.9 KB · Views: 1
  • Bug.DBG
    1.7 MB · Views: 1
Last edited:

Zach

Well-Known Member
Nothing wrong...
7350 MHz divided by 4 = 1837.5 MHz

The 7350 MHz is the effective speed and the 1837.5 MHz is the real speed. GDDR6, like GDDR5, is quad-data-rate RAM that is still called DDR (double data rate) for reasons unknown to me. Never really cared about it. They should call it QDR to avoid confusion, but maybe too many names already do that...
Double or quad data rate means that on each cycle (1 Hz), 2 or 4 signals (of data) are sent through the RAM (in or out).

This applies to every RAM, system RAM or GPU RAM.
DDR1~4 and GDDR1~4 are double data rate.
GDDR5/6 is quad data rate.
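The multiplication described above is simple enough to sketch. A minimal illustration (the rate table below is my own summary of the post, not anything HWiNFO exposes):

```python
# Transfers per clock cycle for common memory types,
# per the explanation above (double vs. quad data rate).
TRANSFERS_PER_CLOCK = {
    "DDR4": 2,    # double data rate
    "GDDR5": 4,   # quad data rate
    "GDDR6": 4,   # quad data rate
}

def effective_mhz(real_mhz: float, ram_type: str) -> float:
    """Effective (marketing) speed = real clock * transfers per cycle."""
    return real_mhz * TRANSFERS_PER_CLOCK[ram_type]

# The GDDR6 numbers from this thread:
print(effective_mhz(1837.5, "GDDR6"))  # 7350.0
```

So HWiNFO's 1837.5 MHz and Afterburner's 7350 MHz are the same clock, just reported at different points of that multiplication.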
 

Zackptg5

New Member
Fair enough, but why the change in behavior? Maybe add a "GPU Memory Clock Effective Speed" entry then. I'm sure I won't be the only person confused by this, and keeping it consistent with how other tools display it would reduce confusion.
 

Zach

Well-Known Member
That I don't know... the change, I mean.
HWiNFO, for me at least, has always reported the real GDDR speed for any GPU I have or had installed.

Displaying both real and effective speed for any RAM is rather pointless, I think, and takes up space and resources for the software and the system in general. I believe @Martin (the author of HWiNFO) is trying to keep things under control in terms of sensor quantity.

You may notice that for CPU cores it does display both real and effective speeds, but that is entirely different. CPU core speed is a fully dynamic variable that depends on load and on all the C-states (active/sleep states) of the cores, so for that the distinction between them is "needed". For DDR/GDDR it is not. You just multiply the real speed by 2 or 4, depending on the dual/quad data rate technology, and you have the effective speed. It's that simple.
 