GPU Power vs GPU Power Normalised

zugidor

New Member
Question: What's the difference between the GPU Power and GPU Power (normalised) stats? The normalised stat shows a GPU Power percentage about 2% higher than the non-normalised one; the non-normalised stat fluctuates more, but for the most part they stay within 2-4% of each other. Which one should I use? Which one is more accurate? I'm using a Palit GeForce GTX 1060 Super JetStream.
 
Since the power is reported as a percentage of the GPU TDP (Thermal Design Power) and not in absolute units, it depends on the TDP value for the GPU. And because the TDP limit can be adjusted by the user (or manufacturer), the relative power value is then normalized to the fixed TDP value.
 
Martin said:
Since the power is reported as a percentage of the GPU TDP (Thermal Design Power) and not in absolute units, it depends on the TDP value for the GPU. And because the TDP limit can be adjusted by the user (or manufacturer), the relative power value is then normalized to the fixed TDP value.

So does that mean that the non-normalised value shows GPU Power as a percentage of the adjusted TDP, while the normalised value uses the default/original TDP? Is that why the normalised value is greater?
 
Sorry, I'm not sure exactly how this is calculated, because NVIDIA doesn't tell us the exact meaning of those values.
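
For illustration only: if the interpretation in the posts above is right (non-normalised relative to the current, possibly adjusted power limit; normalised relative to the board's default TDP), the arithmetic would look roughly like the sketch below. All function names and wattage figures are assumptions chosen for the example, not values documented by NVIDIA or reported by the monitoring software.

```python
# Hypothetical sketch of how the two percentages might relate.
# Names and numbers are illustrative assumptions, not a documented formula.

def power_percentages(power_draw_w, default_tdp_w, adjusted_tdp_w):
    """Return (non_normalised_pct, normalised_pct) for a given power draw.

    Assumption: the non-normalised value is relative to the current
    (user/vendor adjusted) power limit, while the normalised value is
    relative to the board's default TDP.
    """
    non_normalised = power_draw_w / adjusted_tdp_w * 100.0
    normalised = power_draw_w / default_tdp_w * 100.0
    return non_normalised, normalised

# Example: a card with a 120 W default TDP and the limit raised to 126 W
# (a factory-overclocked board), drawing 96 W at the moment of sampling.
print(power_percentages(96.0, 120.0, 126.0))  # -> (~76.2, 80.0)
```

Under that assumption, the normalised reading would come out a few percent higher whenever the adjusted power limit sits above the default TDP, which matches the 2-4% gap described in the original question. Again, this is only a plausible reading of the numbers, not a confirmed explanation.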
 