Displayed vs Presented Framerate?

RTXAimer

New Member
Hello, I am currently trying to debug some issues with in-home game streaming with the help of HWInfo, and I have noticed that the software reports two different framerates: displayed and presented. When I am streaming from my host PC to my client, there are periods where the displayed framerate drops by up to 20% while the presented framerate stays constant at 60 fps. This correlates with stuttering on my client, but there is no stuttering on the host screen even when the reported displayed framerate drops. What is the difference between how the two are calculated?
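
My rough guess at the distinction, sketched below in Python, is that "presented" counts every frame the game submits while "displayed" only counts frames that actually reach the screen. The Frame fields and the PresentMon-style per-frame data are my own assumptions for illustration, not HWInfo's documented method, so please correct me if this is wrong:

```python
# Sketch of one possible interpretation (my assumption, not HWInfo's actual
# implementation): deriving "presented" vs "displayed" framerates from
# per-frame data like the fields exposed by PresentMon-style capture tools.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Frame:
    present_time: float           # seconds: when the app submitted the frame
    screen_time: Optional[float]  # seconds: when it reached the display, None if dropped

def presented_fps(frames: list[Frame], window: float) -> float:
    """Every frame the application submits counts, dropped or not."""
    return len(frames) / window

def displayed_fps(frames: list[Frame], window: float) -> float:
    """Only frames that actually made it to the screen count."""
    shown = [f for f in frames if f.screen_time is not None]
    return len(shown) / window

# Example: 60 frames submitted in 1 s, but 12 are replaced before a scanout
# and never shown -> presented stays at 60 fps, displayed drops to 48 fps.
frames = [Frame(i / 60, i / 60 + 0.01 if i % 5 else None) for i in range(60)]
print(presented_fps(frames, 1.0))  # 60.0
print(displayed_fps(frames, 1.0))  # 48.0
```

If that mental model is right, it would explain why the host looks smooth (it is still presenting 60 fps) while the stream stutters, but I would like to confirm how HWInfo actually computes the two values.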
 