-
Why would refresh rate affect performance?
I have a PC connected to my TV using a DVI -> HDMI cable and use it to play DVD content. When the display is set to 1920x1080 @ 24 Hz, average CPU utilization is around 80% and spikes to 100%, causing slight pauses every once in a while. When the display is set to 1920x1080 @ 60 Hz, CPU utilization is significantly lower, averaging around 30% and never spiking to 100%. Nothing else has changed, just the display settings.
I find this pretty strange and wonder if anyone has any insight, as I would obviously like to run it at 24 Hz. If anything, I would think a lower refresh rate would be easier on the machine in general.
The PC is a 1 GHz P3 with 512 MB RAM and an NVIDIA 7600 GS video card running Windows Media Center 2005. The TV is a Sony 46" XBR4. I know it's a really low-end PC, but it's a collection of parts I've acquired over time, and I'd rather use it than throw it away.
-
Great White Shark
Why would you want to run it @ 24 Hz? Anyhow, more memory may also help... I'm guessing that forcing the frame rate lower than 60 is eating up extra headroom...
-
The TV I have is a 120 Hz LCD that accepts a 24 Hz input. This feature provides an image free of 3:2 pulldown artifacts (motion judder) when watching film content, since 120 is an exact multiple of 24.
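To see why the multiple matters, here's a quick sketch of the cadence arithmetic (the function name is mine, not from any player software): each film frame has to be held for a whole number of display refreshes, and only 120 Hz lets 24 fps divide evenly.

```python
def repeat_pattern(film_fps, display_hz, frames=6):
    """How many refreshes each of the first `frames` film frames is held for."""
    pattern = []
    shown = 0
    for i in range(1, frames + 1):
        # Total refreshes that should have elapsed after film frame i
        target = i * display_hz // film_fps
        pattern.append(target - shown)
        shown = target
    return pattern

print(repeat_pattern(24, 120))  # [5, 5, 5, 5, 5, 5] -> every frame held equally, no judder
print(repeat_pattern(24, 60))   # [2, 3, 2, 3, 2, 3] -> uneven 3:2 cadence, motion judder
```

At 120 Hz every frame gets exactly 5 refreshes; at 60 Hz the player has to alternate 2- and 3-refresh holds, which is the judder you see.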
This is one of the reasons I bought the TV, and I'd like to use a PC to feed it an appropriate signal. Blu-ray and HD DVD players can output a 1080p/24 signal, but I'm not ready to buy one yet. Plus, I don't think any of them output this way when playing DVDs, only the new HD media. I can see why, too: 1080p-upscaled DVD content at 24 Hz is pretty damn good, which is why buying a new player isn't really a necessity right now. I just found it odd that I got significantly worse performance just by changing the refresh rate.