Okay, so this is a multi-faceted question. I have numerous devices that I watch Netflix and other video on, all connected to the same TV, and they all look good. I'm pretty obsessed with picture quality, and there is a noticeable degradation when using my PC. I've spent hours adjusting TV settings, GPU colour settings, etc., but the upshot is that I end up using, say, my Xbox One to watch stuff off an HDD because it just looks terrible on my PC. Multiple devices work great with the TV's current contrast, brightness, and so on, and I don't want to change everything just to use my PC.
So why is a PC with a 4K-capable CPU and graphics card unable to produce the decent picture that other devices manage (with the same data, be it a Netflix episode or other HD input) without even spinning up their fans?
Now, my TV is a 4K cheapy, but I'm very happy with the colours etc. on all my devices except the PC, and I can't seem to get it right. I'd also like to save on power by not having two devices running all the time when I truly only need one.