OK, about 3-4 years ago I purchased a Sony SXRD KDS-R50XBR1. This TV won many awards for its picture quality, and it still has one of the best pictures I've seen.
However, on to true 1080p. I bought this TV because it was described as 1080p, but it wasn't until I bought a PS3 (about a month ago) that I realized my TV would NOT accept 1080p signals over HDMI.
I've heard that it will accept 1080p from a computer, but I'm not sure that's completely true. If so, is there a way I can set my TV up to watch Blu-rays in true 1080p?
I'm kinda pissed about my TV, but I did buy it 3-4 years ago, so I guess I can't complain too much, besides the fact that I paid nearly $3,500 for it.
What kind of computer input does the TV have? DVI? VGA?
I know they make HDMI-to-DVI adapters; I was just looking at them not too long ago. You could pick one up for a few bucks and give it a shot.
I just bought a new TV and have it hooked up to my computer via VGA. I'm still waiting on my Blu-ray player, so I haven't seen 1080p yet, only 1080i and 720p, so I can't really tell whether you're missing anything or not.
Feed it 1080i and see how it looks. That signal is still the same resolution as 1080p; it isn't scaled/upconverted the way 720 or 480 material would need to be. The set will just duplicate frames and fill in to get 1080p.
1080i is the same as 1080p? 1080 INTERLACED still needs to be deinterlaced and processed if he has a true 1080p set, which is different from feeding the set a true 1080 PROGRESSIVE feed…
I watch Blu-ray now, and the picture is amazing, but I can't help wondering if it could be better if it were really 1080p. I mean, the TV is 1080p but the inputs aren't, or some crap. lol, I really have no clue what the hell is going on.
Right now my PS3 feeds it 1080i.
I think I may have been mistaken before about being able to input 1080p at all; I think the TV I have just isn't capable. Hmmm.
This TV will not take 1080p from any source… it line-doubles and scales whatever source it's given to output 1080p. Basically it was a bit of a marketing trick they played on you.
Violator, I'm not so sure you're right about 1080i needing no processing to display on a 1080p set… I'm thinking this one over, but a 1080i input would have to be deinterlaced to be output on a 1080p display.
E.g. the simplest way to process 1080i is to use just 540 lines, as mentioned above. The TV throws away every other line of information, i.e. lines 1, 3, 5, 7… are what's left over. To work out what it wants for line 2, it averages the information in lines 1 and 3. On straight lines this can bring about the twitter jpow is seeing from his PC input. In fact, using material with slatted lines (the classic example is the park bench scene in X-Men) is a handy way to see how good the 1080i processing is in a display or processor.
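For illustration, here's a rough sketch of that simple line-averaging in Python/NumPy (the function name and array shapes are my own assumptions for this post, not what any TV literally runs):

```python
# Rough sketch of the simple deinterlacing described above: keep one field
# (the odd lines) and rebuild each missing line by averaging its neighbours.
import numpy as np

def deinterlace_by_averaging(field_odd: np.ndarray) -> np.ndarray:
    """field_odd: the 540 kept lines of a 1080i field (540 x width)."""
    lines, width = field_odd.shape
    frame = np.zeros((lines * 2, width), dtype=field_odd.dtype)
    frame[0::2] = field_odd                          # lines 1, 3, 5, ... copied straight in
    # each missing line = average of the kept lines directly above and below it
    frame[1:-1:2] = (field_odd[:-1].astype(np.float32) +
                     field_odd[1:].astype(np.float32)) / 2
    frame[-1] = field_odd[-1]                        # bottom line has no neighbour below, so repeat it
    return frame
```

Averaging like that is why hard horizontal detail (slatted benches, fine text from a PC) shimmers: the interpolated lines never quite match the real ones.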
Better TVs use more accurate methods, e.g. analysing the stream for movement and interpolating only the regions of an image, or even more specifically just the pixels, where movement is detected; static areas can keep the detail of both fields.
So no matter how you look at it, a lot of processing is going on when you display a 1080i source on a 1080p display.
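To make the "better TVs" point concrete, here's a toy motion-adaptive version, reusing the averaging helper from the sketch above; the threshold and the per-pixel motion test are illustrative assumptions, not what any particular set actually does:

```python
# Toy motion-adaptive deinterlacing: where the two fields of a frame agree
# (no movement), weave them together and keep full 1080-line detail; where
# they disagree (movement), fall back to the averaged estimate from above.
import numpy as np

def deinterlace_motion_adaptive(field_odd, field_even, threshold=10):
    """field_odd, field_even: the two 540-line fields of one 1080i frame."""
    averaged = deinterlace_by_averaging(field_odd)   # fallback for moving pixels
    woven = np.zeros_like(averaged)
    woven[0::2] = field_odd                          # weave: interleave both fields
    woven[1::2] = field_even
    # estimate movement per pixel by how much the two fields disagree
    motion = np.abs(field_odd.astype(np.int16) - field_even.astype(np.int16)) > threshold
    motion_full = np.repeat(motion, 2, axis=0)       # expand the 540-line mask to 1080 lines
    return np.where(motion_full, averaged, woven)
```

Static areas keep the full resolution of both fields, while moving areas get interpolated, which is exactly the kind of processing described above when a 1080i source goes to a 1080p panel.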