I know, I whined about the frame rate in the store. Funny, isn’t it?
Yep, I bought a BRAVIA V-Series 32-incher. Baseline: very happy with it. The 32-inch model is a perfect size replacement for the 68cm 4:3 TV with side speakers – it fits snugly into the same area, but provides a much wider and only-very-slightly-reduced-in-height picture. When not watching TV, it’s a really good monitor for web browsing at a distance – a far cry from the S-video-to-composite quality of the old TV, fantastic as a TV though it was.
I figured I could tweak out the frame rate issues, but they haven’t been an issue with “PC Mode” – the VGA input.
Wasn’t all smooth sailing, though: I had a bit of a fight on my hands getting the RADEON 9550 in my Media Centre to talk to it.
See, Sony in their consumer-focused wisdom (cheap shot, but they deserve it) figured that the BRAVIAs didn’t need DVI. There’s VGA on all models, and HDMI only on the V series, and darn it, that should be good enough. I wondered if some pointy haired manager thought that not offering DVI would somehow curb piracy – for whatever reason, it ain’t there.
So instead of offering DVI, with the image quality benefit that the digital transfer typically entails over VGA, you get to try your luck with a DVI to HDMI converter cable. And at least in my case, some combination of the ATI RADEON and the Sony seemingly misreporting its native resolution as 1900×1200 causes my 1360×768 native (well, close enough to it) res to get squashed into the centre of the screen, with a massive wasted area.
And even that was an improvement, because with the original set of RADEON drivers I tried, I’d get nothing – just the occasional purple flash (sometimes continuous flashing), a glimpse of the screen it should have been showing, then nothing. The RADEON CATALYST 5.11 drivers seemed to fix that particular aspect, but then you have to deal with the RADEON 9XXX series’ dislike for DVI in general (feeling like my Apple Cinema Display experiences all over again…). VGA: no probs.
So, long story short, I’m using a VGA cable until I can replace the RADEON with an Nvidia 6600 and try again with the DVI-to-HDMI cable. Sigh.
Anyway – there’s no appreciable interference with the VGA cable (certainly not at 6 feet, and not detectable up close either), but I want my digital output. I was skeptical of DVI being a huge improvement over VGA until I got a monitor with dual inputs – then it was night and day.
And finally, I have a TV that the MCE Remote can do the single-button-on-or-off thing with – the old telly needed to have a channel selected to switch on.
More whining to follow as I find more stuff out. But it kicks ass as a telly so far.