I get asked about tech stuff occasionally (once or twice… an hour), and one of the questions that doesn’t relate directly to fixing a problem with Mr Gates’ software is “what do I need to look out for when buying an HDTV?”. High Definition TV is awesome and is definitely worth buying, but it requires some thought.
HD LCDs and plasmas have been around for a few years now, but there are several different things consumers need to look into when choosing one, which can make it a bit confusing. Here are the three I think matter most.
First, resolution: the higher resolution of the image is what makes a TV ‘High Definition’. Regular TVs have a standard resolution of around 720 by 576 pixels (for UK broadcasts). 720p TVs up this to a panel resolution of 1366×768 (the 720p broadcast signal itself is 1280×720). ‘True’ HD TVs display a mammoth 1920×1080 pixels. The higher the resolution, the better the potential image, as you have far more dots defining the picture: a 1080p panel has roughly five times as many pixels as a standard-definition one. For a neat diagram to give you a sense of quite how much the different HD formats add to the picture, check the image on this Wikipedia page. Not all ‘TrueHD’ panels are created equal (most are produced by Sharp and Samsung and developed into their own product lines by other HDTV manufacturers), and there are some very cheap ones on the market which will display a poor-quality image even with the highest-quality source. I don’t see the point in getting anything other than 1080p at this point; they’re getting cheap enough, but do make sure you get a good enough one for your needs.
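For the curious, the pixel maths is easy to check yourself. Here’s a quick, purely illustrative Python snippet (the resolutions are the nominal figures mentioned above) that works out how many pixels each format gives you and how they compare to standard definition:

```python
# Nominal resolutions, for illustration only
formats = {
    "SD (UK PAL)": (720, 576),
    "720p panel": (1366, 768),
    "Full HD / 1080p": (1920, 1080),
}

sd_pixels = 720 * 576  # roughly 415,000 pixels

for name, (width, height) in formats.items():
    pixels = width * height
    print(f"{name}: {width}x{height} = {pixels:,} pixels ({pixels / sd_pixels:.1f}x SD)")
```

Running that shows a 1080p panel has just over two million pixels, about five times as many as a standard-definition picture, which is where the extra detail comes from.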
Which takes us onto… the Standard Definition Engine. HDTV content is still scarce (only Sky HD and Virgin have HD channels in the UK at the moment, although future evolutions of Freeview and Freesat will give us new means of accessing content), and even these broadcast only in 720p; there is no ‘TrueHD’ broadcast content. As such, you need to make sure that your HDTV makes standard definition signals look passable. They won’t look as good as proper HD sources regardless, but some standard definition engines make better work of the signals received through SCART leads et al. If you don’t know whether you have any standard definition sources… well, you do. Your DVD player, VCR, Sky box and Freeview receiver are all most likely working in SD; anything connected with a SCART, composite or S-Video lead is working in SD. So test the TVs in the shop: make sure they aren’t just showing you Blu-ray content (which is in 1080p), and watch analogue BBC1 or some such to check it’s not pixellated to hell.
The final thing I think is really vital is connectivity. New HD sources output over HDMI (a new cable and socket, essentially, about a million times better than SCART). Many new HDTVs come with only one HDMI socket, and even two will start to feel like too few once you’ve got a games console hooked up. I have a lead in from my desktop PC and one from my DVD-PVR and have already run out, so when I (eventually) get a PS3 I’ll need to get a switcher box. Which will be annoying. So look for two to three HDMI sockets as a minimum, and make sure you also have VGA in and RGB sockets as well.
There are other things that matter to some people: plasma vs. LCD, response times, rated lifetime of the panel, energy rating, integrated DVB tuners and so on. But I think most of this stuff is secondary to the top three. Remember, just buying an HDTV won’t necessarily improve your viewing experience; you’ll probably need to get some HD sources at the same time, so save some cash for a DVD player or Freeview box that ‘upscales’ to 1080p (i.e. uses clever processing to make standard definition signals look closer to high definition).
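If you’re wondering what ‘upscaling’ actually means under the bonnet: in the crudest case the box just invents extra pixels from the ones already in the signal. Real players use much cleverer interpolation, sharpening and de-interlacing, but here’s a toy Python sketch of the basic idea (a made-up 4×3 “frame” blown up to 8×6 by nearest-neighbour), purely to illustrate the concept:

```python
# Toy example only: nearest-neighbour upscaling of a tiny grid of pixel values.
# Real upscalers do far more (interpolation, sharpening, de-interlacing),
# but the underlying idea of generating extra pixels is the same.
def upscale(frame, new_w, new_h):
    old_h, old_w = len(frame), len(frame[0])
    return [
        [frame[y * old_h // new_h][x * old_w // new_w] for x in range(new_w)]
        for y in range(new_h)
    ]

tiny_frame = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
]

for row in upscale(tiny_frame, 8, 6):
    print(row)
```

The upscaled picture has more pixels but no genuinely new detail, which is why an upscaled DVD still won’t match a proper HD source.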
Anyone think differently? Let me know in the comments. It’ll be interesting to see how quickly this advice dates…