TIPS and INFO when buying that new TV - Read this thread before you buy that TV

petetherock (Moderator)
More HDMI info

Courtesy of this website:

http://hometheatermag.printthis.cli...m/hookmeup/hdmi_101/index.html&partnerID=3830

Hook Me Up
HDMI 101
By Joshua Zyber • March, 2009
How important is HDMI 1.3 anyway?
The HDMI standard was developed with noble intentions. Most people in the home theater hobby know the hazards of cable clutter. When you have a lot of equipment connected this way and that by separate audio and video cables, you wind up with a tangled mess of wires behind your equipment rack or entertainment center. The problem is compounded by component video (three cables just for picture) and multichannel analog audio (six to eight more cables!). Now factor in a DVR, a couple of DVD players, a Blu-ray player, a video processor, and an A/V receiver all interconnected in one theater room. If you want to add or remove any piece of equipment, you’ll have to squat behind the rack with a flashlight and try to trace each cable from end to end. Which unit did this blue one come from? If I plug that red cable into here, will I get my picture back, or will my speakers start blaring obnoxious noises?

HDMI was supposed to help with all that. One cable carries both video and audio. Better yet, it carries high-definition video and high- resolution multichannel audio, plus it has all the latest copy-protection protocols that the Hollywood studios demand. In theory, it’s the perfect connection standard for Blu-ray. One HDMI cable out from the Blu-ray player to an A/V receiver, and another HDMI cable out from the receiver to an HDTV should be all it takes to get stunning 1080p picture and lossless audio, all fully encrypted with a minimum of cable clutter. So why are there so many different versions of HDMI? And which ones do you need to be concerned with?

HDMI is an evolving standard that first came to market before all of its features were finalized. The original HDMI version 1.0 established the basic parameters for transmitting high-definition video and uncompressed audio. This was followed by several revisions that added, among other features, support for the DVD-Audio format and some PC applications. For home theater purposes, any HDMI connection type from 1.0 to 1.2a will transmit 1080p picture and multichannel PCM sound equally well. What they will not carry, however, are the native digital bitstreams for the advanced Dolby TrueHD or DTS-HD Master Audio lossless audio formats.
The most significant revision to the HDMI spec came with version 1.3, which added support for a few new features that are useful for home theater applications. (Later versions such as 1.3a, 1.3b, and 1.3c add more remote control options and other improvements, but nothing directly related to core audio or video.) In order to benefit from these new features, both ends of the signal chain—as well as any switches, splitters, or other intermediary devices—must be compliant with HDMI 1.3. As a result, HDMI 1.3 has become a marketing tool for many manufacturers to encourage consumers to upgrade their Blu-ray players, A/V receivers, and even all of their cables. You wouldn’t want to be noncompliant with all of the latest features, would you? Of course, this raises the question: Does a Blu-ray viewer really need HDMI 1.3 to get the most out of the format? The answer is a resounding maybe. To delve a little deeper, let’s take a look at what HDMI 1.3 offers that you can’t get in previous versions.

On the video side of things, HDMI 1.3 increases signal bandwidth and allows for the transmission of more color detail. Only HDMI 1.3+ can carry the Deep Color or x.v.YCC formats that promise billions of possible colors, smoother color gradients, and the elimination of banding artifacts. (Naturally, these will only work if both the source and the display are compatible.) That certainly sounds great, but there’s just one problem. The Blu-ray spec doesn’t support either Deep Color or x.v.YCC. Even if a Blu-ray player claims compatibility with these formats (and several do), no Blu-ray Discs are actually encoded with an extended color gamut. Those billions of new colors don’t exist in the Blu-ray source. Any standard HDMI connection can transmit the full video quality that’s available on a Blu-ray Disc.


Does that make HDMI 1.3 irrelevant for video? Not necessarily. At present, a few models of HD camcorders will record content with Deep Color or x.v.YCC. There has also been speculation that some video games may be encoded with one or the other in the future. Although Blu-ray Discs don’t contain the expanded color detail, some Blu-ray player models (such as the recent Pioneer BDP-51FD) may be able to interpolate those extra colors internally, which essentially upconverts the color signal. To take advantage of that, you’ll need HDMI 1.3 and a Deep Color–capable display. On the other hand, some displays may be able to perform that interpolation themselves, negating the need for the Blu-ray player to do it. In the end, there may be some cases where HDMI 1.3 is useful, but it is not strictly necessary for video.

The audio situation is more complicated. Blu-ray Discs can contain movie soundtracks in several possible formats. Regular DTS or Dolby Digital 5.1 work the same as they did on DVD. An S/PDIF cable or any version of HDMI can transmit those lossy codecs without issue. As I mentioned earlier, uncompressed multichannel PCM will also work just fine with any HDMI connection. (S/PDIF doesn’t have enough bandwidth for that.) Where things get tricky is the usage of the newer audio formats: Dolby Digital Plus, DTS-HD High Resolution Audio, Dolby TrueHD, and DTS-HD Master Audio. Dolby Digital Plus and DTS-HD High Resolution Audio are rarely used on Blu-ray these days, but Dolby TrueHD and DTS-HD Master Audio are very common. All four of these new audio formats have the same transmission limitations. In order to hear the full high-resolution soundtrack, your Blu-ray player must either decode the format internally or transmit its native bitstream to an A/V receiver or surround processor.
Players that decode the advanced audio codecs convert the audio to PCM. The decoded PCM should result in no loss of quality, and it can be output over any HDMI connection. (Some player models may also offer multichannel analog outputs.) In this case, HDMI 1.3 is not needed. Unfortunately, not all Blu-ray players are built with the ability to decode those high-resolution formats in full quality. Some Blu-ray players can only decode standard DTS or Dolby Digital 5.1. And a number of early players decode Dolby TrueHD but not DTS-HD Master Audio. In either case, you’ll need to transmit the codec’s native bitstream and let your A/V receiver or surround processor do the decoding. This will require HDMI 1.3 on both the Blu-ray player and the receiver or surround processor.

Either decoding to multichannel PCM or passing the native bitstream will give you high-quality lossless sound. The choice between letting the Blu-ray player decode the audio or transmitting the native bitstream will depend on the specifics of your equipment. For example, the Sony PlayStation 3 offers no bitstream option for the advanced audio formats, but it will decode them internally to PCM. On the other hand, the Panasonic DMP-BD30 will not decode Dolby TrueHD or DTS-HD Master Audio itself but can transmit their native bitstreams. Secondary audio from commentaries, Bonus View, and BD-Live content complicates this decision even further, as the only way to seamlessly mix disc and secondary audio is to let the player handle the decoding.
Older A/V receivers and surround processors may include HDMI inputs that can accept multichannel PCM but not the newer formats. And some A/V receivers and processors—even a few current models—have HDMI inputs that will not handle any type of audio at all over HDMI. Their HDMI inputs are strictly video. If yours is one of these, the only way you’ll be able to listen to the new high-resolution audio formats is to connect the player’s multichannel analog outputs to the multichannel analog inputs on your A/V receiver or surround processor. In either of these situations, the player must be able to perform the decoding. Every system will have its own particular needs.
For both video and audio, HDMI 1.3 is useful in some home theater applications, but it’s not necessarily required. If you buy new equipment today, the presence of HDMI 1.3 will help with future-proofing if nothing else. However, with a bit of care, it’s still possible to obtain the highest-quality video and audio available from Blu-ray even with older versions of HDMI.
 

petetherock (Moderator)
Hooking up

From here:
http://hometheatermag.printthis.cli...om/hookmeup/107hook/index.html&partnerID=3830



Connections for a High-Def World
By Dana Whitaker • January, 2007

Now that you've bought an HDTV, make sure you hook it up correctly.
Ah, the golden age of television. The only thing I loved more than Lucy was the solitary input on the back of my TV. It was a simpler time. Now we must choose between 300 channels and only slightly fewer inputs. Add HDTV to the mix, with all of its inherent confusion, and it's a recipe for connection disaster.
[Image: Component Inputs]
Studies consistently show that many HDTV owners aren't watching HD content, either because they aren't getting it from their provider or they don't have their TVs connected correctly. A quick call to your cable/satellite provider can take care of the former, and a quick read of this HD-connection primer can help with the latter. Some TVs and projectors may have HD-capable inputs not listed here; however, for simplicity's sake, we've chosen to focus on the three you're most likely to find on a consumer HDTV.
All three of these connections can support HD resolutions up to 1080p, but it's up to your TV's manufacturer to decide which resolutions to support through each connection. For bandwidth and copy-protection reasons, many choose not to accept 1080p via component video; some don't even accept it through HDMI. An owner's manual will usually list which resolutions each connection supports, so you should do your homework before you buy.
[Image: Component Cable]
Component Video
Digital connections may be the buzz, but component video is still the standard for sending HD signals from a source to a TV or receiver—at least for the time being. Component video isn't just for HD, either. It's often the highest-quality output on SD sources like DVD players, as it allows for a much cleaner, more colorful picture than lower-quality analog connections like S-video and composite video.
[Image: DVI Cable]
Component video splits the video signal into three parts, labeled Y, Pb (or Cb), and Pr (or Cr). Both the connections and the cable are color coded: green for Y, blue for Pb, and red for Pr. The Y element carries the signal's luminance information—which is primarily green, plus enough red and blue to make the image look black and white—and the horizontal and vertical sync pulses that tell the TV when a new frame begins. The Pb element carries the remaining blue, and the Pr element carries the remaining red. RCA is usually the connector of choice for component video, although some professional and high-end equipment may use the twist-and-lock BNC connector for a more secure connection. Prepackaged three-in-one component video cables are easy to find, or, in a pinch, you can simply use individual 75-ohm video cables for each element, as long as they're each the same length and type.
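As an illustration of how those three parts relate, here is a minimal sketch, assuming the Rec. 709 (HD) luma coefficients; SD equipment uses the slightly different Rec. 601 weights:

```python
# Minimal sketch of the component split, assuming Rec. 709 (HD) luma
# coefficients. Inputs are gamma-corrected R'G'B' values from 0.0 to 1.0.

def rgb_to_ypbpr(r, g, b):
    # Y is the luminance: mostly green, plus enough red and blue
    # to make a correct black-and-white picture on its own.
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Pb and Pr carry the "remaining" blue and red as color differences,
    # scaled to stay within the -0.5..+0.5 range.
    pb = (b - y) / 1.8556
    pr = (r - y) / 1.5748
    return y, pb, pr

print(rgb_to_ypbpr(1.0, 1.0, 1.0))  # white -> (1.0, 0.0, 0.0): no color difference
print(rgb_to_ypbpr(1.0, 0.0, 0.0))  # red   -> (0.21, -0.11, 0.50): strong Pr
```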
I should stress that not every component video input has the bandwidth to pass a high-definition signal. If one of your HDTV's component video inputs is labeled "480i/480p," that means you should only use it for standard-def sources. With most new HDTVs, it's safe to assume that, unless the component input is labeled otherwise, it is HD capable. If you own an off-the-shelf upconverting DVD player, it is unlikely that you'll be able to view store-bought DVDs at a 1080i or 720p resolution through component video; you must use a digital connection.
[Image: DVI Input]
Component video is a stable connection that can travel over long distances with minimal degradation, which is why it remains a popular choice in the custom-installation world. In spite of its current popularity and ubiquitous nature, component video will someday fade from the HD landscape, primarily because it lacks the copy protection that studios demand for high-quality digital sources.
DVI
Like component video, the Digital Visual Interface (DVI) connection splits the video signal into three elements to improve image quality. Unlike component video, the digital signal remains in digital form, sent in a format called Transition Minimized Differential Signaling, which divides the signal into its green, red, and blue/sync elements. This connection has its origins in the computer realm and has the bulk and appearance to show for it.
[Image: DVI-to-HDMI Adapter (front)]
DVI allows you to transmit a fully digital, uncompressed HD signal from source to TV, bypassing the digital-to-analog conversion processes that can potentially degrade signal quality in an analog connection. Because the DVI signal is not compressed, it's much too large to be recorded, which means DVI—and consequently HDMI—is not an option for sending HDTV from a set-top box to some form of digital recording device. (See sidebar.)
To further protect content, DVI employs a form of copy protection called HDCP, or High-bandwidth Digital Content Protection, which prevents DVI devices from communicating properly unless both have HDCP in place. In the home theater realm, many first-generation DVI-equipped displays did not use HDCP and therefore will not display an image coming from a new DVI-equipped DVD player or set-top box. If you're having no luck getting a picture when connecting two devices via DVI, chances are that one of them lacks HDCP—and I'm afraid you're completely out of luck.
[Image: DVI-to-HDMI Adapter (back)]
Of the three connection types, DVI is the least reliable over a longer video run. The official DVI spec only requires that the equipment maintain the signal up to 16 feet, although you can purchase DVI extension and repeater devices to allow for a longer run. As HDMI becomes increasingly popular, DVI connections are disappearing from new video products.
HDMI
The High-Definition Multimedia Interface (HDMI) represents the evolution of DVI. In terms of how it handles a video signal, HDMI is identical to DVI, passing a fully digital, uncompressed signal between source and display. HDMI ups the ante by allowing for the passage of uncompressed multichannel audio and control information. These uncompressed signals are too large to be recorded.

[Image: HDMI Input]
HDMI has a smaller, more user-friendly form factor than DVI, but the two are usually compatible through the use of a simple adapter, as long as both employ HDCP copy protection. HDMI is more reliable than DVI over a longer video run. If your run is longer than about 20 feet, it may be worth the extra money to invest in a high-quality HDMI cable or an HDMI amplifier or repeater from a company like Gefen or Acoustic Research.
For all of its potential benefits, HDMI can be frustrating to use in its current form. Communication failures abound. Sometimes, you must cue up devices in a particular order to ensure that you get a picture. Other times, a source and a display will communicate well—until you add an A/V receiver to the mix. We've already encountered several instances in which an HDTV and a high-definition DVD player had difficulty communicating over HDMI. Silicon Image, the company that invented HDMI, has attempted to address this through its Simplay HD certification process, which tests a product's HDCP functionality and its interoperability with other HDMI devices. Products that sport the Simplay HD logo have been verified to work together. This is a step in the right direction, but only a few manufacturers' products are currently certified. (For a list, go to www.simplayhd.com.)
[Image: HDMI Cable]
If and when its bugs are worked out, HDMI could usher in a new golden age of TV, filled with beautiful HD programming and TVs with just a single input. Those would be the days.
Want to Archive Your HDTV Recordings? Ask for FireWire
The good news is that there are plenty of HD DVRs and Media Center PCs that allow you to record your favorite TV shows in HD. The bad news is that, until Blu-ray and HD DVD recorders hit the market (hopefully this year), saving those recordings to a disc presents a challenge. FireWire, or IEEE 1394, remains our best hope for transferring and archiving HD video signals. If your TV or cable/satellite set-top box has active FireWire ports, you can transfer HD recordings—at least of unencrypted channels like ABC, CBS, and NBC—to a recording device, such as a D-VHS recorder or a computer with a Blu-ray or HD DVD burner. In 2004, the FCC mandated that cable companies must give you a FireWire-equipped cable box if you ask for one. So speak up.

[Image: FireWire Inputs]
[Image: FireWire Four-Pin Cable (left) and Six-Pin Cable (right)]
Audio in HD
Optical and coaxial digital audio connections have long been the standard method of transferring multichannel digital audio signals from an HDTV or HD set-top box to an A/V receiver, but HDMI is coming on strong. Electronics manufacturers are paying more attention to a receiver or pre/pro's ability to accept both video and audio via HDMI so that you only have to run one connection from a high-definition source, such as a cable box. Current HDMI specs allow you to pass PCM, Dolby Digital 5.1, DTS, DVD-Audio, and SACD; Dolby is the most common format for HDTV broadcasts.
With the arrival of high-definition DVD and the new lossless Dolby TrueHD and DTS-HD Master Audio formats, everything gets a bit more complicated (naturally). With the first crop of high-definition DVD players, you can send compressed Dolby Digital and DTS audio the old-fashioned way, through the optical or coaxial outputs. Or you can let the player decode the Dolby TrueHD and DTS-HD Master Audio internally as uncompressed, multichannel PCM and send it over HDMI to your receiver. The HDMI 1.3 spec, which is just beginning to appear on products, lets you pass the Dolby TrueHD or DTS-HD bit stream directly through HDMI to the receiver—if manufacturers and content providers enable it. Confusing? You bet.
 

petetherock (Moderator)
Buying on a budget

Everyone has a budget, and some of us have a more modest one than others. That doesn't mean we should give up on owning a nice flatscreen.

There are ways to stretch the dollar. Here is what to consider:

- Don't simply chase the best specs on paper for the dollar: just because a full HD set sells for $1k doesn't mean it's a good TV.

- Check your viewing diet: SD content on a cheap full HD LCD panel can look bad. An HD-ready set with a better panel may serve you better.

- Look for run-out models: the previous high-end model at a much lower price.

- Go to trade shows and pick up new or demo sets.
 

petetherock (Moderator)
Good read on HDTV and resolutions:



The truth about 1080p
In the last couple of years, HDTVs with 1080p native resolution have taken over the market at nearly every price and size point. But as we've been saying all along, once you get to high-definition, the difference between resolutions becomes much more difficult to appreciate.

We've done numerous side-by-side tests between two same-size HDTVs, one with 1080p resolution and another with lower resolution, and every time it's been almost impossible to see the difference with regular program material, especially when that material is moving. The difference becomes even more difficult to see at smaller screen sizes or farther seating distances--say, more than 1.5 times the diagonal measurement of the screen. For example, to see the benefits of stationary 1080p content on a 50-inch screen, you'll generally need to sit about 6.5 feet or closer. Few viewers want to sit that close, especially when low-quality content seen at that distance (remember the "garbage" maxim?) looks so bad.
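Where does a number like 6.5 feet come from? Here is a back-of-envelope sketch, assuming the common rule of thumb that 20/20 vision resolves detail down to about one arcminute; beyond that distance, adjacent 1080p pixel rows blur together:

```python
import math

def resolution_distance_ft(diagonal_in, rows=1080, aspect=16 / 9):
    """Farthest distance at which one pixel row still spans one arcminute."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # screen height
    pixel_in = height_in / rows                           # height of one pixel
    one_arcmin = math.radians(1 / 60)                     # 20/20 acuity limit
    return pixel_in / math.tan(one_arcmin) / 12           # inches -> feet

print(f"50-inch 1080p: sit within {resolution_distance_ft(50):.1f} ft")       # ~6.5
print(f"50-inch 720p:  sit within {resolution_distance_ft(50, 720):.1f} ft")  # ~9.8
```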

The main visible benefit of 1080p native resolution comes when the display is asked to show computer sources. With a PC set to output 1080p resolution and a 1080p display that can accept it, computer desktops and text generally look superb, and quite a bit better than when displayed on a TV with lower native resolution. But for movies, games, and other standard video material, the benefits of 1080p are negligible unless you're sitting quite close.

That doesn't matter much anymore though. 1080p native resolution is so common among HDTVs, and has so little impact, that you shouldn't even consider it as a factor in your purchasing decision. As we mentioned at the top, factors like contrast and color are more important to image quality, and unfortunately, you can't depend on a specification sheet for an accurate representation of those factors.

1080p/60 versus 1080p/24
1080p HDTVs are a dime a dozen, but not all 1080p HDTVs are created equal. First off, some older HDTVs with 1080p resolution couldn't accept 1080p sources at all. More recently, the advent of Blu-ray has delivered another video format variation to worry about: 1080p/24.

The numbers 24 and 60 refer to frame rate. Moving video is composed of a certain number of frames transmitted every second that combine in the viewer's mind to create the illusion of movement. The nominal rate for film is 24 frames per second, while the rate for video is 30 frames per second. In standard 1080p video, which is technically 1080p/60, each of those 30 frames is repeated twice to fill 60 frames per second. Every 1080p HDTV sold today can accept and display 1080p/60 sources via its HDMI inputs.

Not every 1080p HDTV properly displays 1080p/24 sources, however. Most Blu-ray players, as well as the PlayStation 3, have a setting that lets the player transmit 1080p/24 video directly. Blu-ray Discs with movies that originate on film are encoded at 1080p/24 to preserve the proper cadence of film--that characteristic motion that's smooth but not too smooth. If your player is set to output 1080p/24 directly, and your TV can properly display it, you're seeing the image as close as possible to what the director intended--how it looks when displayed on a cinema screen from a film projector at your local movie theater.

Generally, for an HDTV to properly display 1080p/24 it needs to have a refresh rate at some multiple of 24. The standard refresh rate for HDTVs of all varieties is 60Hz, which is not a multiple of 24. There's no benefit to sending these displays 1080p/24 instead of 1080p/60. If the HDTV can actually show the signal (some cannot), the result usually looks the same regardless of the setting on your Blu-ray player.

On the other hand, increasing numbers of LCD TVs have refresh rates of 120Hz or 240Hz, for example, while a few plasmas refresh at 48Hz, 72Hz, or 96Hz. All are exact multiples of 24. Some of these HDTVs come closer to preserving the cadence of film than others, and some can introduce extra dejudder video processing (usually user defeatable) that also affects cadence. Unlike with resolution, there's no easy way to tell from the spec sheet if a display with a multiple of 24 as its refresh rate handles 1080p/24 correctly, although most such displays that we've tested do.
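The arithmetic is easy to check: a refresh rate preserves film cadence only if 24 divides into it evenly, so every film frame is held on screen for the same amount of time. A quick sketch:

```python
# Why refresh rate matters for 1080p/24: at 60Hz, each 24fps film frame
# cannot be repeated a whole number of times, so players alternate 3 and 2
# repeats (3:2 pulldown), which judders. Multiples of 24 repeat evenly.

def cadence(refresh_hz, film_fps=24):
    repeats = refresh_hz / film_fps
    if repeats == int(repeats):
        return f"{refresh_hz:3d}Hz: each frame shown {int(repeats)}x -- even cadence"
    return f"{refresh_hz:3d}Hz: {repeats}x average -- uneven (3:2 pulldown)"

for hz in (48, 60, 72, 96, 120, 240):
    print(cadence(hz))
```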

For most viewers the visible benefits of 1080p/24 are slight. Displays that cannot show it correctly can nonetheless produce a viable semblance of film's cadence, one that to experienced viewers appears to stutter slightly, especially in pans or camera movement, rather than moving smoothly like true film cadence. But for purists interested in seeing every last benefit of film, 1080p/24 signals mated to a 1080p/24-compatible display are worth the investment.
 
petetherock (Moderator)
Contrast Ratio

http://news.cnet.com/8301-17938_105-10148015-1.html?tag=hdtv;hdtv_l


At the 2009 Consumer Electronics Show, manufacturers quoted contrast ratio specs of 1,000,000:1 or 2,000,000:1 for upcoming LED-based LCD displays (Vizio and LG, respectively), which are similar to the specs quoted by Samsung and Sony for their current LED models. Those numbers sure do sound impressive, but what do they mean in the real world?
Very little. It's true that in general, a higher contrast ratio can indicate that the display produces a deeper level of black, with all of the picture-quality benefits that brings--but then again it might not. Despite the million-to-one contrast ratios of the Samsung and Sony LED sets we reviewed, we observed better black-level performance in the Pioneer PRO-111FD. Pioneer doesn't publish a contrast ratio spec for that television, but has claimed that its black levels are so deep as to be "immeasurable."


Manufacturers are free to use whatever method they like to "measure" the contrast ratio of their displays. The big numbers you see quoted most often are for "dynamic" contrast ratio, which takes into account changes the (usually LCD) display makes to adjust for fluctuations in the brightness of the content--namely, lowering the backlight in dark scenes and bringing it up in lighter ones. Then there's the "native" contrast ratio number, always much smaller than the dynamic one, where the display doesn't perform these adjustments. Both of these numbers are usually derived from the measurement of a full-white screen and a full-black screen (so-called full-on, full-off measurements), which is obviously not representative of actual program material.
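To see how the same panel can honestly report both a modest "native" number and a seven-figure "dynamic" one, here is a toy calculation; the luminance values are purely illustrative, not measurements of any real set:

```python
# Contrast ratio is just white luminance divided by black luminance (cd/m^2).
# Illustrative numbers only -- not measured from any real display.

# Native: white and black measured at the same backlight setting.
native_white, native_black = 400.0, 0.40
print(f"native:  {native_white / native_black:>9,.0f}:1")    # 1,000:1

# Dynamic: backlight maxed for the all-white screen and dimmed nearly off
# for the all-black screen -- two states no single scene ever uses at once.
dynamic_white, dynamic_black = 500.0, 0.0005
print(f"dynamic: {dynamic_white / dynamic_black:>9,.0f}:1")  # 1,000,000:1
```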

That's why we hope you'll pay as little attention to published contrast ratio specs as we do. We rarely mention them in reviews, and when we have to refer to them in news or blog posts we try to put them in context, comparing last year's specs from the same manufacturer with this year's, for example. We're still working on performing contrast ratio measurements ourselves as part of TV reviews, so look for that to happen this year. When it does, we doubt we'll publish anything close to "one million."
 

petetherock (Moderator)
HDTV resolution

From CNET:

http://www.cnet.com/hdtv-resolution/?tag=mncol;txt

HDTV source resolutions
If you read those three axioms closely, you'll see that source is everything with HDTV. Or, as George Fuechsel first said, "Garbage in, garbage out." High-definition sources today come in one of three different resolutions: 1080p, 1080i, and 720p. Comparing the latter two, 1080i has more lines and pixels than 720p, but 720p is a progressive-scan format that should deliver a smoother image that stays sharper during motion. 1080p combines the superior resolution of 1080i with the progressive-scan smoothness of 720p. True 1080p is restricted to Blu-ray, some video-on-demand sources and the latest video games, however, and none of the major networks has announced 1080p broadcasts.

Despite the obvious difference in pixel count, 720p and 1080i both look great. In fact, unless you have a very large television and excellent source material, you'll have a hard time telling the difference between any of the HDTV resolutions. It's especially difficult to tell the difference between 1080i and 1080p sources. The difference between DVD and HDTV should be visible on most HDTVs, but especially on smaller sets, it's not nearly as drastic as the difference between standard TV and HDTV.
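For reference, here are the raw pixel counts behind those format names (a quick sketch; note that interlaced 1080i transmits only half its lines in any one field):

```python
# Pixels per transmitted picture for each HD format. 1080i sends alternating
# 540-line fields, so its detail during motion is closer to 720p than the
# full frame size suggests.

formats = {
    "720p  (full progressive frame)": 1280 * 720,
    "1080i (full frame, two fields)": 1920 * 1080,
    "1080i (single 540-line field) ": 1920 * 540,
    "1080p (full progressive frame)": 1920 * 1080,
}
for name, pixels in formats.items():
    print(f"{name}: {pixels:>9,} pixels")
```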
 
petetherock (Moderator)
Burn in on plasmas

This article is very useful

http://reviews.cnet.com/4520-6449_7-6844370-1.html


1. What exactly is burn-in?
Plasma, like tube TVs and older CRT rear-projection televisions, is a phosphor-based screen technology. Due to uneven wear on the phosphors, if you let a static image sit on your screen for too long, that image can end up leaving a ghost of itself behind--it appears burned in to the screen. The biggest potential for burn-in occurs when you have a high-contrast image--such as bright text set against a dark or black background--because some pixels are turned on to the max while others nearby are completely turned off.
A good example is when you watch 4:3 video on a widescreen display and have black bars framing the image on either side (the pillarbox effect). You also get black bars on the top and bottom of the picture when you watch 2.35:1 movies on a 16:9 display (letterbox); 16:9 is the standard aspect ratio for all HDTVs. Then, of course, there are the news and stock tickers that run across the bottom of the screen when you watch various news channels, including CNBC, Bloomberg, CNN, and ESPN.
Watch TV for a few hours with those images sitting there, and you could end up with an after-image of the bars or the ticker visible on other scenes. These after-images will be most evident when you're watching a brighter scene with the picture filling the whole screen.


4. Are there some simple tips to follow to prevent burn-in?
Our video guru, Senior Editor David Katzmaier, says the potential for burn-in is greatest during the first 100 or so hours of use, "during which time you should keep contrast low (less than 50 percent) and avoid showing static images or letterbox bars on the screen for hours at a time." He personally has a three-year-old 50-inch plasma at home and notices that, after his wife watches the TV in the 4:3 mode (with black bars on either side of the image) for hours on end with no widescreen shows, he sometimes detects those after-images of the bars. But they quickly go away when he watches material that fills the whole screen (or he convinces her to use the gray bars).
"I just don't worry about it," he says. "Yeah, you can get some image retention once in awhile if you look hard enough after hours of static images, but even then it's temporary, not permanent."
Update 02-29-2008: Thanks to some readers' comments below, we have a few other tips to help remove burn-in if it occurs. Commenter gmccnet got good results by recording bright static on a VCR and playing it for 24 hours to almost completely remove the after-image. You could also simply leave a normal, widescreen channel on overnight or longer--just make sure it isn't one that goes to color bars in the early-morning hours. Discovery HD Theater is a good choice.

7. So, if I have CNBC, CNN, ESPN, Fox News, or Bloomberg on all day long, is it a problem?
Look, if your primary use for your TV is watching stations that have stock or news tickers running on them eight hours a day, buy an LCD. The reason you want a plasma is because you can get a big-screen model (50 inches or larger) that offers deeper blacks and better off-axis viewing for less money than an LCD. And if you're a day trader sitting at home, playing the markets during the day and watching movies at night, get a small LCD for daytime use and a big plasma for nighttime viewing. End of story.
 

petetherock (Moderator)
Getting the best deal and what happens if you don't

We all want to have the bragging rights to the best deal, the lowest price.

Sometimes it happens: once we buy something, another person may get a better deal somewhere else, or prices slide and someone else enjoys a lower price.

We should deal with it and move on. That's life.


BUT if the price is unreasonable and we feel cheated, that's when we should seek redress.

Note: this is not about the $50-100 discount someone else scored over your deal. That just means the other person has better bargaining skills.

First, approach the sales rep who dealt with you - that is also why it is important to note your salesman's name and contact. Particularly at a show, if you buy on the last day, make sure you note the branch the salesperson normally works at.

That will also be important if something goes wrong with the set or there are missing accessories.

Next, check with the sales manager or the store head. Note: you have no case if you paid a little more or missed out on a freebie; again, that is down to bargaining skills.

Finally, go to CASE or even the police if it is really cheating - though that is unlikely with displays, unlike smaller items you may buy from small fly-by-night operations.

The bigger companies would like to protect their reputation, and bad press is not good for business. Emphasise this.

However, they don't really care if you paid $xx more with them than at another company.

In such cases, just sit back and enjoy the set.

Cheers


Moral of the story: Do Your Homework
 

petetherock (Moderator)
1.3a is good enough for HDMI and 3D

http://www.cedia.com.au/index.cfm/page/news_detail/id/217


Comments were put together by CEDIA members David Meyer (Kordz) & Michael Heiss
3D WILL run through existing HDMI cables. However, we do believe that 3D will ‘up the ante’ in terms of cable quality, but from a bandwidth perspective, nothing changes... yet. Virtually all currently deployed HDMI cables over 5m in length are NOT High Speed, but rather certified at Standard Speed level, with sufficient headroom to enable 1080p operation. Note that these cables are only certified to 720p/1080i (provided they are certified at all!) and are being used for 1080p beyond their certified level. Most installers don’t really care about certified performance, just “whether it will work” – unfortunate, but fact. This will however become more of an issue moving forward.
1. 3D from Blu-ray has been mandated at an initial maximum of 1080p/24 per eye, meaning effectively a 1080p/48 combined data rate – less than the current 2D standard of 1080p/60. So if an HDMI cable supports 1080p/60 fine now, it will also support 3D from Blu-ray no problem. Note: it is insufficient to talk resolution without referencing frame rate, as resolution alone does not define the data rate. For gaming, the new 1.4a specification (out just last week) requires that 720p/60 per eye be supported, but gamers will likely want 1080p in due course. When this happens I predict that we’ll see gaming go to 1080p/60 per eye, meaning nearly 9.0Gbps (see the sketch after this list) – DEFINITELY High Speed and nothing less. Cables that currently support 1080p/60 in 2D, but without High Speed certification, will NOT support 1080p/60-per-eye 3D. In the meantime though, support for such a high resolution/frame rate has not been made mandatory, and is merely speculative.
2. It is NOT necessary to upgrade to a so-called “HDMI 1.4” cable to enable 3D support. Also, any cable which is referred to by the manufacturer as “HDMI 1.4” is in fact non-compliant due to its breach of the HDMI Logo & Trademark guidelines. So, should you care if a cable is simply mislabelled? Absolutely! Labelling the cable in a compliant manner is the easy part; making the cable to perform in a compliant manner is actually the really hard part. If a manufacturer can’t get the small stuff right, how can they be trusted with the big stuff?
3. For broadcast, the HDMI 1.4a specification mandates support for 720p & 1080i @ 50/59.94/60 refresh rates (NOT 1080p at all), using “over and under” and “side by side” 3D formats. This means both left and right eye images share the same frame, keeping bandwidth the same as current 2D equivalents, but effectively halving the resulting resolution per eye when displayed on screen. Bottom line: Standard Speed HDMI is fine for broadcast.
4. So will HD set top boxes need to be HDMI 1.4 compliant to handle 3D? This all depends on whether the set top box will have any requirement to know that an incoming broadcast signal is 3D, and flag it as such. If so, then firmware will need to be upgraded, effectively changing the device to HDMI 1.4a compliance (I suspect this will be the case). If it’s just a slave and passes the signal through passively, with the broadcaster flagging the content for a display to recognize it as 3D and do its thing, then the boxes wouldn’t need an upgrade and 1.3 spec is fine (highly unlikely). Either way there will not be any hardware change, at least not specifically for the 3D feature. That is, it is expected that all devices will require 1.4a compliance to support 3D, but that does not mean having to buy all new devices – some will simply be firmware upgraded. Sony are already offering this with some of their Blu-ray players.
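For the curious, here is a rough sketch of the data-rate arithmetic behind the figures quoted in point 1. It assumes the standard CEA-861 1080p raster of 2200 x 1125 pixels including blanking, HDMI's TMDS link of three channels carrying 10 bits per pixel clock, and treats frame-packed 3D as simply doubling the 2D frame rate (the real 3D raster differs slightly):

```python
# Rough TMDS data-rate arithmetic, assuming the CEA-861 1080p total raster
# (2200 x 1125 pixels including blanking) and HDMI's 3 channels x 10 bits
# per pixel clock. Frame-packed 3D is approximated as double the frame rate.

RASTER_1080P = 2200 * 1125  # active 1920x1080 plus horizontal/vertical blanking

def tmds_gbps(frames_per_second):
    pixel_clock = RASTER_1080P * frames_per_second  # pixels per second
    return pixel_clock * 3 * 10 / 1e9               # 3 channels, 10 bits each

print(f"1080p/60 2D:              {tmds_gbps(60):.2f} Gbps")   # ~4.46
print(f"1080p/24 per eye (3D BD): {tmds_gbps(48):.2f} Gbps")   # ~3.56, below 2D/60
print(f"1080p/60 per eye (games): {tmds_gbps(120):.2f} Gbps")  # ~8.91 -- 'nearly 9.0'
```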

As for HDMI Ethernet Channel (HEC), this is an optional extra feature of both devices and cables, with the latter requiring the additional label “...with Ethernet” on cables. As Michael says, HEC is not used at all for 3D – this is absolutely true. The Audio Return Channel will use the HEC for best results, but can also still work in “Single Line” mode through cables without the Ethernet Channel. So choosing a HDMI cable with Ethernet Channel opens up support for distribution of Ethernet over HDMI, and the most robust operation of Audio Return Channel. It is NOT required for 3D.

So, suffice to say that HDMI cables that currently support 1080p/60 can continue to be used for 3D from all sources, but with new installations, upgrading to true certified High Speed will certainly give a far superior degree of “future proofing”, especially when considering where gaming is likely to go.

We hope this helps answer some of the mysteries out in the industry.
 

petetherock (Moderator)
Using that calibration disc

For the average consumer who takes a new LCD or plasma TV out of the box, what is the best way to set it up for home use? What is the name of the DVD used for adjusting and fine tuning?

http://blog.ultimateavmag.com/scottwilkinson/out_of_the_box/

Most buyers of a new TV simply take it out of the box, turn it on, and leave it at that. Unfortunately, they probably aren't seeing the best image quality the set is capable of—most likely, it's too bright and too blue, the whites are clipped, and the blacks are crushed.
The good news is that it's relatively easy to improve the picture dramatically. If the TV complies with Energy Star 3.0 energy-saving guidelines, the first time it is powered up, a message will likely appear on the screen asking if it's going to be in a home or retail environment. Select "Home," which typically sets the picture controls so that the TV is not in its so-called "torch" mode—as bright and blue as possible to attract attention on a showroom floor. Remember that your home is nothing like a retail environment, so the optimum settings are completely different. (In a few cases, selecting "Home" puts the TV in its torch mode anyway, so be sure to follow the steps outlined here in any event.)
Next, open the TV's picture menu, find the Picture Mode control, and select Movie, Cinema, or a similarly named setting. This further refines the picture controls to produce the most accurate image possible—that is, the color of gray is as close to neutral as possible, with little or no bias toward red, green, or blue, and the white level is not maxed out, which usually leads to white clipping.
To verify that the color of gray is as neutral as possible, find the control called Color Temperature or Color Tone, which might be in an Advanced submenu. This control has settings labeled Cool, Normal or Standard, and Warm—in some sets, such as those from Sony and Samsung, there are two Cool and two Warm settings. Make sure this control is set to Warm (or Warm2 in a Sony or Samsung set). Most often, the Warm setting produces the most neutral color of gray, which is technically called the grayscale.
If your TV is THX certified, one Picture Mode setting will be labeled "THX," which is generally the most accurate preset mode. In my reviews of such sets, I've found the THX mode to be reasonably accurate, but I was able to get it even closer with some additional tweaking. In most cases, however, the THX mode prevents you from adjusting the picture controls, so if you want to take the next steps as described below, you'll need to select the Movie or Cinema picture mode and make sure the color temperature is set to Warm.
You can call it quits at this point, and the picture will be a lot better than if you had simply taken the TV out of the box and watched with its default settings. But it can be better still. If you want to take the next step, you need a setup disc, which typically costs around $25 to $35—my favorites are HD Benchmark and Digital Video Essentials: HD Basics on Blu-ray and HDTV Calibration Wizard on DVD. (Digital Video Essentials on DVD is more comprehensive, but the menu system is very confusing for consumers.) I especially like HDTV Calibration Wizard for consumers because it guides you through the process step by step. To illustrate the following procedure, I'll use screen shots from HD Benchmark.
One last point before we get into it—try to perform this procedure in the same lighting conditions you will be using to watch the TV. Ideally, you should use subdued or no lighting, but if that's not practical, use whatever lighting will be present while watching, which will affect how you perceive the image.
[Image: PLUGE Low pattern from HD Benchmark]

If you're not using HDTV Calibration Wizard, call up a test pattern called PLUGE—if there's more than one, select the Low PLUGE. Find the Brightness control in the TV's picture menu and raise its value until you see several dark vertical stripes—one will be darker than the background and one will be lighter. The background is technically called "video black," the darkest stripe is said to be "below black," and the lighter stripe is said to be "above black." (In the PLUGE Low pattern from HD Benchmark shown here, there are four stripes—the two on the left are below black, and the two on the right are above black. Also, there's a faint checkerboard pattern in the background. The image above is intentionally misadjusted so you can see the below-black stripes.)
Reduce the Brightness control until the below-black stripe just disappears. In the PLUGE Low pattern from HD Benchmark, both below-black stripes should not be visible, and the checkerboard pattern should be just barely visible. In some cases, you might not see the below-black stripe at all because the TV or player clips anything below video black. If so, reduce the Brightness control until the above-black stripe is barely visible.
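If you are wondering what "below black" and "above black" mean in numbers: standard 8-bit video reserves code 16 for reference black and 235 for reference white, leaving headroom and footroom on either side. A minimal sketch, assuming those studio levels:

```python
# Standard 8-bit studio video levels: code 16 = reference black, 235 =
# reference white. Codes below 16 are "below black", above 235 "above white".

VIDEO_BLACK, VIDEO_WHITE = 16, 235

pluge_bars = {"below black": 12, "video black": 16, "above black": 20}

# With Brightness set correctly, the display renders code 16 and everything
# below it as the same black: the below-black bar disappears into the
# background while the above-black bar remains faintly visible.
for name, code in pluge_bars.items():
    state = "visible" if code > VIDEO_BLACK else "hidden"
    print(f"{name} (code {code:3d}): {state}")
```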
[Image: Clipping pattern from HD Benchmark]

Next, find a test pattern called Contrast or Clipping. This pattern has sections that are above the level called "video white." Unlike black, you want to be able to see above video white—officially, there isn't supposed to be anything above video white, but real program material sometimes does include such information, so my philosophy is to set the TV so it can display this information. Find the TV's Contrast control (sometimes called Picture) and increase it until the above-white part of the pattern looks solid white—it might well look this way at the default setting—then lower it until the above-white areas are just visible.
In the Clipping pattern from HD Benchmark shown above, the white rectangle in the upper left quadrant is the one to look at—it consists of two squares, each with concentric areas of above-white information. At the correct setting, you should be able to see all of these areas as distinct from the others. Ignore the colored squares.
Before moving on, go back and check the PLUGE pattern again to make sure the Brightness is still set correctly. The Brightness and Contrast controls often interact, so you might have to go back and forth a couple of times to get them both right. These are the two most important controls to set properly in order to see the best possible picture on your new TV.
The next step is to set the Color (sometimes called Saturation) and Tint (sometimes called Hue) controls, but these are not as important for a couple of reasons. First, in most modern TVs, they are pretty close to correct out of the box. Second, they often require you to use a blue filter to set them while displaying a test pattern called Color Bars. Most setup discs come with a blue filter, but these filters are notoriously inaccurate, especially with LCD TVs, and they often lead to incorrect settings. Some TVs provide a so-called "blue-only" mode, which disables the red and green colors, making it easy to set the Color and Tint controls accurately, but most don't.
[Image: Color Bars pattern from HD Benchmark]

If your TV doesn't offer a blue-only mode, I recommend just leaving the Color and Tint controls at their default settings, which are probably fine. If your TV does have a blue-only mode and you want to make sure they are set correctly, display the Color Bars test pattern and engage this mode, then raise and lower the Color control while looking at the pattern; you will see some bars and other areas of the image appear to change in brightness relative to each other. Set the Color control so these areas are the same brightness. Repeat this process with the Tint control, which will cause different bars to vary in brightness. As with Contrast and Brightness, Color and Tint are interactive, so you might need to tweak them a couple of times before you're done.
[Image: Sharpness pattern from HD Benchmark]

The last step is setting the Sharpness control. Display a test pattern called Sharpness and increase the Sharpness control until you see white borders around the black lines on a gray background. Then, decrease the Sharpness control until these white borders disappear. In many sets, the best Sharpness setting is its minimum value. This is as far as consumers can go, and it will dramatically improve the picture quality of any new TV. If you want to go the extra mile, you must have a qualified technician perform a grayscale calibration, which requires some serious training and expensive equipment, so such a calibration typically costs several hundred dollars. But it will probably further improve the picture, and you will be assured that the TV is producing the best picture it possibly can.
 

petetherock (Moderator)
Calibration discs

Too many discs can confuse new buyers and newbies to home theater.

Firstly, understand the controls on your TV, amp, and speakers. Go through basic terms like speaker placement, SPL meters, color, brightness, and contrast before you venture further. Also consider whether you use your TV in two lighting conditions – day and night – because you may need a different picture setting for each.

For video, the goal is optimising your settings to give you the best picture quality (PQ); similarly for sound, the goal is the setting that makes you feel part of the sound.

For video, a disc should have the basic brightness, colour, contrast and tint sections; it may also have test screens to centre the picture, and further tests for alignment, bleed, interlacing, etc., but the first few should always be present.

Good test discs also have realistic scenes after test screens to show what real skin tones look like.

Discs should be easy to navigate, understand and also have explanations for each section.

Sound testing can be done after the basic setting up of the speakers: measure their distances and run the built-in auto-EQ functions like Audyssey, YPAO, MCACC, etc. first.

There should be a sound sweep through all the speakers; note that not all discs cover 7.1 channels. Then there should be a frequency sweep from low to high, with enough of a break between frequencies and each tone of sufficient duration (or the ability to advance the section manually) so you can take readings.
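As an illustration of what such a sweep contains, here is a minimal sketch that writes a stepped test-tone file using only Python's standard library. The frequencies, tone length, and file name are arbitrary choices for illustration, not taken from any particular disc:

```python
import math
import struct
import wave

# Stepped frequency sweep: one-second tones at common test frequencies with
# half-second silent gaps, so there is time to read an SPL meter between
# steps. Mono, 16-bit, 48kHz WAV.

RATE, TONE_S, GAP_S = 48000, 1.0, 0.5
FREQS = [30, 60, 125, 250, 500, 1000, 2000, 4000, 8000, 16000]

samples = []
for freq in FREQS:
    for n in range(int(RATE * TONE_S)):
        samples.append(0.5 * math.sin(2 * math.pi * freq * n / RATE))
    samples.extend([0.0] * int(RATE * GAP_S))  # silence between tones

with wave.open("sweep.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)      # 16-bit samples
    wav.setframerate(RATE)
    wav.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))
```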



First, the free options:

Test screens on TV:
Sure it is in SDTV and depends on your reception, but it is a good start.

Few things come free, but the THX-labelled discs all come with basic color and sound calibration tests, which can be very useful.
http://www.hometheatermag.com/advicefromtheexperts/407cali/

Radio:

The BBC male voice is an excellent test of how natural your centre speaker sounds, does it really sound like someone is in front of you? And when you use the 7 channel stereo mode, do all the voices reach your ears at the same time?

Life – BBC

On Disc 1 there is a Hi-Def tune-up setup which is as simple as you can imagine and, IMO, excellent – even better than some of those fancy AVIA or DVE discs!


Those that cost $$:
http://reviews.cnet.com/4520-6463_7-5085739-3.html#3

Sound and Vision Calibration DVD –
Simple with many explanations, good for beginners.


Joe Kane's DVE – available on BR and DVD.

An oldie but a goodie; covers most of what is needed, but ease of use isn't great, especially for beginners.

AVIA - available in BR and DVD.

Newer version of an old hit – comprehensive and pretty easy for newbies – good for mid to high level users.

The Spears & Munsil High Definition Benchmark Blu-ray Edition

If you bought an Oppo BDP-83, this comes bundled free and is good for beginner to mid-level users.


HD HQV Benchmark Blu-Ray Disc
http://www.hdtvsupply.com/hqv-benchmark.html

This is a mid to high level test disc, but it is not too hard to use.
 
petetherock (Moderator)
Taking your first steps into Hi Def?

Getting a new Blu-ray player?

Then take note of these steps:



- what is your viewing distance?

Take some time to read the posts above on this: if you sit pretty far away from a smallish 40" set, say more than 2m, then you are not really using the full benefits of a Full HD TV.

- what is your viewing diet?

If you are just trying out BR, the majority of your viewing will still be free-to-air channels, which can be as bad as VCD quality, and a Full HD LCD will only make that look worse. Otherwise, sit closer to enjoy full HD and move back for SD TV.

- Are you watching BR solely?

I use a HD Ready 50" plasma for SD TV at 3m. The images are great.
I use a Full HD 50" plasma for all my BR at 2.3m. The details are awesome. Sit any further and you won't appreciate the details. If you use it at the same distance for SD TV, it will be too close.

So a compromise is needed. I don't think many of our members use their single TV in the hall for all Hi Def viewing. So where you place your chair is important as a halfway point to allow you to appreciate Hi Def whilst not making Channel 5 look bad.

YMMV... but don't follow blindly and upgrade to a new Full HD TV for nothing.

Also the video chips / scalers will differ. If you watch BR on a Full HD TV, you won't want any processing - use the "Direct" mode.

I have previously posted in this thread on scalers and video chips; it will be worth your while to read it and explore.

Cheap Full HD TVs will have lousy video chips, so if you need the video processing, it matters. Processing gets used for SD TV, DVDs, videos ripped to hard disk, and even PC work. There is no free lunch, and going cheap will mean compromise somewhere.

Choose wisely instead of listening to those who recommend Full HD blindly.

Let your eyes be the final judge.
 

petetherock (Moderator)
HDMI:

This is an excellent link:
http://www.hdmi.org/learningcenter/kb.aspx?c=7#49

Note that the terms 1.3 and 1.4 are no longer in use, so beware if someone is trying hard to sell you a cable based on this.

There is only "Standard" or "High Speed".

The main difference is that the former is only rated up to 720p/1080i.
Q. What is the difference between a “Standard” HDMI cable and a “High-Speed” HDMI cable?
Recently, HDMI Licensing, LLC announced that cables would be tested as Standard or High-Speed cables.

Standard (or “category 1”) HDMI cables have been tested to perform at speeds of 75 MHz or up to 2.25 Gbps, which is the equivalent of a 720p/1080i signal.
High Speed (or “category 2”) HDMI cables have been tested to perform at speeds of 340 MHz or up to 10.2 Gbps, which is the highest bandwidth currently available over an HDMI cable and can successfully handle 1080p signals, including those at increased color depths and/or increased refresh rates from the source. High Speed cables are also able to accommodate higher resolution displays, such as WQXGA cinema monitors (resolution of 2560 x 1600).
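Those two bandwidth figures follow directly from how HDMI signals: three TMDS data channels, each carrying 10 bits per pixel clock, so Gbps = MHz x 30 / 1000. A quick check:

```python
# Cable speed = pixel clock x 3 TMDS channels x 10 bits per channel.

for label, mhz in (("Standard (category 1)", 75), ("High Speed (category 2)", 340)):
    gbps = mhz * 1e6 * 3 * 10 / 1e9
    print(f"{label}: {mhz} MHz -> {gbps:.2f} Gbps")

# Standard (category 1): 75 MHz -> 2.25 Gbps    (720p/1080i)
# High Speed (category 2): 340 MHz -> 10.20 Gbps (1080p, deep color, WQXGA)
```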
 

Dr.Vijay (Administrator)
[Article] HardwareZone's HDTV Buying Guide Essentials

Unknown to many, buying a television for your home involves more preparation work than anticipated. With our HDTV Buying Guide Essentials, you can now equip yourself with the necessary knowledge to make the right purchase instead of regretting an impulsive buy later.

Read the full article here...
 

Dr.Vijay (Administrator)
[Article] HardwareZone's 3D TV Buying Guide Essentials

Should you go with active or passive? With a multitude of 3D TV technologies and models available in the market, it isn't easy to decide on the right television type, with TV manufacturers all clamoring to offer the best 3D experience in the industry. Not to worry though, for we're here to help.

Read the full article here...
 