Tuesday, 18 September 2007

More Resolution Confusion…

In an earlier post I alluded to the fact that the human eye may be the limiting factor in the combination of screen size, resolution and viewing distance. And while that is true to a large extent, it needs some clarification when you're choosing a display and working your mind over the question of whether you need to spend the extra cash for full 1080p resolution, or whether 720p at a cost savings will be adequate, or indeed an imperceptible difference. Well…things are never simple.

Fortunately, the question of 1080p vs 720p may never come up for a growing number of home theater enthusiasts. It’s simply a question of future-proofing your purchase. If you buy a 1080p display today, it’s going to be the flat-out highest resolution display you can buy, at least for the near future. There are already higher resolution projectors in the works, but as far as displaying pixel-for-pixel any HD source available, 1080p will do it. So why even concern ourselves with something less? Oh yeah, money. That again.

As I wrote earlier, the average pair of 20/20 eyes can only see things as small as 1/60 of a degree of arc wide or tall. That means that if a single pixel of your display is smaller than that, you can’t see it as an individual pixel. And that would seem to indicate that if a 720p display is small enough or far away enough, you wouldn’t be able to see its pixels, or those of a similar sized 1080p display, so what’s the diff? Ok, sit down.

The first problem with the 20/20 vision limit is that while that is considered “normal” vision, it is now apparent that many people either have better than 20/20 vision, or have had their vision corrected (notably via contact lenses) to somewhere closer to 20/10. In other words, they can see 1080p pixels much farther away than a person with 20/20 vision, and 720p pixels will be visible to more people than we thought. A 1080p display would offer those eagle-eyed viewers a noticeably better picture.

Second, you have to realize that since a lot of broadcast, cable, satellite and HD disc content is 1080i or 1080p material, something has to happen to scale that image down to 720p. And, to add insult to injury, most plasma TVs aren’t actually 1280 x 720 pixels at all, but rather 1366 x 768, meaning that even native 720p material must be scaled a bit. Scaling downward is a tricky business, and not for the squeamish. Obviously, if you didn’t have to scale at all, you’d be better off, and that’s what 1080i or 1080p material is like on a 1080 display. Any other resolution means scaling. And scaling means artifacts that can be much bigger than the pixels themselves, and more easily seen by even our mediocre 20/20 eyes. So, again, advantage 1080p.
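If you want to see why the scaling is unavoidable, here's a minimal sketch (Python, purely for illustration) of what a 768-line panel has to do with a 1080-line source:

    # A rough sketch of the resampling a 1366 x 768 panel must do to a
    # 1920 x 1080 source. The scale factor is not a whole number, so
    # nearly every output pixel lands "between" source pixels and must
    # be interpolated; that interpolation is where artifacts come from.
    src_w, panel_w = 1920, 1366
    scale = src_w / panel_w        # ~1.4056 source pixels per panel pixel
    for x in range(4):
        print(f"panel pixel {x} samples the source at {x * scale:.3f}")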

So, three reasons to opt for 1080p:
1. The 20/20 vision argument isn’t really about 20/20 vision; it should more appropriately be the 20/10 argument. That pushes the minimum viewing distance based on acuity out farther.

2. Scaling artifacts may be more visible than the pixels of a 720p display, so why scale if you don’t have to?

3. It's about as "future-proof" as you can get. 1080p is likely to be the flat-out highest resolution that source material will be available in for quite a while.

Got it? 1080p is better, so if you can, get it.

Whew! I now wish to take aim at a misconception so prevalent that one of our distributors (who should really know better) casually said to me in conversation that 1080i is half the resolution of 1080p. I’m lining that one up squarely in my telescopic sight. So listen up.

The resolution of anything called 1080 (i or p) is 1920 x 1080 pixels. There are two other important aspects, though: the frame rate (60, 30 or 24), unfortunately also noted as a number with a p behind it (i.e. 24p), and the method of scanning, either progressive or interlaced (1080p or 1080i). So the full specification of a video format should include the resolution, the interlace or progressive indicator, and the frame rate, as in "1080p 24p". If you don't have both parts, you don't have the whole story. There are actually many different possibilities. The Advanced Television Systems Committee has published these recommended standards:


Vertical Lines    Pixels    Picture Rate
1080              1920      60i, 30p, 24p
720               1280      60p, 30p, 24p

That's how the ATSC defined it, and I was surprised how hard it was to find, buried in their technical specification documents. Somehow the consumer electronics industry has managed to re-write these figures in a much more confusing way (what a shock). Here are the same formats again:


1080p 30p
1080p 24p
1080i
720p 60p
720p 30p
720p 24p


And to make matters worse, the second figure is typically left off. The result is:
1080i (which is really 1080 lines by 1920 pixels at 60 fields, interlaced)
1080p (which is really 1080 lines by 1920 pixels at either 30 or 24 progressive frames)
720p (which is really 720 lines by 1280 pixels at 60, 30 or 24 progressive frames)

So when we ask "Is 1080i better than 1080p?" we are really asking "Is 1080i 60i better than 1080p 30p?" And we can reduce that to the difference between scanning an image with an interlaced pattern twice as often as a single progressive scan. And there it is: once every 1/30th of a second, or twice, interlaced, every 1/30th of a second. It amounts to what happens between scans. In the case of a progressive scan, the camera shutter snaps the image, and the entire image is scanned in one pass. In the case of an interlaced image, the camera snaps an image, and the odd lines are scanned; then the camera snaps another image 1/60th of a second later, and the even lines are scanned. There is a 1/60th of a second time offset between interlaced scans. That does two things. First, motion may appear smoother (more scans per second can do this) and second, objects in motion may appear to have jagged edges (caused by half the scan resolution). But all is not lost. Motion adaptive processing within the display device can (and should) re-assemble all this interlaced mess, a process called deinterlacing, using interpolation to fill in what the missing lines should have looked like.
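Here's a toy sketch of what interlaced capture and a naive re-assembly look like (Python, purely illustrative; real deinterlacers are vastly more sophisticated):

    # A toy model of interlace, with a "frame" as a list of scan lines.
    # 1080i sends the odd lines from one camera snap, then the even lines
    # from a second snap taken 1/60 second later. A naive "weave" stacks
    # them back together, which is only correct if nothing moved between
    # the two snaps; moving edges come out jagged, and that's what the
    # motion-adaptive deinterlacer has to clean up.
    def capture_fields(snap_one, snap_two):
        odd_field = snap_one[0::2]     # lines 1, 3, 5... from the first snap
        even_field = snap_two[1::2]    # lines 2, 4, 6... taken 1/60 s later
        return odd_field, even_field

    def weave(odd_field, even_field):
        frame = []
        for odd_line, even_line in zip(odd_field, even_field):
            frame += [odd_line, even_line]
        return frame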

How adept the display’s deinterlacing circuits are determines how much motion-jaggy artifact you see, and less artifact is better, but usually more expensive to obtain. But we’re only talking about 1080i video here, which is a 60Hz based frame rate. What about film, which is 24 frames per second?

Just to get a 24fps film to work at all on a 60Hz based display system takes a process called ‘telecine’. Going back to the early days of TV, engineers found that just projecting a 24fps film onto a camera image tube scanned at 30fps results in a rolling flicker bar due to the mismatched frame rates. Normal film projectors use two-blade shutters that expose each frame to light twice, for an end flicker rate of 48Hz. That simply doesn’t play well with the 60Hz field rate of NTSC TV. Early telecine machines were projectors that focused their image on a TV camera image tube, with five-blade shutters that were synchronized to the TV station’s house sync signals. The five-blade shutter means each frame is “exposed” to the image tube five times, resulting in a flicker rate of 120Hz, which is a multiple of 60Hz. That got rid of the rolling bar, and left us with an odd way of converting film's 24fps frame rate to 30fps TV. Ultimately, what took place was a process called 3-2 pull-down, pull-down referring to a projector pulling the film down one frame at a time into its gate. In 3-2 pull-down, as a 24fps film gets transferred to video, a cadence, or pattern, of 3 video fields, then 2 fields, then 3, and so on, is used. The resulting video is 30fps with 24fps embedded. Some displays are able to detect and undo this 3-2 cadence, and make it back into 24fps. Not very many displays can deal with the resulting 24p video, so it has to then be converted, sometimes again with interpolation, to a frame/field rate the display can handle. The short story around all of this is that for material that originated on 24fps film, the differences between 1080i and 1080p drop into irrelevance.
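The cadence is easier to see in code than in prose. A minimal sketch in Python:

    # The 3-2 cadence: four film frames (A B C D) become ten video fields,
    # so 24 frames per second becomes exactly 60 fields per second
    # (24 x 10/4 = 60). A pulldown-aware display looks for this repeating
    # pattern and throws the duplicate fields away to recover 24p.
    def three_two_pulldown(film_frames):
        fields = []
        cadence = [3, 2]                 # 3 fields, then 2, then 3...
        for i, frame in enumerate(film_frames):
            fields += [frame] * cadence[i % 2]
        return fields

    print(three_two_pulldown(list("ABCD")))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']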

What is relevant is how well a display deals with all of these interlace/progressive and various frame rate issues. It’s all about interpolation, image processing, and display refresh rates. So the simple answer is (get ready, this is what you've been waiting for…) that the differences between 1080i and 1080p are irrelevant for some material, and depend on how elegant the video processing is for the rest.

At this point you may be asking for a simple answer. There isn’t one, except this: for critical applications, like home theaters, media rooms, or your large primary screen, stick to 1080p displays. For the 42” TV in the bedroom or rec-room, you could save a little money on a 720p set. It’s highly likely that in a few years, 720p sets will be largely off the market, because the cost of 1080p displays will have dipped to match them, and most of the market wants the bigger numbers. After all, once you open your checkbook, doesn't 1080p sound better than 720p? I thought so.

There's much more to picking out the right display for your home theater or media room. Call the experts at Platinum Home Theaters for professional assistance and competitive pricing!

Wednesday, 22 August 2007

You Light Up My Screen....

Back to the old screen brightness thing again. This time, the question is, “How do I pick a projector that will light up my screen to 16fl?” I wish this were easy. Actually it is. You call Platinum Home Theaters, and we design your theater for you. We take care of sight lines and sound, seating and acoustics, screen size, shape, material, projector, projector position and lens, control systems, décor, the whole works, or any part of it. You get what you really want, you don’t take a chance and spend your hard-earned money on something that doesn’t work, and we both walk away smiling. You get our knowledge and expertise; we get you as a client and friend. But so much for the easy way. I know some still want to duke it out with physics, so here we go again. Got your slide rule ready?

In an earlier post, we noted that projector light output is measured in luminous flux, with the lumen as the unit; it’s actually a unit of luminous power (1/683 of a watt at 555nm, for those who care). ANSI Lumens are figures obtained with a defined test procedure that makes projector comparison more valid, and Home Theater ANSI Lumens defines the test procedure more tightly, and more realistically, as projectors are only tested after they are ISF calibrated to the D6500 color standard. Lower numbers result due in part to the fact that a metal-halide lamp emits a native white light that is a different color than D6500, and calibrating a projector can only be done by inserting precise amounts of light loss in controlled areas of the spectrum. Note I said “light loss” – you can only reduce the intensity of a particular color; you can’t add it in magically if it isn’t strong enough. To increase the relative strength of one color, you actually have to lower the other two. The result is proper color rendition, but lower overall screen brightness. Sadly, not many projector manufacturers want lower lumen numbers, so very few use Home Theater ANSI Lumens.

As you look at projector data, do you find your eyes glaze over? They should. A look at any projector database reveals tons of specs that are all hard to digest. But let’s try. The biggie is Brightness, usually in ANSI Lumens. Remember, this is a measure of luminous flux, or all the light coming out of the projector. All that luminous flux has to hit a screen before it makes a picture, so we’re only part-way there with ANSI lumen figures. But it’s a start. Ignoring, for the moment, that you are dealing with an uncalibrated projector operating in “torch” mode (full uncalibrated white output) for that big beautiful ANSI lumen figure, here’s how you calculate the resulting screen brightness in foot-lamberts:

Brightness (fl) = Projector Light Output (ANSI Lumens) / Screen Area (W x H, in square feet)

So if we had a 1300 ANSI lumen projector and a 10 foot wide 16x9 screen…
Screen area is 10 x 5.625 = 56.25 sq ft
1300 / 56.25 = 23.11 fl
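For the spreadsheet-averse, here's that same math as a small Python function (the gain parameter is my addition; we'll get to screen gain in a moment):

    # The back-of-napkin math from above as a function. Gain 1.0 means a
    # plain matte white screen; higher-gain screens bounce back more light.
    def screen_foot_lamberts(ansi_lumens, width_ft, aspect=16 / 9, gain=1.0):
        height_ft = width_ft / aspect
        area_sq_ft = width_ft * height_ft
        return gain * ansi_lumens / area_sq_ft

    print(screen_foot_lamberts(1300, 10))   # ~23.1 fl, matching the example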

I have no idea why water analogies work so well; perhaps it’s the fact that everyone has used a hose. So here’s what’s happening from a wet point of view. Let’s say your projector is really a hose with a nozzle pointed downward, and a bucket below is the screen. Your hose sprays out exactly one gallon of water into that bucket. But now you raise your hose, or widen out the spray (take your pick) to cover, evenly, 16 buckets. You still spray out only one gallon, but it now disperses over all those buckets. How much lands in any one of them? 1/16th of a gallon, right? The water in the buckets is the light on your screen. The more screen you have to fill, the less light hits any given area. It’s called the “Inverse Square Law” and is pervasive in light, sound, and all other forms of energy. And that’s what we are working with when figuring out how bright your screen will be with a given water output of your projector…or should I say light? If you get water out of your projector, write and tell us at once!

How much lower will the calibrated maximum be? It depends on the projector, but it’s always lower. Remember, you lose light in calibration to D6500. As you can see, though, a 1300 ANSI lumen projector might be a little dim once calibrated.

But then there’s contrast. Your eye likes contrast, and is indeed capable of a huge contrast ratio. Projectors, while getting better, are not as capable. But it may not matter anyway. Take a look at your projection screen with the projector off and the lights on. What color is it? White, right? Or light grey? What you’ve just seen is actually black, or at least as black as black would be on that screen with the room lights up. How do you get that huge contrast ratio of 2000:1? Simple. You gotta make the room dark…very dark. The reason is that your projector has a bright light limit that we just talked about. Even when it’s turned off, or lens capped, the blackest black will be defined primarily by the ambient light falling on the white screen. If you get that to nearly zero, you will start to realize that contrast ratio.

Here’s an example. Say your projector has a specified 2000:1 contrast ratio, you have a white screen with a gain of 1, and a 1300 ANSI lumen projector. With a little soft incandescent light in the room of only 1 foot candle (geezopete, another measurement unit!), your 2000:1 gets bumped down to 34:1. Let’s cut that light on the screen down to a measly .1 footcandle. Bingo, you now have 286:1. Still not great. In fact, to get even close to that 2000:1 spec, you’ll have to get screen ambient light down to .001 footcandle.

Now let’s work it from the other side. Say we can’t cut screen ambient light, for some reason, like windows (!). Let’s say screen ambient light sticks at 3 footcandles, and that’s the best we can get it. If we could get a bright enough projector, we could still get your 2000:1 contrast ratio back, right? Um….sort of. We could, if we only had a projector that could spit out 100,000 ANSI lumens! And then we’d land at around a 600:1 contrast ratio, which might almost be acceptable. Too bad we can’t get a projector like that in through the front door. Or plug it into a standard wall outlet. Or pay the bill for it.
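Here's a simplified model of what's going on, assuming a gain-1 screen where 1 footcandle of ambient light reflects back as 1 foot-lambert. The 33.6 fl calibrated white level is back-figured by me from the numbers above, so treat the whole thing as a sketch, not a spec:

    # Simplified model: ambient light raises black and white equally.
    # Assumes a gain-1 screen (1 footcandle of ambient ~= 1 fl reflected)
    # and a calibrated white level of about 33.6 fl, which is roughly what
    # the figures above imply. Your projector and screen will differ.
    def effective_contrast(white_fl, native_ratio, ambient_fc):
        black_fl = white_fl / native_ratio
        return (white_fl + ambient_fc) / (black_fl + ambient_fc)

    print(round(effective_contrast(33.6, 2000, 1.0)))     # ~34:1
    print(round(effective_contrast(33.6, 2000, 0.1)))     # ~289:1
    print(round(effective_contrast(33.6, 2000, 0.001)))   # ~1888:1, dark room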

The secret: you have to keep light off the screen to save projector lumens and dollars.

We’ll deal with plasma and LCD screens another time, but keep this in mind…look at a plasma screen turned off in a lit room. What color is it? Black. And that’s the black you’ll get in that kind of room light. Hmmmm!!

Tuesday, 21 August 2007

….and the War Continues...

Of course we’d have to declare a winner to perpetuate a war, right? That’s about what’s happened in the HD DVD vs Blu-ray Disc wars. About two months ago, we blogged that Blu-ray Disc seemed to be the de facto winner, due mostly to the news that Blockbuster had announced that they would only be renting Blu-ray Discs, based on their in-house test marketing. Added to the victory chant was the news that Blu-ray software sales seemed to be leading. Then HD DVD claimed an equally early victory based on hardware sales, and the fact that HD DVD players are cheaper than Blu-ray Disc players (still true), and so would capture the hardware market.

I think we declared a qualified draw at that point. And if not, we should have. Here’s the latest news from the front:

Two studios have announced they will now exclusively support HD DVD: Paramount Pictures and DreamWorks Animation SKG. Paramount Home Entertainment claims they will publish day-and-date new releases as well as catalog titles on HD DVD only. This decision overturns Paramount’s earlier decision to support both formats.

There are now 3 major studios exclusively supporting HD DVD: Paramount, DreamWorks, and Universal. Weinstein Company is also an HD DVD supporter, but has not singled that format out exclusively yet. Paramount also claims that their decision will include all films they distribute by Paramount Pictures, DreamWorks Pictures, DreamWorks Animation, Paramount Vantage, Nickelodeon Movies, and MTV Films, all under the Paramount Home Entertainment umbrella.

The exception to the rule? Who else but Steven Spielberg, who hasn’t committed to either format; Close Encounters was recently slated for Blu-ray Disc release.

Blu-ray Disc is supported, again exclusively, by Sony Pictures, Disney, Fox, MGM and Lions Gate.

There is only one studio actively supporting both formats, and that’s Warner Brothers, with their unique dual-format disc (HD DVD on one side, Blu-ray Disc on the other).

What on earth could cause studios to suddenly cuddle up to HD DVD? Could it be money? Naw…ok, maybe….ok, yes, it probably was. The site www.deadlinehollywood.com seems to think that some mysterious backers paid off Paramount and DreamWorks to the tune of $150 million for their exclusive HD DVD jump. The studios’ official comments are not even worth quoting.

Ding…..Round 2…or is it 3…or 4?

Monday, 20 August 2007

Screen Brightness - Lumens, LUX, and Lamberts

I recently had a conversation with my friend Mike Lake about the brightness of theater screens, and how that brightness relates to home theater screens. During the conversation Mike challenged me to not only define target home theater brightness levels, but to come up with a way for the average home theater enthusiast (who’s that?) to measure screen brightness using something commonly owned, like an SLR or DSLR camera, or a light meter like a 1 degree spot meter.

Thanks for the challenge, Mike! Here’s what I came up with.

Screen Brightness Defined

Lumens, LUX, and Lamberts

A lumen is a standard unit of luminous flux…if you wish, it’s how much light is emitted. When a projector’s light output is measured in lumens, it is an attempt to place a number on the maximum amount of light coming from the projector. It is not how bright your screen will be!

LUX is a measurement of illuminance. It takes into account the area over which the luminous flux (lumens) is spread. For example, 1000 lumens spread over one square meter results in 1000 lux. Back the light off until the same luminous flux now fills 10 square meters, and you get 100 lux over that area.

Screen Brightness, or how bright a screen will look, involves measuring the light reflected from its surface to our eyes. It takes into account the luminous flux (lumens) falling over its entire area (lux) and how reflective the surface of the screen is. It literally is a measure of the light bouncing off the screen. Luminance is measured in foot-lamberts.
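To tie the three units together, here's a little Python sketch. The 0.0929 conversion factor assumes an ideal diffuse, gain-1 screen, so take it as an approximation:

    # One light path, three units: lumens leave the projector, lux lands on
    # the screen, foot-lamberts bounce back toward your eyes. For a
    # perfectly diffuse gain-1 screen, 1 lux reflected is ~0.0929 fl
    # (since 1 fl = 3.426 cd/m^2 and a diffuse surface returns E/pi cd/m^2).
    def lux_on_screen(lumens, area_sq_m):
        return lumens / area_sq_m             # illuminance

    def reflected_foot_lamberts(lux_value, gain=1.0):
        return lux_value * gain * 0.0929      # luminance off the screen

    print(lux_on_screen(1000, 1.0))           # 1000 lux, as in the example
    print(lux_on_screen(1000, 10.0))          # 100 lux spread over 10 m^2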

How bright a screen is has an impact on the image presented, and not in a small way. To help quantify screen brightness, a test method had to be standardized. In film projection, the measurement is made with the projector running and no film in the gate. With digital projectors, it’s a 100% white image. It’s interesting to note that the two are not identical: film base attenuates the light through the projector, so a white film frame would measure lower, but digital projectors use no such film, so 100% is 100%. The target luminance is between 12 and 22 foot-lamberts (fl). The target is 16fl, though a group of surveyed viewers much preferred the 22fl screen brightness. Many movie houses are dimmer, around 7-10fl. Yes, it’s a cost thing. Xenon bulbs are expensive, and last longer if you don’t burn them as bright.

Here's your first answer: The target luminance for a THX Home Theater Screen is 16fl, same as a commercial theater, and brighter would be better...and more expensive!

Note at this point that we are talking foot-lamberts, not lumens. They are not the same, and don’t even really relate directly to each other. To reiterate, a lumen is a standard unit of luminous flux. A projector that provides 1000 lumens of light will provide that flux regardless of how big the screen is, or how far away it is. To change the luminous flux of a projector you have to do something in the light chain, like boost the lamp current, or get a bigger (faster) lens. Think of it as the total amount of light emitted. The current standard for projectors is known as ANSI Lumens. ANSI, the American National Standards Institute, has standardized the method used to test projectors. The method involves, among other things, testing multiple areas of the light source. Home Theater ANSI Lumens is a measurement standard created by Runco International, and most significantly differs from ANSI lumen measurement in that the projector under test is first calibrated to ISF (Imaging Science Foundation) standards at 6500K, the color temperature required for an accurate video image. The projector is then targeted at a standard screen, and the resulting light falling on the screen is measured at 9 points with a LUX meter, then averaged and multiplied by the surface area of the screen. The resulting measurement is much lower than the standard ANSI lumen equivalent, but is a better indicator of projector performance than the measurement of a projector running wide-open and uncalibrated.

Back to the screen. If everything in your home theater design was correct, you should hit the same 16fl luminance target figure that theaters try for. In fact, the THX Home Theater standard is 16fl, but they talk about trying to duplicate the image seen in mastering houses, which calibrate 100% white to 35fl. That’s quite a range!

Now, how do you know you’ve got it right? The best way is to measure luminance in foot-lamberts directly. If you have the right kind of light meter (Konica-Minolta makes an industrial unit for this purpose), you just aim it at the screen, pull the trigger, and read your meter. Of course, you don’t have that meter. But, like Mike, you may have a DSLR or film SLR camera with a built-in light meter. Well, you’re almost there. It takes a bit of math, though.

Second Answer: Measuring Luminance in Foot-Lamberts with a camera

With a camera, it’s best to use a telephoto lens, or get close to the screen. Your object is not to try to measure the entire screen, but try for a small section, ideally, 1/9th. Set your camera for ISO 100, and your shutter speed to 1 second. This places your camera in the range where useful EV figures can be converted to foot-lamberts with our little chart. We picked 1 second because many new zoom lenses only open to f4 or so, and we need the extra sensitivity. With a test DVD (any THX certified DVD has the THX Optimizer on it, which will work fine), put up a 100% white frame, and take a light reading by pointing your camera at the white area and noting the f-stop and shutter speed. Plug them into the formula:

EV = 3.3 x log10(f² / T)
where:
f = f-stop
T = exposure time in seconds
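If you'd rather let the computer do it, here's the same conversion as a Python sketch. Note that 3.3 x log10 is just log2 in disguise (1/log10(2) is about 3.32), and the EV-to-foot-lambert constant below is back-figured by me from the chart that follows, so treat it as approximate:

    # Turn a camera reading into foot-lamberts. EV at ISO 100 is
    # log2(f^2 / T). The 0.041 constant comes from fitting the chart
    # below (EV 9 at ISO 100 reads about 21 fl); it is an approximation,
    # not gospel.
    import math

    def exposure_value(f_stop, seconds):
        return math.log2(f_stop ** 2 / seconds)

    def foot_lamberts(ev):
        return 0.041 * 2 ** ev

    ev = exposure_value(16, 1)          # f/16 at 1 second, ISO 100
    print(ev, foot_lamberts(ev))        # EV 8.0 -> ~10.5 fl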

We include a chart for this, if the math is too hard. The chart is limited, but you can get some useful luminance data with it anyway. (For the technologists, we’ve stuck with the Minolta recommended K of 1.3.)

Your target is 16fl, which is between EV8 and EV9, or between f16 and f22 (ISO100, 1 sec)

Spot-meter or camera set to ISO100

100% white screen, Table is below...

EV    1 sec    .5 sec    ft-L
 1    f/1.4    f/1.0
 2    f/2      f/1.4
 3    f/2.8    f/2       0.33
 4    f/4      f/2.8     0.65
 5    f/5.6    f/4       1.3
 6    f/8      f/5.6     2.6
 7    f/11     f/8       5.2
 8    f/16     f/11      10
 9    f/22     f/16      21
10    f/32     f/22      42
11    f/45     f/32      84


Ok, the challenge is met! The next challenge is finding a projector that will hit 16fl reflected from your screen, given its size. Oh, and is 16fl really enough, given a high level of ambient light in the room? At some point, I’ll need to get paid for this stuff….

Thursday, 2 August 2007

42” TVs – “the new 34”

Yes, we’re on about screen size again. Why? Simple. It’s the single biggest factor in choosing a new digital TV or projector. This post will place a “perspective” on two issues that are pervasive in the TV selection process.

First, is 42 bigger than 34? Now, you math whizzes, don’t freak out. But the answer is a qualified “NO”! And anyone who’s “upgraded” from a 34” standard TV to a new 42” plasma or LCD will agree.

As we’ve noted, the new shape of screens is the wide 16x9 aspect ratio. But much of what’s available on broadcast, satellite, and digital cable TV is still the old 4x3 shape. Just watching the old shape programs on your new TV causes consternation. If you do nothing, you end up with what looks like a small, square-ish picture inside your big wide screen. And it is indeed no bigger than your old 34” set! The height of the screen tells the tale. For a 34” diagonally measured TV, the screen height is just over 20”…the same as your new 42” wide screen. Hmmm! So, even though the screen is bigger, the standard 4x3 video image isn’t. Unless….

The second issue is more than a bit annoying. In fact, I’m wincing as I’m writing this paragraph. Because some people don’t like to see the little square image inside their big TV, or are bothered by the black side bars beside it, TV manufacturers build in a ‘stretch’ function that distorts the 4x3 image by stretching it to fill the new 16x9 screen. OK, fine, if you gotta have that feature. But rather than seeing the 4x3 picture as it is supposed to be, it now fills your screen with people that look bloated and overweight, car wheels that look like eggs, fat-faced newscasters, extra-wide cereal boxes, and can it really be that those pencil-thin models now look...um...normal? I'll bet seeing themselves that way is enough to drive some of them to anorexia nervosa. So is a TV full of chubby, stretched pictures really a good trade-off? The purist fairly shouts “No!”, and yet you can’t go into a bar, restaurant, or retail store without seeing bloated pictures. We all know the camera adds 10lbs, but just how many cameras are on these people? 4? We should all complain, and lobby for ‘un-stretched TV’! Snatch that remote and hit the "aspect" button until it looks right. I’m stumbling off my soap box now.

The point really is, for much of what we watch, a new 42” set is no bigger than an old 34” set. Just be aware of that fact when you nervously tap your credit card on the check-out counter. You’re not really buying bigger. In fact, we think 42" should not even be called a "big screen" at all. We know they are all now on sale, and some are even much less than $1000. Just know exactly what you are buying: it isn't all that big.
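Want to check the math on your own sets? Here's the height calculation as a quick Python sketch:

    # Picture height, not the diagonal, is what a 4x3 program actually uses.
    # For a w:h aspect ratio, height = diagonal x h / sqrt(w^2 + h^2).
    import math

    def picture_height(diagonal, w, h):
        return diagonal * h / math.hypot(w, h)

    print(picture_height(34, 4, 3))    # old 34" 4x3 set: ~20.4" tall
    print(picture_height(42, 16, 9))   # new 42" wide screen: ~20.6" tall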

Here’s a link to Screen Math, a site dedicated to analyzing the size and shape of TVs. http://www.screenmath.com See our earlier posts about screen size to learn how big a screen you should really consider and why.

Friday, 20 July 2007

HD-DVD vs Blu-ray war…not quite over yet?

A few weeks ago we posted a story that indicated the “war” was over…at least, according to the Blu-ray folks. We also noted that since Blockbuster now rents only Blu-ray, a major battle had been won.

This week, the HD-DVD camp retaliated with stats indicating their growth is ahead of Blu-ray’s. We don’t have the details, and they could easily be “Lying with Statistics”, but they claim HD DVD hardware sales growth of 37% and software sales growth of 20% for 1Q 2007, while Blu-ray hardware sales were down 27% and software sales were down 5% in the period from 1Q to 2Q 2007.

Hmmmm!

Ken Graffeo, Universal Studios Home Entertainment HD strategic marketing executive VP, stated “The numbers are clear — HD DVD is steadily gaining momentum and market share…” and added “With HD DVD CE players now at MSRP prices starting at $299 and with strong marketing campaigns around new HD DVD titles with Web-enabled interactive features, we’re continuing to raise the bar for the consumer experience.” Of course, Ken is also co-president of the HD DVD Promotional Group. What would you expect him to say? That Blu-ray is better, or winning in any possible way? Don’t think so.

The Blu-ray folks still hang on PS3 for their numbers. Andy Parsons, Pioneer Electronics advanced product development senior VP and representative for the Blu-ray Disc Association, responded with, “What’s interesting is that [the HD DVD Promotional Group] keeps trying to disregard the importance of the PS3. It sounds like they are trying to redefine the story a bit. Our position is that you don’t try to separate the traditional home theater player from the PS3 because we know that there are a significant percentage of people who own PS3s who are using them to watch movies. There is no way we could be outselling those guys 2-to-1 on the titles as we’ve been doing since the beginning of this calendar year if not for PS3.”

Parsons went on to acknowledge that dedicated hardware sales of HD DVD players were driven by price, and the recent price drop of Toshiba’s player to $299 boosted their numbers. But he then added this highly astute observation: “we continue to think it’s content that drives the whole market, not hardware.”

We agree…to some extent. When the price of “entry” into hardware is significantly higher than the competitor’s, and the available content is more or less similar, the cheaper hardware system wins. But if available content is significantly more diverse, available, cheaper or of higher quality (not the case with this war), then content wins. Evaluating content isn’t so easy; it’s at least in part subjective. Soooo…..

Let the battle continue, and let the consumer win!

Source: TWICE www.twice.com

Wednesday, 18 July 2007

How Big Should My Screen Be?

A simple answer might be “how much money do you have?”, but that would ignore the root problem. And it’s not about money, but satisfaction with the result anyway.

The choice of a screen size is actually driven by emotional response. The single most popular screen size in the market today is 42”. That’s where the best prices are, and a 42” set doesn’t demand much from the owner in terms of mounting, or placing on a piece of furniture. In that scenario, the choice is driven only by price and physical size (or lack of it). And that’s the emotion we’re talking about…price and the stress of spending a lot of money on a new TV. That said, it’s funny how many 42” TV owners we hear saying “I wish I’d bought the larger one.” A 42” screen, as hard as it may be to believe, can have the visual impact of a postage stamp in many homes. In fact, if you’re watching standard TV, an old 4x3 TV will end up with a bigger image on it than the same picture centered on a new 16x9 wide screen 42” TV!

We can separate video screen viewing into two categories: Home Entertainment (casual viewing), and Home Theater (approximate the large, involving image of a theater). Both uses are completely valid, but will result in very different choices.

Home Entertainment screens can be smaller for a couple of reasons. The most obvious is that with this type of use, the viewer doesn’t care about filling his peripheral vision with image, and may not desire to be that involved with the program content. Having a smaller screen separates the viewer from the content, and isolates him from the “suspension of disbelief” goal of the feature movie maker. Also, smaller screens allow more positional flexibility. You can put smaller screens in a wider variety of places, hang them on smaller mounts, and place them on furniture. So long as we realize the use dictates the size, the Home Entertainment screen can be 42”, even if the viewer sits 12 feet away.

But that’s not going to work for our other category, Home Theater. If you remove the “Home” qualification from the category title, “Theater” is what remains, and that’s what drives the screen size choice in this world. There are many charts, calculators, and graphs available on web sites that help simplify the process (some are linked below). There has been quite a bit of research into this application, and organizations like SMPTE, THX, and others have specifications that are based on viewing angle, with the goal of filling a significant portion of your peripheral vision with picture. We also like to add a resolution parameter to this, since filling your vision with a low resolution image doesn’t do anything to support suspended disbelief.

So the first factor to consider is viewing angle. THX recommends a 36 degree viewing angle, and a minimum of 26 degrees, while SMPTE recommends 30 degrees. For a 50” diagonal screen, that works out to be a viewing distance of 5.6 feet for the recommended THX angle, and 7.9 feet for the THX minimum viewing angle.

Now, let’s look at the same TV from a standpoint of resolution. Assuming a 1080p display, the THX recommended distance will always be slightly closer than the point at which a person with 20/20 vision can no longer see individual pixels. This is because the THX standard is weighted towards favoring peripheral vision rather than resolution, and is actually a pretty smart tradeoff, weighting the entertainment power of a large picture over invisible pixels. (See the post: Can I See 1080p?)

But, as you might begin to see, most TVs purchased are way too small for THX specs. So as you consider screen size and your application, run these numbers to see if you’re in the THX ballpark: Take the screen width (not the diagonal measurement!) and multiply by 1.54. The result is the distance at which you’d need to sit for the optimal THX 36 degree viewing angle. Multiply the screen width by 5, and that’s the maximum recommended viewing distance.
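That 1.54 factor isn't magic; it's just trigonometry. Here's a quick Python sketch you can adapt to any screen:

    # Viewing distance for a given horizontal viewing angle:
    # distance = width / (2 x tan(angle / 2)). At 36 degrees the factor
    # works out to about 1.54 x the screen width.
    import math

    def viewing_distance(screen_width_ft, angle_degrees):
        return screen_width_ft / (2 * math.tan(math.radians(angle_degrees) / 2))

    width = 43.6 / 12                   # 50" diagonal 16x9 screen, in feet
    print(viewing_distance(width, 36))  # ~5.6 ft, THX recommended angle
    print(viewing_distance(width, 26))  # ~7.9 ft, THX minimum angle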

As a closing thought, all of this assumes you are working with a screen aspect ratio of 16x9. These days, that's no longer always true. Technology exists that permits projection of a full 2.35:1 true wide screen image...well beyond this half-baked, slightly-narrower-than-1.85:1, 16x9 stuff. Oh, don't get me started! But we'll write on that subject soon too.

As always, a professional home theater consultant will be able to optimize your home theater floor plan for screen size and seat position, as well as optimize performance of both your picture and sound.

Links to screen size charts and calculators:
A screen size chart based on resolution
A very nice screen size calculator
Carlton Bale addresses screen size and 1080p

Audyssey MultEQ – Fixing room problems other equalizers don’t even know are there

If you’ve been around the Home Theater scene for a while, you’ve no doubt run into the concept of equalization. Simply put, an equalizer attempts to compensate for a frequency response problem by pre-filtering the audio with the inverse characteristic. If you have a 200Hz bass peak, an equalizer will create a complementary dip to compensate for it, with the intended result being smooth, flat response.
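For the curious, here's what that complementary dip looks like in code. To be clear, this is not Audyssey's technology; it's the classic parametric peaking filter from Robert Bristow-Johnson's well-known audio EQ cookbook, sketched in Python:

    # NOT Audyssey's algorithm -- just the classic peaking-EQ biquad from
    # the RBJ audio EQ cookbook, the kind of filter a conventional
    # parametric equalizer uses. A measured +6 dB room peak at 200 Hz
    # gets a complementary -6 dB cut.
    import math

    def peaking_biquad(sample_rate, center_hz, q, gain_db):
        a = 10 ** (gain_db / 40)
        w0 = 2 * math.pi * center_hz / sample_rate
        alpha = math.sin(w0) / (2 * q)
        b = [1 + alpha * a, -2 * math.cos(w0), 1 - alpha * a]
        a_coeffs = [1 + alpha / a, -2 * math.cos(w0), 1 - alpha / a]
        # normalize so the first denominator coefficient is 1
        return ([x / a_coeffs[0] for x in b],
                [x / a_coeffs[0] for x in a_coeffs])

    b, a = peaking_biquad(48000, center_hz=200, q=2.0, gain_db=-6.0)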

Ah, if only we lived in that ideal world. Or an anechoic chamber. But such is not the case. In reality, our home theaters or multi-purpose home entertainment rooms are far from perfect acoustic spaces. There is a different set of room-imposed flaws in each one of them, and within each room, a different set of flaws for each listening position, making even measuring such problems a very imprecise process. Usually, a technician will make a lot of response measurements over a large area and somehow average them together. That’s a great way to avoid “chasing” a nasty problem for one seat while ignoring a different problem in another…you just average them all out and apply whatever equalization more or less works for every seat. Or rather, doesn’t work for any seat.

What if we could custom equalize out response problems for each seat individually? What if we went even farther and worked not only on frequency response issues, but looked at time-domain problems too, like sound reflections from walls, or physical speaker misalignment? What if we did all of this by making a series of test measurements, then let some highly sophisticated math from another galaxy do the work, and create a custom digital filter for each speaker in the room that addresses problems caused by that room for each seat? Too cool, you say?

Yup, that’s for sure. That’s also, in simple terms, what the Audyssey process does. Audyssey MultEQ XT, MultEQ, 2EQ, and EQ all use variations on the process, to differing levels of sophistication. The most elaborate implementation is Audyssey MultEQ XT, in which a custom installer uses a computer and software to make the measurements and calculations needed to cover the most demanding installations. MultEQ XT can also be found in high-end receivers, and is the most powerful version yet released.

Below that, MultEQ, 2EQ and EQ vary in processing and measurement power, with the EQ version being a preset system tuned for HTiB systems and TV sound systems. And yes, the automotive sound industry may soon benefit from Audyssey processing too.

So you ask, “Who are these guys?” And well you should. From the Audyssey web site’s ‘about us’ page: Audyssey Laboratories was conceived at the prestigious Immersive Audio Laboratory at the University of Southern California in Los Angeles, California. Dr. Sunil Bharitkar, Philip Hilmes, Prof. Tomlinson Holman, and Prof. Chris Kyriakakis were all involved in conceiving and creating the technology that was the basis for "spinning out" the company in July, 2002.

In late 1996, after a fierce competition among 117 universities, the National Science Foundation established a unique research center at USC that focused on immersive technologies. A key component of the Integrated Media Systems Center (IMSC) is the Immersive Audio Laboratory that was founded by Chris Kyriakakis and Tomlinson Holman. Over the past 10 years Tom and Chris have conducted research in audio signal processing, acoustics, and psychoacoustics. The results of their interdisciplinary research have been published in more than 100 technical journals and several books. One of the most challenging problems that they addressed was the comprehensive understanding of the negative effects of room acoustics on sound reproduction. It took 5 years of intense research and experimentation and more than $5M in research funds to fully understand and solve this intricate problem. No other facility in the world had the scientific expertise and the resources to fundamentally examine and solve this problem.

Sharp-eyed readers may have picked up the name of Tomlinson Holman. Yes, the “T H” of “THX” fame, inventor of THX theater sound systems and home THX, and the founder of the entire THX program at Lucasfilm. That Tom Holman. For more about Tom, see the “THX” and “TMH” links at www.platinumhometheaters.com

We have been using Audyssey processing for over a year now with great results, and now recommend it to our clients, whether built into receivers by Denon, Marantz, and Onkyo, or in high-end component systems using the Audyssey Sound Equalizer custom calibrated to the space.

For information on how you can get Audyssey MultEQ in your home theater, contact us at Platinum Home Theaters.

Tuesday, 17 July 2007

Visual Acuity…or Can I See 1080p?

Here’s the set up: you fork over a king’s ransom for that new 1080p screen (projector, plasma, LCD, it doesn’t matter for this discussion). You place it in your home theater room, sit back, relax, and enjoy the fruits of your labor. But wait. There’s this nagging voice in the back of your mind that says “How come this doesn’t look better, or different than my old screen?” Let’s extend this question to, “Can I see the 1080p I just paid for?” The answer is a definite “maybe”.

If you are in your 20s, and have 20/20 vision, you can see better than most people in the world, and can see things as small as 1/60th of a degree of arc wide. What that means is, if you place a protractor at your pupil (figuratively!!!) and look at how wide 1 degree is, then divide by 60, that’s how small an object you can see. Anything smaller gets mushed into whatever is next to it and becomes indistinct. In order to see the pixels of a 1080p screen, they have to be at least 1/60 of a degree of arc wide. Simple math, right? It's all based on visual acuity. In fact, we don’t want to see pixels, so we want to sit just far enough away from our screen so as to have one 1080p pixel mush into the next, creating a smooth, pixel-free image. So just how far away is that? Here’s an example.

If you have a 42” 1080p plasma TV hanging on your wall, you’d need to sit at least 5.5 feet from it to blend those 1920 pixels into a smooth image. As you move farther away, some of that 1080p resolution is wasted. Any closer and you’ll see the dots. Again, that's assuming 20/20 vision. How many of you sit 5.5 feet from your TV? I didn’t think so. Most are more like 8 feet or farther. With your eagle eyes at 8 feet, the pixels mush together even on a lowly 720p screen.

Let’s do the same calculations for a 50” screen. For a 1080p screen, plunk your chair at 6.5 feet or more to just blur the pixels. At 720p, try a tad under 10 feet. Yes, 1080p is wasted on you for a 50” set at 10’. You just can’t see that well. And if you don’t have 20/20 vision (yeah, me neither), these distances get shorter real fast.
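Here's the arithmetic behind those distances as a Python sketch: find the point where one pixel shrinks to 1/60 of a degree for a 20/20 eye.

    # Distance at which a single pixel subtends 1/60 of a degree.
    import math

    ONE_ARCMIN = math.radians(1 / 60)

    def blend_distance_ft(diagonal_in, horizontal_pixels, w=16, h=9):
        width_in = diagonal_in * w / math.hypot(w, h)   # screen width
        pixel_in = width_in / horizontal_pixels        # one pixel's width
        return pixel_in / math.tan(ONE_ARCMIN) / 12    # distance in feet

    print(blend_distance_ft(42, 1920))   # ~5.5 ft: 42" 1080p
    print(blend_distance_ft(50, 1920))   # ~6.5 ft: 50" 1080p
    print(blend_distance_ft(50, 1280))   # ~9.8 ft: 50" 720p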

Now, before we start getting all upset about how we think we can spot 720p over 1080i on broadcast TV stations, there’s a lot more at work than just the pixel count. We’re talking about display screens only here. Source material is a whole other discussion that includes the resolution of origination formats, how TV stations process images prior to air, bit rates, and much more. But broadcast HDTV is free to those with antennas, HDTVs are not. As a consumer, you owe it to yourself to know if spending more on your picture for higher than necessary resolution is smart, or just obsessive compulsive behavior. For smart consumers, we offer our consulting services to help you pick the perfect picture. For those with OCD, we recommend buying the highest resolution screen you can at any size and price. It’s probably cheaper than therapy.

Coming soon: How big a screen should you get?

HTIAB…make room for the Ensemble 1080 Home Cinema

HTIAB – that stands for Home Theater In A Box…a concept that usually means “get everything in a box as cheaply as possible”…has a new big brother, the Home Cinema System. The concept comes from Epson and Atlantic Technology and takes the form of the Ensemble 1080, touted as the “first complete home cinema system designed for the consumer market”. It begs the question, “who were you designing for before?”…but let’s move on…

Ensemble 1080 starts with a 1080p 3LCD front projector and adds a special 100” front projection screen with built-in LCR speakers and a special ceiling-mounted surround speaker assembly. It’s all controlled from a console with an up-converting DVD player and a subwoofer with integrated amplifiers. The system has an HDMI port for external Blu-ray disc players. Speaking of HDMI, the console connects to the front speakers via an HDMI cable, making hookup and cable management simpler. Add some remote control finesse (they claim no multiple remotes…hmm…) and you’re into Home Cinema for 7 grand. And just in case that’s a bit steep for you, their 720p version comes in at 5 grand. (See our discussion elsewhere about whether you really need 1080p, or whether 720p will do you just fine.) But don’t run out to Tweeter (even if they are under new ownership) or Best Buy to find it. The system will be sold only via the custom installer channel (that’s us, folks!).

Here are the links, but as of this post, there's nothing on the Epson or Atlantic Tech sites about this yet...
Epson Home Entertainment

Atlantic Technology

Platinum Home Theaters (custom integrators)

Samsung introduces LED based LCD 1080p TVs

It’s a world of contrasts, and Samsung apparently knows it. They recently announced that their new 71 and 81 series TVs not only feature elegant gloss-black slim bezels, 1080p resolution, HDMI 1.3 inputs, and more, but also hit big new numbers in the dynamic contrast zone…are you sitting down for this…100,000:1! Room light will be the issue for sure from now on.

If you like big numbers attached to your 1080p TVs, then here’s another: how about 80,000 hours on the LED-based backlighting? That’s more than 27 years of 8 hour a day TV viewing. Now THAT’s a lot of TV!

The new LED backlight system uses a process Samsung calls “local dimming”, coupled with Samsung’s Auto Motion Plus processing. The LED-backlit sets will be offered from 40” to 58”. If you don’t like HDMI (and who really does?), how about connecting your Blu-ray player without wires? The new Samsung TVs have built-in 802.11n wireless connections that are meant to talk to nearby Blu-ray players, satellite boxes, etc. Oh, and by “nearby” we mean within 200 feet. Sure, it will cost a bit more, but how cool is that?

We’ll be keeping more than one eye on Samsung! The new products should ship by the holidays.

Welcome to Platinum Home Theater Blogs!

We're trying out the Blogger system in hopes of streamlining our Home Theater Blogs. If you wish to read our old blogs from before 7/17/07, go to www.platinumhometheaters.com and find the Archives link.

Thanks for joining us!