
Why Hi-Res Isn't Always Better

Spending on extra pixels doesn't always pay off
Image credit: Flickr/thomashawk

Smaller pixels + more pixels = greater resolution. This is the dominant theme in electronics these days. Apple has its Retina displays on phones, tablets and laptops. Samsung, Nokia and others are leapfrogging even the iPhone's resolution. The big buzz in televisions is 4K: screens with four times the resolution of HDTV.

But as I wrote in my Scientific American column this month, there are some substantial footnotes lost in the high-res marketing tsunami. More resolution means bigger, slower downloads. TV shows and software that haven't been upgraded for the higher resolution actually look worse than they did before.

You don't have to take my word for it, though. It turns out you can find the math online that further explodes the marketing departments' claim that smaller pixels + more pixels = better.

Steve Jobs introduced the Retina display like this: "There's a magic number right around 300 pixels per inch, that when you hold something around 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels." In other words, the individual points of light would, theoretically, vanish, creating a seamless image.

But Raymond Soneira, president of DisplayMate Technologies and a frequent critic of screen makers' marketing claims, calls that "marketing puffery." Your eye's resolution, he says, isn't measured in pixels; it's limited by its angular resolution. "The angular resolution of the eye is 0.6 arc minutes per pixel," he wrote in an e-mail to tech publications in 2010. "So, if you hold an iPhone at the typical 12 inches from your eyes, that works out to 477 pixels per inch." The bottom line: "The iPhone has significantly lower resolution than the [eye's] retina. It actually needs a resolution significantly higher than the retina in order to deliver an image that appears perfect to the retina."
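You can check Soneira's arithmetic yourself. Here's a minimal sketch in Python, assuming straight-on viewing and the small-angle geometry he describes; it converts an angular resolution in arc minutes into the pixels per inch needed at a given viewing distance, and lands on roughly 477 PPI for 0.6 arc minutes at 12 inches, versus the iPhone 4's actual 326 PPI.

```python
import math

def required_ppi(arcmin_per_pixel: float, distance_in: float) -> float:
    """Pixels per inch needed so that one pixel subtends the given
    angle at the given viewing distance (straight-on geometry)."""
    # Physical size of one pixel, in inches, at that distance.
    pixel_size = distance_in * math.tan(math.radians(arcmin_per_pixel / 60.0))
    return 1.0 / pixel_size

# Soneira's figures: 0.6 arc minutes per pixel, 12-inch viewing distance.
print(required_ppi(0.6, 12))   # ~477 PPI, matching his claim

# For comparison: the iPhone 4's 326 PPI at 12 inches works out to a
# coarser angle of roughly 0.88 arc minutes per pixel.
pixel_angle = math.degrees(math.atan((1 / 326) / 12)) * 60
print(pixel_angle)             # ~0.88 arc minutes
```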

Now, it's worth noting that his analysis wasn't universally accepted. Phil Plait, who spent years calibrating the Hubble Space Telescope's optics, wrote that Soneira's numbers hold true only for people with perfect vision. If you have average eyesight, Jobs's claims are fine. (He also offers a very clear walk-through of the math.)

The marketing departments, however, have less wiggle room when it comes to the new age of 4K televisions. Most people already can't see the pixels on an HDTV set at normal seating distance. So what, exactly, is added by quadrupling the resolution?

You can either make a much bigger TV or sit much closer.

You can find various mathematical solutions to this problem, but here's one that's typical and also clearly written. Its conclusion: You'd need an 84-inch screen and you'd need to sit 5.5 feet from it to detect any difference in resolution.
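That 5.5-foot figure is easy to reproduce. Here's a sketch in Python, assuming a 16:9 screen, 3,840 horizontal pixels for 4K, and the conventional 20/20 acuity limit of about one arc minute per pixel (the linked analysis may use slightly different assumptions, but the answer comes out the same): roughly 5.5 feet for an 84-inch 4K set.

```python
import math

def max_distance_ft(diagonal_in: float, horizontal_pixels: int,
                    arcmin_per_pixel: float = 1.0) -> float:
    """Farthest viewing distance (in feet) at which individual pixels
    are still resolvable, for a 16:9 screen and a given acuity limit."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)     # screen width
    pixel_pitch = width_in / horizontal_pixels          # one pixel, in inches
    angle_rad = math.radians(arcmin_per_pixel / 60.0)   # acuity limit
    return (pixel_pitch / math.tan(angle_rad)) / 12.0   # inches -> feet

print(max_distance_ft(84, 3840))   # 4K on an 84-inch set: ~5.5 feet
print(max_distance_ft(84, 1920))   # 1080p on the same set: ~10.9 feet
```

Sit any farther back than that, and the extra pixels simply disappear.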

Sitting that close, of course, would be absurd. If you had an 84-inch TV, you'd never, ever sit 5.5 feet from it. You'd miss half the movie! You couldn't possibly take in the entire image at that distance. You'd reflexively move back to a more comfortable viewing distance.

Look, most analyses (and a simple test of your own) show that you can't even see the difference between 720p and 1080p high definition at normal viewing distances. By that standard, 4K is a ludicrous concept: it quadruples a resolution that's already too high for anybody to discern.

In other words, common sense tells you that the resolution wars are indeed marketing puffery—and now you can do the math.
