[Editor’s note: A good overview of scale (for eyes, and for maps), with remote sensing principles applied to pixel density and the eye’s ability to resolve that resolution. See also the post on smartphone screen sensor accuracy. Thanks, anonymous Twitter user!]
Republished from Discover Magazine.
With much brouhaha, Steve Jobs and Apple revealed the new iPhone 4 yesterday. Among other features, Jobs said it has a higher-resolution display than older models; the pixels are smaller, making the display look smoother. To characterize this, as quoted at Wired.com, he said:
It turns out there’s a magic number right around 300 pixels per inch, that when you hold something around 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels.
In other words, at 12 inches from the eye, Jobs claims, the pixels on the new iPhone are so small that they exceed your eye’s ability to detect them. Pictures at that resolution are smooth and continuous, and not pixellated.
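As a quick sanity check on the arithmetic behind that claim, here is a short sketch that computes the angle a single pixel subtends at 12 inches and compares it to the roughly 1 arcminute resolution limit commonly cited for normal (20/20) vision. The 1-arcminute figure is an assumption for illustration, not a number from Jobs or Soneira:

```python
import math

# Numbers from Jobs's claim
ppi = 300.0          # pixels per inch on the display
distance_in = 12.0   # viewing distance in inches

# Size of one pixel, in inches
pixel_size = 1.0 / ppi

# Small-angle approximation: angle (radians) = size / distance
pixel_angle_rad = pixel_size / distance_in

# Convert radians to arcminutes (1 degree = 60 arcminutes)
pixel_angle_arcmin = math.degrees(pixel_angle_rad) * 60.0

# Assumed resolution limit for normal vision, ~1 arcminute
eye_limit_arcmin = 1.0

print(f"One pixel subtends {pixel_angle_arcmin:.2f} arcmin at 12 inches")
print("Smaller than the assumed eye limit"
      if pixel_angle_arcmin < eye_limit_arcmin
      else "Larger than the assumed eye limit")
```

Under these assumptions a pixel subtends about 0.95 arcminutes, just under the 1-arcminute limit, which is why the claim sounds plausible for typical eyesight but becomes arguable for sharper-than-average vision.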
However, a display expert has disputed this. Raymond Soneira of DisplayMate Industries was quoted both in that Wired article and on PC Mag (and other sites as well) saying that the claim by Jobs is something of an exaggeration: “It is reasonably close to being a perfect display, but Steve pushed it a little too far.”
This prompted the Wired editors to run the headline “iPhone 4’s ‘Retina’ Display Claims Are False Marketing”. As it happens, I know a thing or two about resolution myself, having spent a few years calibrating a camera on board Hubble. Having looked this over, I strongly disagree with the Wired headline, and mildly disagree with Soneira. Here’s why.
First, let’s look at resolution*. I’ll note there is some math here, but it’s all just multiplying and dividing, and I give the answers at the end. So don’t fret, mathophobes! If you just want the answers, skip down to the conclusion at the bottom. I won’t mind. But you’ll miss all the fun math and science.