Ever look close, I mean real close, at the imagery you’ve seen in Google Earth and other online map providers? You’ll notice most of it, in the United States at least, comes from the USGS or the USDA Farm Service Agency. But have you noticed they sometimes doctor the imagery to remove clouds or other collection artifacts? Well, look at the above image again. Here’s the Gmaps view of Tybee Island, GA. Thanks Andrew and Geoff!
Posts Tagged ‘remote sensing’
[Editor’s note: Good overview of scale (for eyes, and for maps), applying remote sensing principles to pixel density and the eye’s ability to resolve it. See also the post on smart phone screen sensor accuracy. Thanks, anonymous Twitter user!]
Republished from Discover Magazine.
With much brouhaha, Steve Jobs and Apple revealed the new iPhone 4 yesterday. Among other features, Jobs said it has a higher-resolution display than older models; the pixels are smaller, making the display look smoother. To characterize this, as quoted at Wired.com, he said:
It turns out there’s a magic number right around 300 pixels per inch, that when you hold something around to 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels.
In other words, at 12 inches from the eye, Jobs claims, the pixels on the new iPhone are so small that they exceed your eye’s ability to detect them. Pictures at that resolution are smooth and continuous, and not pixellated.
However, a display expert has disputed this. Raymond Soneira of DisplayMate Technologies was quoted both in that Wired article and on PC Mag (and other sites as well) saying that the claims by Jobs are something of an exaggeration: “It is reasonably close to being a perfect display, but Steve pushed it a little too far.”
This prompted the Wired article editors to give it the headline “iPhone 4’s ‘Retina’ Display Claims Are False Marketing”. As it happens, I know a thing or two about resolution myself, having spent a few years calibrating a camera on board Hubble. Having looked this over, I strongly disagree with the Wired headline, and mildly disagree with Soneira. Here’s why.
First, let’s look at resolution*. I’ll note there is some math here, but it’s all just multiplying and dividing, and I give the answers at the end. So don’t fret, mathophobes! If you want the answers, just skip down to the conclusion at the bottom. I won’t mind. But you’ll miss all the fun math and science.
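To make Jobs’s 300 ppi figure concrete, here is a back-of-the-envelope check. This is a sketch assuming the common rule of thumb that the eye resolves features about one arcminute apart; the function name and defaults are mine, not from the post:

```python
import math

def max_resolvable_ppi(distance_inches, acuity_arcmin=1.0):
    """Pixel density beyond which a pixel at the given viewing
    distance subtends less than the eye's resolving angle."""
    angle_rad = math.radians(acuity_arcmin / 60.0)
    smallest_pixel = distance_inches * math.tan(angle_rad)  # inches
    return 1.0 / smallest_pixel

# At Jobs's quoted 12-inch viewing distance:
print(round(max_resolvable_ppi(12)))  # roughly 286 ppi
```

At 12 inches this lands just under 300 ppi, which is why the “Retina” claim is close but arguable; hold the phone a couple of inches farther away and the threshold drops, so the claim holds easily.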
[Editor's note: This graphic reminds me of the lens distortion matrix analysis in my remote sensing class back in university. Not all sensors are made equal, and pictures and sensors capture only a representation of reality, not the real thing.]
Republished from AppleInsider. January 11, 2010.
Touchscreen analysis shows iPhone accuracy lead over Droid
A test comparing the accuracy and sensitivity of smartphone touchscreens across various makers gave the iPhone top marks ahead of HTC’s Droid Eris and the Google-branded Nexus One, and much better results than the Motorola Droid.
The results, published by MOTO Labs, noted that the company (which has no relation to Motorola) “has years of experience developing products that use capacitive touch, and we’ve had the opportunity to test many of the latest devices. Our conclusion: All touchscreens are not created equal.”
[Editor's note: Perhaps Wired magazine's Google evil-meter just tipped a bit less negative? In all seriousness, this sounds like a great project!]
Republished from HughStimson.org. Dec. 11, 2009.
Land cover change analysis has been an active area of research in the remote sensing community for many years. The idea is to build computational protocols and algorithms that take a couple of digital images collected by satellites or airplanes, turn them into land cover maps, layer them on top of each other, and pick out the places where the land cover type has changed. The best protocols are the most precise, the fastest, and the most tolerant of images recorded under different conditions. One of the favorite applications of land cover change analysis has been deforestation detection. A particularly popular target for deforestation analysis is the tropical rain forests, which are being chainsawed down at rates that are almost as difficult to comprehend as it is to judge exactly how bad the effects of their removal will be on biological diversity, planetary ecosystem functioning, and climate stability.
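The layer-and-compare step can be sketched as post-classification comparison: classify each date’s image, then difference the class maps cell by cell. A minimal toy version, with made-up class codes and arrays (not any real protocol’s implementation):

```python
import numpy as np

# Two classified rasters on the same grid, from different dates.
# Hypothetical class codes: 1 = forest, 2 = cleared, 3 = water.
before = np.array([[1, 1, 3],
                   [1, 1, 3],
                   [1, 2, 3]])
after  = np.array([[1, 2, 3],
                   [1, 2, 3],
                   [2, 2, 3]])

change_mask = before != after              # any class change
deforested = (before == 1) & (after == 2)  # forest -> cleared

print(change_mask.sum(), deforested.sum())  # prints: 3 3
```

Real protocols add the hard parts this sketch skips: co-registering the two images, classifying them consistently under different atmospheric and illumination conditions, and filtering out spurious single-pixel changes.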
Google has now gotten itself into the environmental remote sensing game, but in a Google-esque way: massively, ubiquitously, computationally intensively, plausibly benignly, and with probable long-term financial benefits. They are now running a program to vacuum up satellite imagery and apply land cover change detection optimized for spotting deforestation, for the time being targeted at the Amazon basin. The public doesn’t currently get access to the results, but presumably that access will be rolled out once Google et al. are confident in the system. I have to hand it to Google: they are technically careful, but politically aggressive. Amazon deforestation is (or should still be) a very political topic.
Republished from The Washington Post.
In Idaho, scientists are using remote imaging to study evapotranspiration, the loss of water to the atmosphere by evaporation from soil and water, and by transpiration from plants.
Water management is serious business in the American West, where precipitation is scarce, irrigated agriculture is a major industry, new housing subdivisions spread across arid landscapes and water rights are allocated in a complicated seniority system.
I’m in search of a super generalized but comprehensive global coverage dataset (or datasets) showing major highways and rail lines, even sea lanes. You can see an example of this on Plate 21 of the National Geographic 8th Edition Atlas of the World. Do you know of one? Please shoot me a note at email@example.com or comment here if you have a tip.
Why do I want this? I am working with Tom Patterson (of Natural Earth fame) and Dick Furno (retired from The Washington Post) to release a comprehensive, attributed GIS base map dataset derived in part from the Natural Earth physical wall map at around 1:15,000,000 scale, plus two other consistent and self-referential datasets at approximate scales of 1:50m and 1:110m. These datasets will provide coverage that registers perfectly with modern satellite remote sensing imagery and SRTM-derived topography. Yes, there is 1:1m coverage around the world, but it is often out of date and too detailed for global, continental, and regional mapping.
We hope these open source datasets will let everyone in the cartographic community focus on telling the best “why” and “how” visual story about their thematic data, instead of spending 50 to 70% of project time looking for or creating the vector geometry that captures the basic “where”.
Release is expected Fall 2009 at the NACIS map conference in Sacramento. Please check back in this space for more details as they develop.