Ever look close, I mean real close, at the imagery you see in Google Earth and other online map providers? You'll notice most of it, in the United States at least, comes from the USGS or the USDA Farm Service Agency. But have you noticed they sometimes doctor the imagery to remove clouds or other collection artifacts? Well, look at the image above again: here's the Google Maps view of Tybee Island, GA. Thanks Andrew and Geoff!
Posts Tagged ‘imagery’
“If map projections are your problem, Geocart is your solution”
While most GIS and remote sensing map software supports a couple dozen obligatory projections, Geocart supports over 175 general-case projections. Map projections are mathematical formulas for converting the earth's round shape to a flat surface, and their "parameters" can be adjusted to form thousands of specific projections. For comparison, ArcGIS, the popular commercial geographic information system software from ESRI, supports a third as many projections; MaPublisher from Avenza supports half as many as Geocart.
The program’s author, daan Strebe, is a leading authority on this specialized subject, and the new version incorporates corrections to many standard formulas, resulting in near-lossless projections. Unlike other software packages, Geocart can transform any projection to any other projection (full forward and inverse transformation support for all projections); other map applications can damage data when it is transformed. Furthermore, Geocart 3 introduces a new rendering mode using PixSlice technology to create sharper, more detailed raster images (examples after the jump). This works both for resizing images and when transforming from one projection to another (reprojecting).
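A projection in this sense is just a pair of formulas, forward and inverse. As a minimal sketch, here is the spherical sinusoidal pair using textbook equations (an illustration of the general idea, not Geocart's corrected implementations):

```python
import math

R = 6371000.0  # mean Earth radius in meters (spherical model)

def sinusoidal_forward(lon_deg, lat_deg, lon0_deg=0.0):
    """Project geographic coordinates (degrees) to sinusoidal x, y (meters)."""
    lam = math.radians(lon_deg)
    lam0 = math.radians(lon0_deg)
    phi = math.radians(lat_deg)
    x = R * (lam - lam0) * math.cos(phi)
    y = R * phi
    return x, y

def sinusoidal_inverse(x, y, lon0_deg=0.0):
    """Recover geographic coordinates (degrees) from sinusoidal x, y."""
    phi = y / R
    lam = math.radians(lon0_deg) + x / (R * math.cos(phi))
    return math.degrees(lam), math.degrees(phi)
```

The full forward and inverse support Geocart advertises means every one of its 175+ projections has both directions implemented, so any projection can be undone and reapplied without data loss.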
The application manual includes a handy decision tree to assist in choosing a projection based on the map’s topic and geographic coverage. The application also includes innovative advanced tools to visualize the distortion inherent in each projection (sample image).
Pricing: For lapsed users, upgrade pricing is available at $500, with new professional licenses running $860 and discounts for multiple purchases. Steeply discounted non-commercial and student licenses are available. The price includes map databases (36 GB with the pro version!) and, importantly, the new version imports shapefiles, the de facto geodata format.
Full review continued below . . .
I tested Geocart using the free, month-long trial (note the watermarks in the screenshots). Download and installation (once for the application, again for the default databases) went quickly, but you will need an administrator account to accomplish the install. When the package downloads, it is labeled with your operating system type rather than “Geocart”, so in my case I looked for “Mac OS 10.5/10.6” in my downloads.
The app and included databases each weigh in at about 150 MB, for 300 MB of disk space total. Rather than collecting associated database files in the Applications folder (Program Files on Windows), they are installed in Library > Application Support > Mapthematics > Databases. If you want quick “template” access to frequently used data, it should be added in that location. The “add recent databases” command partly makes up for this.
Setting up a map document
To start mapping, go to File > New. Then go to Map > New. Multiple maps can be stored in a single Geocart document, each having their own projection parameters and database content.
When making a map, the first step is to determine the map’s dimensions and how much geography it will show. The relationship between the two is called map scale. Some databases, like Natural Earth, are set up based on map scales. Using the right database will result in prettier maps that are generalized appropriately (the linework doesn’t look too detailed or too coarse) and smaller files that are easier to work with.
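Map scale is just the representative fraction: the ratio of a distance on the ground to the same distance on the map. A trivial helper (my own arithmetic, not a Geocart feature) makes the relationship concrete:

```python
def map_scale_denominator(ground_width_m, map_width_cm):
    """Representative-fraction denominator: ground distance over map
    distance, in the same units (meters converted to centimeters here)."""
    return ground_width_m * 100.0 / map_width_cm

# A region 1,000 km across drawn 50 cm wide works out to 1:2,000,000.
denom = map_scale_denominator(1_000_000, 50)
```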
Geocart also includes a useful linework simplification routine for when your data is more complex than it needs to be. This toggle is on by default and is accessed under Map > Generalize vectors. Toggle it on and off to compare the resulting lines; your mileage will vary by map scale, even with the same source database.
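As a sketch of the kind of routine at work here, this is the classic Douglas-Peucker algorithm written from the textbook description (a generic illustration, not Geocart's code):

```python
import math

def douglas_peucker(points, tolerance):
    """Simplify a polyline: drop points within `tolerance` of the trend line."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    seg_len = math.hypot(dx, dy) or 1e-12
    # Find the interior point farthest from the segment joining the endpoints.
    max_d, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        d = abs(dy * (px - x1) - dx * (py - y1)) / seg_len
        if d > max_d:
            max_d, idx = d, i
    if max_d <= tolerance:
        return [points[0], points[-1]]
    # Keep the farthest point and recurse on both halves.
    left = douglas_peucker(points[:idx + 1], tolerance)
    right = douglas_peucker(points[idx:], tolerance)
    return left[:-1] + right
```

The tolerance is what ties simplification to map scale: a coarser target scale tolerates a larger deviation, so more vertices drop out.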
Tip: The application takes map scale seriously and includes a tool to calibrate your system under Preferences > Display. This calibration functionality is absent from most other mapping packages.
Adding data to the map
Each new map starts with a “Stylized World Topo 5400×2700” raster image layered with a vector grid (Map > Graticule) in the sinusoidal projection. With the map selected, go to Map > Databases. I was able to easily add in shapefiles from Natural Earth, some of which are included in the default databases. If you have existing Geocart 2 format databases, those will import directly, including typesetting databases.
Tip: To modify which databases load for each new map, go to Preferences > New Map Databases. I set mine to use Natural Earth country boundaries but removed the default image database.
Have a scanned map without a projection?
Geocart will help you figure it out. Add the map with File > Place image (vectors are not supported at this time). Then align it with a vector map database, adjusting the settings of the map until it matches. Then choose File > Export Database. Load the database back into a Geocart map and start projecting.
I was also able to add several map images, quickly georeference them, and then deproject to geographic (plate carrée) or into another projection. One was a simple map of the ash plume in Europe in Mercator. The other was a complicated world wall map from National Geographic in Winkel tripel (examples below).
Tip: When georeferencing an image, maximize both the map and the placed image to fit the window (Map > Scale to Window). Then adjust your Geocart map to use the same boundaries as the placed map image (make an educated guess). Then cycle through the projections until the vector lines (graticule, country boundaries, etc.) begin to match. Mercator and Robinson are common for world maps; a conic like Albers or Lambert is common for country and state maps. Then adjust the projection parameters and fine-tune the boundaries, nominal scale, and map resolution until everything fits exactly. Finally, export the placed image to database format.
Note: For raster maps that are georeferenced, the exported database file remains in the native projection of the image (it is not transformed to geographic). This does not affect your ability to reproject the image, however.
Choose a projection
The familiar icons by projection class are still found in the main menu bar (see screenshot above). With a map selected on the document, choose a different projection (some are even listed in Cyrillic and Arabic!) and watch the map update in real time.
If you want assistance in choosing a projection (who can remember all their quirks!?), check out Projection > Change Projection. A dialog with the same listing comes up, but with descriptions, history, preview maps, and distortion information. The programmer’s unique and comprehensive expert knowledge will help guide your projection choice. While the map is projecting, a progress wheel with a rough time remaining shows in the upper left corner. Advanced datum support and transformations are provided.
Tip: The manual includes a full decision tree for choosing a projection. This is one of the best features of Geocart.
I love interrupted projections like the Goode homolosine, and making one in Geocart is a cinch. Simply choose the Goode from the Pseudocylindric menu (oval icon on left) and then choose Projection > Interruptions > Goode Continental. While you’re getting the projection parameters, map size, and resolution right, keep the rendering quality at draft (Map > Draft). When the settings are right, change that to Map > Final Quality for more precise results.
All databases in Geocart are geographic, with live, on-the-fly transformation into your map’s specified projection (see the exception above for georeferenced images). I added coastlines, rivers, lakes, country boundaries, and US state boundaries to my test vector map. Even on my slowest, oldest laptop, rendering was responsive for basic usage creating vector world, regional, and country maps.
Tip: If you somehow end up with a strange looking map (off center, etc.), choose Projection > Reset Projection and the current projection parameters will revert to defaults.
Tip: When using a conic projection like Albers or Lambert, make sure the Projection > Projection Center is set to Latitudinal 0°N.
Geocart 3.0 is a world unto itself, however. While it does import raw data in shapefile format (YES!), it does not currently import or export PRJ files, part of the SHP file specification, the de facto geodata storage and exchange format. Imported SHP files must be in geographic projection. This makes sense in part because Geocart supports many more projections and parameters than most other mapping software packages (3 times as many as ArcMap, 6 times as many as Natural Scene Designer, 2 times as many as MaPublisher and Geographic Imager). Geocart also sometimes uses slightly different formulas for the same projections as the other applications (the author claims Geocart’s implementations fix errors in common formulas, which is probably the case based on my experience with the literature and web source code snippets).
But for the projections that are shared in common, it would be useful to offer PRJ support (including transformations out of the error-prone versions) and shapefile export of databases after their coordinates have been transformed (and GeoTIFF for raster).
More importantly, PRJ files offer a quick way to load common projection parameters. If I’m in California, I can load the Albers with the standardized parameters so my data will interoperate with that of other cartographers working in the area, and those presets take some of the guesswork out of choosing a map projection. Both ArcMap and MaPublisher are better than Geocart in this regard. MapTiler through Proj4 is the worst. Azimuth (r.i.p.) was the best at setting an appropriate projection and parameters for the visible, selected geography.
Tip: If you do have a PRJ file, open it in a text editor and manually copy the parameters over to Geocart. PRJ files use a “well known text” structure that is human readable.
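To illustrate what that hand copying looks like, here is a hypothetical California Albers PRJ and a quick Python scrape of its parameter list (the WKT below is my own example string, not from any shipped file):

```python
import re

# A hypothetical PRJ file's "well known text" for a California Albers setup.
wkt = (
    'PROJCS["California_Albers",'
    'GEOGCS["GCS_North_American_1983",DATUM["D_North_American_1983",'
    'SPHEROID["GRS_1980",6378137.0,298.257222101]]],'
    'PROJECTION["Albers"],'
    'PARAMETER["Central_Meridian",-120.0],'
    'PARAMETER["Standard_Parallel_1",34.0],'
    'PARAMETER["Standard_Parallel_2",40.5],'
    'PARAMETER["Latitude_Of_Origin",0.0],'
    'UNIT["Meter",1.0]]'
)

# Pull out the name/value pairs you would retype into Geocart's parameter dialog.
params = dict(re.findall(r'PARAMETER\["([^"]+)",([-\d.]+)\]', wkt))
```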
Legend editor (stylizing your map)
Geocart includes a basic legend editor for setting line and fill styles, appropriate for general reference mapping; Geocart is a general projection tool, not a thematic mapping package. The layer sorting of individual databases is adjustable in the Map > Databases dialog.
Tip: Consistent styles can be shared between map projects by going to Preferences > New Map Line Styles.
Testing the limits
Don’t want to plot the entire world? Use Map > Boundaries to set a crop (and speed up map rendering). This window is quite amazing and has both 2D and 3D views with actual spherical trapezoids! Boundaries can be set relative to the projection center and can be a circular diameter, spherical trapezoid, or irregularly shaped “custom” boundary. To remove the boundaries, change the setting back to “Unconstrained”.
Quibble: When adjusting boundaries in most conic projections, your standard parallels should also change. A prompt should be provided in this use case to automatically adjust them to your new view. In the special case of setting standard parallels under Projection > Parameters, it would be helpful if Geocart showed them on a map as in the Projection Center dialog.
Quibble: The draw-on-map interface in Boundaries needs a little more work for modifying the existing settings. Other apps, like Geographic Imager, allow me to drag the edges of a drawn boundary, while in Geocart I have to start over (or use the number fields). It’s also a little wonky when dragging exactly horizontal or vertical (a full latitude or longitude strip). There are no ticker buttons to increment the parameter values, either. Once you have this set, though, you’re golden, so it’s a minor inconvenience.
Next: Rendering quality and speed . . .
Above: Brand X on the left, Geocart at right. Examine the letter forms (the U in United Kingdom, the N in London, all of Paris, the Ca in Cardiff). The Geocart render results in sharper, crisper letter forms with fewer “pixel burrs”, the demo watermark notwithstanding.
The key concept is that Geocart creates an optimized map on each render. The original data resolution is stored in the document, but what draws on the screen is determined by the map size and resolution. Set that in Map > Set size and resolution. Once adjusted, the map will fill that space in the window. You can zoom in and out with the normal Cmd-+ and – keyboard shortcuts, and the zoom will update in the window title.
When Geocart is set to render in Final mode, it produces better output than applications that use only nearest-neighbor or bicubic interpolation. In the example above, looking at the letter edges of London, the Geocart version is crisper and smoother. This also comes into play at the edges of a world map, where the projection distortion is more extreme, and is especially important when projecting raster data.
For my heavy-use scenario, I put Geocart up against the latest National Geographic world map.
The map is in the Winkel tripel projection. I rasterized the PDF (it took about 1 hour with Photoshop on my old laptop), then loaded the image into Geocart, georeferenced it, and saved it out as a database (78 MB, which seems small); see the section on adding map data above. I then reprojected it to Goode homolosine in Geocart. I also ripped a plate carrée out of Geocart and projected that into Goode in Geographic Imager, Natural Scene Designer, MapTiler (Proj4), and ArcMap.
The final projected Goode image dimensions were 22,700 by 9,910 pixels at 675 MB in TIFF format: enough detail to print back out as a wall map or tile for a web map service.
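That file size is about what you'd expect for uncompressed 24-bit RGB; a quick back-of-envelope check (my own arithmetic, assuming 3 bytes per pixel and no compression):

```python
def uncompressed_tiff_mb(width_px, height_px, bytes_per_pixel=3):
    """Rough size of an uncompressed 24-bit RGB raster in decimal megabytes."""
    return width_px * height_px * bytes_per_pixel / 1_000_000

size_mb = uncompressed_tiff_mb(22_700, 9_910)  # roughly 675 MB
```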
Geocart is built for speed and will utilize all processors, including multicore
Paul Messmer’s under-the-hood improvements allow the application to make 100% use of all processor cores. I was still able to use other applications while Geocart processed data, however. One side effect of supporting multiple cores is that rendering occurs per core in real time; see the screenshot below. Geocart also plays nicely when idle.
I tested Geocart on 3 different machines, all Intel Macs running 10.5 or 10.6, from an older laptop to new desktop towers. Application task completion speed increased in direct proportion to the number of cores available.
Fun fact: Geocart uses a Hilbert curve to render the map when utilizing multiple cores to keep memory accesses as local as possible in order to make the best use of the processor caches. This results in separate render traces on the screen, see image below.
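The standard distance-to-coordinate conversion for a Hilbert curve is compact; here is a Python sketch of the textbook algorithm (not Geocart's actual implementation):

```python
def _rot(n, x, y, rx, ry):
    """Rotate/flip a quadrant so that sub-curves join end to end."""
    if ry == 0:
        if rx == 1:
            x, y = n - 1 - x, n - 1 - y
        x, y = y, x
    return x, y

def d2xy(n, d):
    """Map distance d along the Hilbert curve to (x, y) in an n-by-n grid
    (n must be a power of two)."""
    x = y = 0
    s, t = 1, d
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        x, y = _rot(s, x, y, rx, ry)
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y
```

Walking d from 0 to n²−1 visits every cell exactly once, and consecutive cells are always adjacent, which is what keeps each core's memory accesses local.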
At best “final” settings, the huge map in the Goode homolosine projection described above took 20 minutes on the 8-core Mac Pro (two 2.93 GHz quad-core Intel Xeons with 8 GB of RAM) but 1 hour 20 minutes on an older 4-core Mac Pro with the same RAM configuration. The draft render took significantly less time and was comparable in time and quality to Natural Scene Designer, Geographic Imager, ArcMap, and MapTiler (Proj4).
Because Geocart always plans for the most general case with the most advanced options, its rendering can be slower than other applications’ (most noticeably in Final rendering mode). Future versions might speed up if special functions were added for the standard parameter cases. But by the time the programmer did that, the speed difference might be matched by increases in hardware speed and core counts, so this doesn’t worry me much.
Comparing the competition
Geographic Imager ($699 for the Adobe Photoshop plugin; add $699 if you don’t already own Photoshop) did not support the interrupted form of the projection and produced confetti until I tweaked the settings. To project vectors, you’d need MaPublisher, a vector plugin from Avenza for Adobe Illustrator, which will set you back $1,399 plus the cost of Illustrator. ArcMap (thousands of dollars) required an RGB (not indexed) version of the geographic TIFF but insisted on reprojecting into grayscale. Natural Scene Designer ($160) produced the most comparable raster results and ease of use, but at lower quality (though faster). It should be noted that the Pro version of Natural Scene Designer 5 also supports multiple cores and limited vector shapefile support (raster rendering only), plus better handling of GeoTIFF with TFW export. MapTiler, Mapnik, and other open source GIS options are free, but you’ll spend time setting them up and learning their made-by-and-for-programmers quirks.
Geocart is a good teaching tool as well when using the distortion visualizations and mouse-over readouts (available under Window > Information). The pertinent readouts are Angular deformation, Areal inflation, Scalar distortion, and Scale factor range.
Note: Geocart quit on me once when I tried to use Map > Copy Attributes while visualizing distortion with a very large selected map, but I was not able to replicate the error or any crash in subsequent testing sessions. In general I’ve found the program to be very responsive and to not hang up, even when rendering extremely large maps with multiple databases.
Quibble: The Information panel should display how long it took to render the selected map.
For exporting your final map, vector (PDF) and raster (TIFF, PSB “Photoshop”, and JPG) formats are available. On opening the map in Illustrator, each database layer is conveniently grouped, with clipped content. Geocart could take a page out of IndieMapper’s layered SVG approach, where the file format would still be PDF but the groups would be named, or, better yet, be actual PDF layers.
Quibbles: Geocart suffers from the same zealous masking and embedding as other apps. If no boundaries have been defined in Geocart, the clipping masks should not be included. Saving out as PDF embeds the raster databases in the file, as all other programs do. On export of the raster formats, an option should be provided to NOT export the vector database layers. Another option should be provided to export each raster database layer to a separate file (or a layered TIFF/PSB). Geocart also needs to export a PRJ file for rasters, and GeoTIFF with embedded registration, pixel size, and projection tags.
Note: If you’re looking for SHP export, you’ll be disappointed. Though that’s kind of missing the point of Geocart. See “Choose a projection” section above.
Geocart 3 is a solid release that will satisfy most of your reference mapping needs, especially if projection matters to you. If you liked Geocart 2, you’ll definitely enjoy working with version 3, and on the latest computer hardware it simply screams. The addition of direct shapefile import removes a barrier to geodata access, though more could be made of the PRJ files and DBF attributes. There are still some missing features when compared to version 2, and daan (the programmer) is interested in hearing from the cartography community which should be added back. They also seem responsive to fixing some of the usability issues I’ve noted above.
But where are those Kelso Corners, I ask? Besides being a personal soapbox, my blog is named for the “corners” that form when a pseudocylindric or lenticular projection is extended to fill out its rectangular bounding box by repeating content that would otherwise only be found on the opposite edge of the map. They are righteously awesome, plus they satisfy non-carto designers’ proclivity to design to a boxy grid. However, you can only find these “corners” on a few old print maps; I don’t know of a single digital app that creates them. I’ve staked my naming rights.
Pros: Over 175 projections (best in industry), support for advanced projection parameters, loss-less reprojection, PixSlice technology for sharper, more detailed raster images. Runs on both Windows and Mac, with support for multiple core processors. Now imports shapefile vector map data. Large document support. Easy to use. Software programmer responsive to emails and forum posts.
Cons: No PRJ support. Does not export GeoTIFF, or world file created after georeferencing images. Does not include a SHP filter in file dialogs, and file dialogs do not remember last browsed directory. Should start with blank new document on launch. Linework generalization engine filters just by Douglas-Peucker in this version, not the smooth bezier curves found in Geocart 2 or the amazing generalization found at MapShaper.org. Rendering in PixSlice can significantly increase render times. No support for scripting/automation. No export back to SHP format (especially with DBF attributes), useful for thematic mapping in a secondary GIS application.
[Editor’s note: “As we try to integrate highly resolved data into existing GIS, the errors in legacy data will become more apparent.” Jeff outlines the problem through his experience at the BLM in Oregon. Jeff is also responsible for early “bump mapping” of digital terrain models (DEMs).]
Republished from the ESRI ArcUser Winter 2010.
By Jeffery S. Nighbert, U.S. Bureau of Land Management
The ability to obtain precise information is nothing new. With great patience and skill, mapmakers and land surveyors have long been able to create information with an impressive level of accuracy. However, the ability to determine and view locations with submeter accuracy is now in the hands of millions of people. Commonly available high-resolution digital terrain and aerial imagery, coupled with GPS-enabled handheld devices, powerful computers, and Web technology, is changing the quality, utility, and expectations of GIS to serve society on a grand scale. This accuracy and precision revolution has raised the bar for GIS quite high. This pervasive capability will be the driver for the next iteration of GIS and the professionals who operate them.
When I say there is a “revolution” going on in GIS, I am referring to the change in the fundamental accuracy and precision kernel of commonly used geographic data brought about by the new technologies previously mentioned. For many ArcGIS users, this kernel used to be about 10 meters or 40 feet at a scale of 1:24,000. With today’s technologies (and those in the future), GIS will be using data with 1-meter and submeter accuracy and precision. There are probably GIS departments—in a large city or metro area—where this standard is already in place. However, this level of detail is far from the case in natural resource management agencies such as the Bureau of Land Management (BLM) or the United States Forest Service. But as lidar, GPS, and high-resolution imagery proliferate as standard sources for “ground” locations, GIS professionals will begin to feel the consequences in three areas: data quality, analytic methods, and hardware and software.
Republished from The Washington Post.
In Idaho, scientists are using remote imaging to study evapotranspiration, the loss of water to the atmosphere by evaporation from soil and water, and by transpiration from plants.
Water management is serious business in the American West, where precipitation is scarce, irrigated agriculture is a major industry, new housing subdivisions spread across arid landscapes and water rights are allocated in a complicated seniority system.
[Editor’s note: I was able to attend the reception for GeoEye-1 at the fabulous Newseum last week in Washington, DC. The imagery from this new satellite is truly awesome. Look for it soon in The Washington Post, and in Google Earth.]
The NGA analysts aren’t tapping the government’s huge network of highly classified spy satellites; they’re getting the pictures from commercial vendors. That’s the same stuff pretty much anyone can get, either through free, online programs, such as Google Earth, or by buying it from the same companies supplying Uncle Sam.
It’s a remarkable turn, given the warnings that security experts in the USA and worldwide raised a few years ago about giving the entire planet — terrorists and rogue states included — access to high-resolution satellite photos once available only to superpowers.
Last month, the most powerful commercial satellite in history sent its first pictures back to Earth, and another with similar capabilities is set for launch in mid-2009. The imagery provided by those and other commercial satellites has transformed global security in fundamental ways, forcing even the most powerful nations to hide facilities and activities that are visible not only to rival nations, but even to their own citizens.
Although no one disputes that commercial imagery poses threats, it has been embraced in ways few predicted.
“It’s created a lot of opportunities to do things we couldn’t do with (classified) imagery,” says Jack Hild, a deputy director at NGA, which provides imagery and mapping for defense and homeland security operations.
Pictures from government satellites are better than commercial photos, but how much better is a secret; generally, only people with security clearances are allowed to see them. Using commercial products, intelligence agencies can provide imagery to combat troops and even international coalition partners, which wasn’t possible before because of the risk of it reaching enemy hands.
Federal agencies use commercial imagery to guide emergency response and inform the public during natural disasters, such as this year’s Hurricane Ike. It’s also used by government scientists to monitor glacial melting and drought effects in the Farm Belt.
When commercial satellite photos first hit the market, “the gut reaction was, ‘We can’t allow this imagery to be out there because someone might do us harm with it,’ ” Hild says. “Are there still bad things that people can do with commercial imagery? Absolutely … but we think the benefits far outweigh the risks.”
Other nations share the sentiment. U.S. and foreign government contracts provide critical income for commercial imagery companies, such as Digital Globe and GeoEye — both of which supply photos for Google Earth.
“Most of our revenue (is) from governments,” says Mark Brender, vice president of GeoEye, which got half its 2007 revenue from the U.S. government and 35% from foreign governments. “They have a core competency in understanding how to use this technology — and a national security imperative to do so.”
In August 2006, the Islamic Army in Iraq circulated an instructional video on how to aim rockets at U.S. military sites using Google Earth.
Posted on a jihadist website, the video showed a computer using the program to zoom in for close-up views of buildings at Iraq’s Rasheed Airport, according to an unclassified U.S. intelligence report obtained by USA TODAY. The segment ended with the caption, “Islamic Army in Iraq/The Military Engineering Unit — Preparations for Rocket Attack.”
The video appeared to fulfill the dire predictions raised by security experts in the USA and across the globe when Google began offering free Internet access to worldwide satellite imagery in 2005. Officials in countries as diverse as Australia, India, Israel and the Netherlands complained publicly that it would be a boon to terrorists and hostile states, especially since the pictures often provide a site’s map coordinates.
Indeed, some terrorist attacks have been planned with the help of Google Earth, including an event in 2006 in which terrorists used car bombs in an unsuccessful effort to destroy oil facilities in Yemen, according to Yemeni press reports. Images from Google Earth and other commercial sources have been found in safe houses used by al-Qaeda and other terror groups, according to the Pentagon.
Many security experts say commercial imagery does little to enhance the capabilities of such organizations.
“You can get the same (scouting) information just by walking around” with a map and a GPS device, says John Pike, director of GlobalSecurity.org, a research organization specializing in defense and intelligence policy. The imagery “may give someone precise coordinates (for a target), but they need precise weapons … and their ability to target discrete parts of a particular site is pretty limited. People who think this gives you magical powers watch too many Tom Clancy movies.”
Republished from the Wall Street Journal, Oct. 1st, 2008. By SIOBHAN GORMAN. Thanks Laris!
WASHINGTON — The Department of Homeland Security will proceed with the first phase of a controversial satellite-surveillance program, even though an independent review found the department hasn’t yet ensured the program will comply with privacy laws.
Congress provided partial funding for the program in a little-debated $634 billion spending measure that will fund the government until early March. For the past year, the Bush administration had been fighting Democratic lawmakers over the spy program, known as the National Applications Office.
The program is designed to provide federal, state and local officials with extensive access to spy-satellite imagery — but no eavesdropping — to assist with emergency response and other domestic-security needs, such as identifying where ports or border areas are vulnerable to terrorism.
Since the department proposed the program a year ago, several Democratic lawmakers have said that turning the spy lens on America could violate Americans’ privacy and civil liberties unless adequate safeguards were required.
A new 60-page Government Accountability Office report said the department “lacks assurance that NAO operations will comply with applicable laws and privacy and civil liberties standards,” according to a person familiar with the document. The report, which is unclassified but considered sensitive, hasn’t been publicly released, but was described and quoted by several people who have read it.
The report cites gaps in privacy safeguards. The department, it found, lacks controls to prevent improper use of domestic-intelligence data by other agencies and provided insufficient assurance that requests for classified information will be fully reviewed to ensure it can be legally provided.
A senior homeland-security official took issue with the GAO’s broad conclusion, saying the department has worked hard to include many layers of privacy protection. Program activities have “an unprecedented amount of legal review,” he said, adding that the GAO is seeking a level of proof that can’t be demonstrated until the program is launched.
Homeland Security spokeswoman Laura Keehner said department officials concluded that the program “complies with all existing laws” because the GAO report didn’t say the program doesn’t.
Addressing the gaps the agency cited, Ms. Keehner said current laws already govern the use of intelligence data and the department has an additional procedure to monitor its use. The department will also work with other intelligence agencies to “ensure that legal reviews and protection of classified information will be effective,” she said.
In response to the GAO report, House Homeland Security Committee Chairman Bennie G. Thompson of Mississippi and other Democrats asked Congress to freeze the money for the program until after the November election so the next administration could examine it.
But the bill Congress approved, which President George W. Bush signed into law Tuesday, allows the department to launch a limited version, focused only on emergency response and scientific needs. The department must meet additional requirements before it can expand operations to include homeland-security and law-enforcement surveillance.
The restrictions were “the most we could have required without a complete prohibition,” said Darek Newby, an aide to Democratic Rep. David Price of North Carolina, who heads the House homeland-security spending panel.
But California Rep. Jane Harman, who heads a homeland-security subcommittee on intelligence, said that even limited funding allows the department to launch the program, providing a platform to expand its surveillance whether or not privacy requirements are met.
“Having learned my lesson” with the National Security Agency’s warrantless-surveillance program, she said, “I don’t want to go there again unless and until the legal framework for the entire program is entirely spelled out.”
Rep. Thompson vowed to fight expansion of the program until privacy issues are further addressed.
Write to Siobhan Gorman at email@example.com