Archive for the ‘Google Earth’ Category

IndieProjector for KML and Shapefiles (IndieMapper)

Monday, June 1st, 2009




[Editor's note: The brilliant folks at AxisMaps have done it again with this free online tool for reprojecting KML and shapefiles.]

Republished from Axis Maps / IndieMapper.

Indieprojector is a free web service that re-projects digital map files and converts them to SVG for use in vector graphics editing software. Map projections are an essential part of map making but we found the existing tools to be too expensive, inflexible or complicated. Indieprojector is the smarter, easier, more elegant way to reproject and convert geographic data. It’s a preview of our indiemapper technology that will bring map-making into the 21st century using web-services and a realtime visual approach to cartographic design.
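Indieprojector's internals aren't published here, but the core of any reprojection step is a coordinate transform. As a rough illustration of what such a step does (not Indieprojector's actual code), here is the forward spherical Web Mercator projection in Python; the function name and the use of the spherical radius are assumptions for this sketch:

```python
import math

R = 6378137.0  # spherical Web Mercator radius, in meters

def to_web_mercator(lon_deg, lat_deg):
    """Project WGS84 lon/lat (degrees) to spherical Web Mercator x/y (meters)."""
    x = math.radians(lon_deg) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2)) * R
    return x, y

# A point on the equator at the prime meridian maps (within floating-point
# error) to the origin.
print(to_web_mercator(0.0, 0.0))
```

Real tools handle datum shifts and dozens of projections; in practice libraries such as PROJ do this work.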

Read more and watch demo screencast . . .

Go directly to IndieProjector to reproject your KML and shapefiles . . .

Hardest National Geographic Bee Yet Goes to 13-Year-Old (NG)

Thursday, May 21st, 2009


[Editor's note: The annual geo bee's U.S. round wrapped up yesterday. The winner, Eric Yang, will compete in the world championship in Mexico City. The contest was sponsored by Google this year and is hosted by Alex Trebek. Video of the winning question. Thanks Jo!]

Don’t mess with Texas seventh grader Eric Yang—at least when it comes to geography. Today the 13-year-old swept the toughest National Geographic Bee to date—with a perfect score.

Yang, of Griffin Middle School in The Colony, Texas, won the annual competition during a tie-breaker round with this question: “Timis County shares its name with a tributary of the Danube and is located in the western part of which European country?”

The answer, Romania, comes with a U.S. $25,000 college scholarship, a lifetime membership in the National Geographic Society, and a trip to the Galápagos Islands with Jeopardy! host and Bee moderator Alex Trebek.

Continue reading at National Geographic . . .

Interview with MarineMAP Mashup Developers (Kelso)

Tuesday, April 21st, 2009


[Editor's note: MarineMAP is a cutting-edge mashup built using PostGIS, GeoDjango, Ajax, Flash, OpenLayers, GeoServer and MapServer with Google base map tiles. It assists stakeholders in the design of MPAs (Marine Protected Areas) in mapping oceanographic, biological, geological, chemical and human dimensions of the ocean and coastal areas. I talk with Will McClintock and Chad Burt of the Marine Science Institute at the University of California at Santa Barbara about the technical underpinnings and development philosophy behind the project. One key to the project's success (rolled out Dec. 2008) has been the hiring of dedicated programmers to implement design ideas and new technology to extend an earlier version's usability and reach. Thanks Melissa and Sebastian!]

Interact with the MarineMAP at

Interactive Map Tool Objective: MarineMap is an internet-based decision support tool that provides the capacity for the SCRSG (South Coast Regional Stakeholder Group) to view data layers, create individual MPA concepts, assemble collections of individual MPA concepts into MPA arrays, receive basic feedback on how well MPA concepts and arrays meet guidelines for MPA design, and submit MPA arrays to staff as MPA proposals. This tool will be the primary way in which MLPA Initiative staff and SCRSG members capture and store information regarding MPA proposals.


(Above) Screenshot showing Marine mammal and Nearshore habitat layers on the base map with the area Measurement Tool enabled.

(Question) Kelso’s Corner: What technologies are leveraged in MarineMAP?

(Answer) MarineMAP: We’re not using ArcGIS at all, save for cutting map tiles (using ArcGIS Desktop and Arc2Earth) and, as a non-critical component of the system, ArcSDE / SQL Server. We’re mainly using PostGIS, GeoDjango, Ajax, Flash, OpenLayers, GeoServer and MapServer, and will soon switch to the Google Earth API.

We are using OpenLayers, rather than the Google Maps API for our “slippy map”. OpenLayers is pure javascript, as is most of the client application. We are using Flex, but only for the charting component. [Editor's note: OpenLayers is using the Google Maps tiles.]

(Q) Kelso’s Corner: How many programmers do you have on staff to deal with all the software components?

(A) MarineMAP: Currently, two of our developers work full-time on MarineMap, while our other two developers work half time. We also have several GIS analysts and a cartographer to deal with the data end of things. We are now looking for a full-time, in-house Assistant Web Developer to continue working on MarineMap. As we extend MarineMap to different geographies and planning processes, we anticipate that we’ll be looking for one or two more programmers as well.

(Q) Kelso’s Corner: What was the rationale for doing this substantial map development in house? Did you evaluate other routes (consultants, off-the-shelf software) before going this route, and why was this option preferable? Did you have a good cheat sheet for how to develop / implement this technology? Did you have to hire new staff to do the programming or did you have existing expertise to draw on?

(Answer) MarineMAP: We did not have a cheat sheet for how to develop / implement this technology. This was a brand new application using some technologies that were new to us and some that we were familiar with. Of course, we had experience developing other applications, and some of these technologies overlapped. But there was a significant amount of learning happening for all of our developers.

The MLPAI is an on-going process that will terminate sometime around 2011. Until then, we need to have a highly functional and stable application that can be adapted to the changing needs of the process. It turned out to be much more cost-effective and time-efficient to hire in-house developers to work on the application year-round. Before we built our team, we spent a significant amount of time considering a host of alternatives, including trying to maintain and tweak Doris, contracting out all of the work, etc. Initially, we felt we did not have enough in-house expertise. Although we already had Chad Burt (UCSB), Jared Kibele (UCSB), Tim Welch (Ecotrust) and, now, Ken Vollmer (Ecotrust) as our in-house crew, we eventually contracted two developers from Farallon Geographics (Dennis Wuthrich and Alexei Peters) for a limited period to help with developing the database schema. This was particularly nice given that we had only 6 months to get the first version of MarineMap out the door. Dennis and Alexei are no longer working on the project but I am very grateful that we had access to their time and expertise during the initial phases.

(Q) Kelso’s Corner: What was Doris?

(Answer) MarineMAP: At the beginning of the Marine Life Protection Act Initiative (MLPAI), staff chose to hire consultants to build an application (eventually called “Doris”) on ArcGIS Server 9.1 technologies. It shared some of the features of MarineMap, including drawing MPAs and arrays, and generating reports on what was being captured inside those MPAs. Doris had a poorly designed interface and, perhaps more significantly, it was dreadfully slow. Consequently, few stakeholders used it. Furthermore, because the application was built using technologies with which we had no particular in-house expertise, and because these technologies were proprietary, we had difficulty updating the application or tweaking it on the fly. (I had been running ArcSDE / ArcIMS and ArcGIS Server applications for a couple of years but had no real development expertise in, say, ArcObjects or VB .Net.)

(Q) Kelso’s Corner: It seems there are many more RubyOnRails developers than Django. Have you found this a hindrance for hiring staff or when looking for trouble shooting advice?

(Answer) MarineMAP: It does seem to be a bit of a challenge finding Django developers, particularly those that can / will work locally. I have not tried to hire a RubyOnRails expert, so I have no direct means of comparison.

(Q) Kelso’s Corner: Why will you be switching to the Google Earth API? Is this only for the front end? Have you been happy with GeoDjango?

(Answer) MarineMAP: GeoDjango has been fantastic. Using the Google Earth API does not mean ditching GeoDjango. Rather, using the Google Earth API represents a shift away from the OpenLayers API. We’ll still be using GeoDjango extensively.

[Our lead developer] was a big proponent of RubyOnRails for quite some time, but Django has brought many of its best ideas to Python. While Ruby is aesthetically a beautiful language, Python is usually much faster and has a more mature set of modules to build on. The only thing I miss after switching over to Django is the database migrations Rails offers. Most open source GIS packages also have bindings for Python, whereas there are few similar tools for Ruby.

Switching to the Google Earth API will just mean replacing OpenLayers. OpenLayers is a very good library, but the Earth API is much faster due to the fact that it is a compiled plugin rather than being written in javascript. This allows it to display thousands of placemarks on screen at once, which is one of the primary reasons for switching. Google Earth can also display temporal and 3d data.
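The Earth API's native data format is KML. As a hypothetical sketch (not MarineMap's code) of how a server might emit placemarks for the plugin to render, using only Python's standard library; the MPA names and coordinates below are invented:

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

def make_kml(points):
    """Build a KML document with one Placemark per (name, lon, lat) tuple."""
    ET.register_namespace("", KML_NS)
    kml = ET.Element("{%s}kml" % KML_NS)
    doc = ET.SubElement(kml, "{%s}Document" % KML_NS)
    for name, lon, lat in points:
        pm = ET.SubElement(doc, "{%s}Placemark" % KML_NS)
        ET.SubElement(pm, "{%s}name" % KML_NS).text = name
        point = ET.SubElement(pm, "{%s}Point" % KML_NS)
        # KML coordinate order is longitude,latitude[,altitude]
        ET.SubElement(point, "{%s}coordinates" % KML_NS).text = "%f,%f" % (lon, lat)
    return ET.tostring(kml, encoding="unicode")

kml_text = make_kml([("MPA one", -120.47, 34.45), ("MPA two", -119.84, 34.41)])
```

The plugin then parses and renders documents like this natively, which is what makes thousands of placemarks feasible.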

(Q) Kelso’s Corner: Besides the change to Google Earth API, what other changes, updates do you plan for this online map?

(Answer) MarineMAP: Besides switching to the Google Earth API, there is one major upcoming update to MarineMap. Specifically, we will be implementing a map-based (i.e., location-based) discussion forum. Users will be able to zoom into a location on a map and tag objects (MPAs, data, places) with a comment. Other users will see these comments (if they have comments “turned on”) as they zoom in to a location or if they load an MPA. Users can then participate in a dialog via a traditional discussion forum that is linked to the map. Furthermore, users will be able to define a geographic region and subscribe to RSS feeds (using GeoRSS) for any activity within that region. One might choose to do this, for example, to be notified by email any time somebody draws a new MPA in, or makes a comment about a data layer in, a specific region that he / she cares most about. I believe the map-based discussion forum will go a long way in facilitating discussion about MPAs, particularly outside the in-person monthly stakeholder meetings.
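Mechanically, such a region subscription boils down to a spatial filter on event coordinates. A minimal Python sketch of the idea, with a made-up bounding box and event names (real GeoRSS also supports polygons and proper geodesy):

```python
def in_region(lat, lon, bbox):
    """Return True if a GeoRSS point falls inside a (south, west, north, east) box."""
    south, west, north, east = bbox
    return south <= lat <= north and west <= lon <= east

# Hypothetical subscription covering part of the Santa Barbara Channel.
subscription = (33.8, -120.7, 34.6, -119.3)

events = [
    ("New MPA drawn", 34.2, -120.1),
    ("Comment on data layer", 36.6, -121.9),  # Monterey area, outside the box
]

# Only events inside the subscribed region would trigger a feed entry.
matches = [name for name, lat, lon in events if in_region(lat, lon, subscription)]
print(matches)  # ['New MPA drawn']
```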

Conclusion: Thanks so much for the informative Q&A session. Please check out the MarineMap project at

El Prado Museum Tour … Now on Google Earth (Duke CIT)

Tuesday, March 10th, 2009

[Editor's note: Google has started to add museum collections to Google Earth. The Prado in Madrid includes a self portrait by Albrecht Dürer, Las Meninas, the dark Goyas, and the Fusilamientos del Tres de Mayo. Video includes a section on how Google took the photos. Thanks KL!]

Republished from the Duke CIT blog.
Originally published January 14th, 2009 by Randy Riddle.

Google has added the El Prado museum to Google Earth, allowing you to not only see the buildings, but to do a “virtual tour” of 14 paintings in the collections, viewing them in incredible detail – each painting is captured and presented in 14 billion pixels.

Below is a short video and you can also read a blog post at Gizmodo about the project.

See SRTM Satellite at Smithsonian’s Udvar-Hazy Center!!! (Kelso)

Friday, February 27th, 2009

I went out to the Smithsonian’s Udvar-Hazy Air and Space museum annex at Dulles International Airport in Chantilly, Virginia last weekend and was pleasantly surprised to see one of the SRTM payloads hanging from the ceiling. The Shuttle Radar Topography Mission (SRTM) program helped produce a significantly more accurate and detailed worldwide digital elevation model (DEM, DTM) in the early part of this decade and was a great leap forward for shaded relief generation. If you make the trip, you’ll find the SRTM between 22 and 23 on the map below in the “space shuttle” hangar. The map does a good job of indicating at what altitude different aircraft can be found in the hangar.

Here’s a photo:

Canister/Mast, Shuttle Radar Topography Mission Payload
(republished from the Smithsonian)
In 2000, the Space Shuttle Endeavour carried the Shuttle Radar Topography Mission (SRTM) payload into orbit. Shuttle astronauts used the payload, manufactured by the AEC-Able Engineering Co., to map in high detail and three dimensions more than 70 percent of the Earth’s surface–the most complete and accurate rendering of the planet’s land masses ever attempted. The Museum possesses two components–the mast canister (this artifact) and the outboard support structure with its antennas–crucial to that mission.

To acquire this data, the SRTM used a novel hardware system that featured a main antenna located in the Shuttle payload bay, a folding mast (in the mast canister) that extended 60 meters from the Shuttle, and then another antenna system that was positioned at the end of the mast (the outboard structure). It was this dual antenna system–the largest rigid structure then flown in space–that produced, through interferometry (a technique for combining the information obtained from the two, separate antennas), a three-dimensional mapping of the Earth.

The mission was a joint undertaking of NASA’s Jet Propulsion Laboratory and the Department of Defense’s National Imagery and Mapping Agency. The military will use the highest resolution data from SRTM for terrain navigation for planes and cruise missiles. A lower resolution data set will be made available to civilian scientists and other users.

NASA transferred these artifacts to the Museum in 2003.

Transferred from NASA
Manufacturer: AEC-Able Engineering Co.
Country of Origin: United States of America
Overall: 292.1cm length x 136cm diameter, 984.3kg weight (9ft 7in. x 4ft 5 9/16in., 2170lb.)
Aluminum, steel, titanium, plastic, copper
Inventory Number: A20040261000

General layout of the museum:

Google Outs Earth 5 with Ocean Floor, More (Electronista)

Tuesday, February 3rd, 2009

UPDATE: Be cautious about installing GE 5 on your Mac. Wired has the details . . .

[Editor's note: New 3d ocean floor elevation data, historical land imagery, ability to record virtual tours, and 3d planet Mars mode come to Google Earth in version 5 released Monday, Feb. 2, 2009.]

Republished from Electronista / MacNN.
Google’s LatLong blog has official coverage: Historical Imagery and Ocean elevation data.

Download version 5 from Google for Mac, Windows, and Linux.

Google on Monday announced the immediate release of Google Earth 5.0, bumping it up from the previous 4.3 build. Among the biggest changes are the inclusion of a detailed 3D ocean floor, the ability to go up to 50 years back in time when looking at a particular location, the ability to record a virtual tour of locations, and a 3D rendition of Mars. The ocean feature was developed together with many partners, including National Geographic, the Monterey Bay Aquarium and the US Navy, among others. The approximately two-thirds of the planet covered by water can now be viewed under water, with videos and images of ocean life, along with details on surf spots, expedition logs and more. The historical images are accessed via a clock icon on the toolbar when viewing a location on the planet. The Touring feature lets travelers show off their journeys by recording their navigation through their destinations and easily sharing the recordings with peers. The fly-throughs can be narrated for an organized flow of a multi-stop journey.

Thanks to a joint project with NASA, Google Earth now also extends beyond Earth to include a 3D map of Mars. Apart from 3D terrain, there are annotations describing the locations and circumstances associated with landing sites and the red planet’s other curiosities.

The download is free for both Mac and Windows PCs. Comprehensive information on the new features of Google Earth will be published throughout the week on Google’s Lat Long blog.

RUMOR: Google Earth 5 on Monday (Google Earth Blog)

Friday, January 30th, 2009

[Editor's note: New features are coming to an application near you. Thanks Laris!]

Republished from Google Earth Blog.
Original publish date: January 26, 2009.

Big Google Earth Announcement with Al Gore and More

The tech world was abuzz this weekend with rumors about a big upcoming event concerning Google Earth. WebProNews and AppScout were the first to report on Friday. Google has sent out an invitation to the press, including Google Earth Blog, for a “Special announcement about Google Earth” on February 2nd in San Francisco. And this event looks like it could be the biggest announcement since Google Earth was released! Speakers include: former Vice President Al Gore, CEO of Google Eric Schmidt, VP of Google Marissa Mayer, and Director of Google Geo John Hanke. Wow!

There are no specifics on the announcement mentioned in the invitation. Just some comments about how Google Earth has reached hundreds of millions of people around the world. The last time Google had this many dignitaries to make an announcement for Google Earth was in June of 2006 when they announced the upcoming release of Google Earth 4. Eric Schmidt and the two co-founders of Google (Larry Page and Sergey Brin) were there for the announcement made by John Hanke at that event. Google Earth 4 introduced photorealistic textures to 3D models, GE for the Mac and Linux, multi-lingual support, and a huge global imagery update covering many countries for the first time.

Another clue for this announcement was the list of other speakers: Sylvia Earle, Explorer-in-Residence for the National Geographic Society; Terry Garcia, EVP for the National Geographic Society; and Greg Farrington, Executive Director for the California Academy of Sciences. The last one isn’t surprising because the invitation says the announcement will be held at the California Academy of Sciences.

The big clue is Sylvia Earle. As pointed out by everyone, Sylvia Earle is a world renowned oceanographer. So, of course, the immediate conclusion is that Google Ocean is finally about to be introduced. Rumors have been flying about Google Ocean for quite a while.

So, clearly Google Earth is going to get some new Ocean-related data. Google just added new detailed ocean floor imagery last week. And, it’s a known fact that several of the parties involved with that also have worked on 3D bathymetry. Google Earth to date has not had many layers which provide data about the ocean. And the ocean terrain has always been flat (2D) in Google Earth. More ocean data is an area I’ve been looking forward to with great anticipation. Especially since this year my wife and I are departing to spend the next five years circumnavigating the oceans by sailboat. Having Google Earth help us explore the oceans will be handy! Google Earth has needed more information about the 75% of the Earth most of us ignore.

I don’t think this announcement will be confined to just Google Ocean though. When Google makes an announcement like this, they always try to push the envelope on multiple fronts. And, with Al Gore headlining the event, I’m sure we’re going to get some data about the environment. I’m expecting lots of new features and data to write about in February. It’s going to be exciting! I just wish I could attend the event myself – but, unfortunately we’ve got plans for next week which keep me from going. But, have no fear, I’ll still be reporting on this major event!

Using Wii Balance Board to Fly Through Google Earth (Google)

Friday, January 16th, 2009

[Editor's note: And here I thought the Wii was just for bowling, lol.]

Republished from Google Lat Long blog.
Thursday, January 8, 2009 at 1:56 PM

This year for Macworld I decided to create a program that allows people to “surf” any region of the Earth’s surface using a Nintendo Wii Balance Board and the Google Earth API. To do this, I used the Google Earth Browser Plug-in with its Javascript API. The Wii Balance Board transmits your movements to the Earth Surfer application using Bluetooth and allows you to maneuver a virtual milk truck by shifting your balance as if you were on a surfboard.

Check out the following video to see it in action:

While it’s fun to use Earth Surfer, I really wrote it to inspire others to write their own programs. It’s all open source using the Apache License, so you can use the code in your own programs, even commercial ones.

It is based on Thatcher Ulrich’s terrific Javascript Monster Milktruck demo, which is an open source program on a webpage. I wrapped it as a Macintosh application program so I could add Objective-C. Objective-C uses the Macintosh Bluetooth support to decode the Bluetooth packets from the Wii Balance Board. The Balance Board support is my work; I based it on DarwiinRemote, an open source decoder for the Wii Remote.
Earth Surfer and its source code will be available next week on the Google Mac Developer Playground.

Another amusing hack for surfing in Google Earth.

KML to Shapefile File Conversion (Zonum)

Friday, December 12th, 2008

[Editor's note: Useful free tool for converting KML files to Shapefile for use in the GIS. Thanks Mary Kate!]

Republished from Zonum Solutions. Kml2shp file conversion

Need to transfer Google Earth data to a GIS? Kml2shp transforms KML files into ESRI Shapefiles.

Download. Windows program. No Mac version.

The KML file could contain Points, Paths and Polygons. When creating SHP files the information is separated into thematic layers.

For each shapefile (shp), an attributes table (dbf) and index file (shx) are created.
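Kml2shp's own source isn't shown, but the separation into thematic layers can be sketched in Python: group each Placemark by its geometry type, one group per would-be shapefile. This is an illustration with the standard library, not the tool's actual code:

```python
import xml.etree.ElementTree as ET

NS = {"kml": "http://www.opengis.net/kml/2.2"}

def split_by_geometry(kml_text):
    """Group Placemarks by geometry type, mirroring how a KML-to-shapefile
    converter must emit a separate shapefile per shape type."""
    layers = {"points": [], "lines": [], "polygons": []}
    root = ET.fromstring(kml_text)
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        if pm.find(".//kml:Point", NS) is not None:
            layers["points"].append(pm)
        elif pm.find(".//kml:LineString", NS) is not None:  # a KML "Path"
            layers["lines"].append(pm)
        elif pm.find(".//kml:Polygon", NS) is not None:
            layers["polygons"].append(pm)
    return layers

sample = (
    '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
    '<Placemark><Point><coordinates>-119.8,34.4</coordinates></Point></Placemark>'
    '<Placemark><LineString><coordinates>0,0 1,1</coordinates></LineString></Placemark>'
    '</Document></kml>'
)
layers = split_by_geometry(sample)
print(len(layers["points"]), len(layers["lines"]), len(layers["polygons"]))  # 1 1 0
```

Writing the actual shp/dbf/shx binaries is a separate job, handled in practice by libraries such as pyshp or OGR.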

The kml to shp conversion consists of three steps:

1) Open KML file
2) Choose Shape Type
3) Select output Shapefile name

Optionally, you can change from WGS84 to a local datum and from Lat/Lon to UTM.
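A full Lat/Lon-to-UTM conversion involves ellipsoidal series expansions, but the zone selection that drives it is simple arithmetic. A sketch in Python (standard 6-degree zones, ignoring the Norway and Svalbard exceptions):

```python
def utm_zone(lon_deg):
    """Standard UTM longitudinal zone (1-60) for a longitude in degrees."""
    # Zones are 6 degrees wide, numbered eastward from 180W.
    return min(int((lon_deg + 180.0) // 6) + 1, 60)

print(utm_zone(-119.7))  # 11  (Southern California coast)
print(utm_zone(0.0))     # 31
```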

Also, Kml2shp can export to AutoCAD (DXF) and GPS (GPX)

kml2Shp is a beta freeware tool. The program doesn’t need to be installed; just unzip it and run it. The download contains the executable file (kml2shp.exe) and some bpl files. If you receive an error message about missing bpl files, come back here and get them.