Posts Tagged ‘Mashup’

One Planet Many People: Atlas of Our Changing Environment (UNEP)

Friday, July 31st, 2009


[Editor's note: Fun site from the United Nations Environment Programme highlighting changes in the natural environment with side-by-side remotely sensed imagery and a full write-up of each place. Done both in Google Maps and available as a Google Earth feed. The map is fairly decent.]

Republished from United Nations Environment Programme.

Increasing concern as to how human activities impact the Earth has led to documentation and quantification of environmental changes taking place on land, in the water, and in the air. Through a combination of ground photographs, current and historical satellite images, and narrative based on extensive scientific evidence, this publication illustrates how humans have altered their surroundings and continue to make observable and measurable changes to the global environment.

Continue to Interactive Atlas: Google Maps | Google Earth

Travellr: Behind the Scenes of our Region-Based Clusters (Google GeoDev)

Monday, July 6th, 2009

[Editor’s note: The age-old choropleth-mapping rule that data should be aggregated into multi-scale areal units based on the map’s zoom level is slowly seeping into “clustering” for the point-based mashup geo community. This overview from Travellr, published on the Google GeoDevelopers blog, includes two illustrations that show the power of this technique. I used a similar technique (different implementation) on The Washington Post’s recent swine flu mapping.]

Republished from Google GeoDevelopers Blog.
Monday, June 22, 2009

Recently, there has been a lot of interest in clustering algorithms. The client-side grid-based MarkerClusterer was released in the open source library this year, and various server-side algorithms were discussed in the Performance Tips I/O talk. We’ve invited the Travellr development team to give us insight on their unique regional clustering technique.

Travellr is a location aware answers service where people can ask travel-related questions about anywhere in the world. One of its features is a map-based interface to questions on the site using Google Maps.

Figure 1. An example of the Travellr Map, showing question markers for Australia.

Clustering for usability
We learned that the best way to display markers without cluttering our map was to cluster our questions depending on how far you zoom in. If the user was looking at a map of the continents, we would cluster our questions into a marker for each continent. If the user zoomed in on France, we would then cluster our questions into a marker for each region or city that had questions. By clustering our data into cities, regions/states, countries, and continents, we could display relevant markers on the map depending on what zoom level the user was looking at.
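The zoom-to-tier mapping described above can be sketched in a few lines. This is a minimal Python illustration with invented zoom thresholds, not Travellr's actual cutoffs:

```python
# Map a slippy-map zoom level to the location tier used for clustering.
# The threshold values here are illustrative assumptions.
def cluster_level(zoom):
    """Return which location tier to cluster markers by at this zoom."""
    if zoom <= 3:
        return "continent"   # whole-world view
    elif zoom <= 6:
        return "country"
    elif zoom <= 9:
        return "region"      # region/state
    else:
        return "city"
```

Each pan or zoom event would then ask the server only for markers at the tier this function returns.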

Optimizing for Clustering
Our next challenge was how to extract clustered data from our database without causing excessive server load. Every time the user pans and zooms on the map, we need to query and fetch new clustered data in order to display the markers on the map. We also might have to limit the data if the user has selected a tag, as we’re only interested in questions related to a topic (e.g., “surfing”). To execute this in real time would be painfully slow, as you would need to cluster thousands of questions in thousands of locations with hundreds of tags on the fly. The answer? Pre-cluster your data, of course!

Step 1. Structure your location data
When a question is asked about a city on Travellr, we also know its region/state, country, and continent. We store more than 55,000 location points as a hierarchy, with each location “owning” its descendant nodes (and all of their data). Our locations are stored in a Modified Preorder Tree (also called Nested Sets), a popular method of storing hierarchical data in a flat database table, with a focus on efficient data retrieval and easy handling of subtrees. For each location we also keep a record of its depth within the tree, its location type (continent, country, region/state, or city), and its co-ordinates (retrieved using the Google Maps geocoder).
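The nested-set layout can be sketched with a small in-memory table. The schema and sample rows below are illustrative, not Travellr's actual data; the key property is that a node's whole subtree falls between its left and right values, so fetching descendants is a single range query with no recursion:

```python
import sqlite3

# Minimal nested-set (modified preorder tree) sketch with invented data.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE locations (
    id INTEGER PRIMARY KEY, name TEXT, depth INTEGER,
    lft INTEGER, rgt INTEGER)""")
rows = [
    (1, "Europe", 0, 1, 8),   # the continent "owns" everything in 1..8
    (2, "France", 1, 2, 7),
    (3, "Paris",  2, 3, 4),
    (4, "Lyon",   2, 5, 6),
]
conn.executemany("INSERT INTO locations VALUES (?, ?, ?, ?, ?)", rows)

def descendants(name):
    """All descendants of a node: rows strictly inside its lft..rgt range."""
    cur = conn.execute("""
        SELECT c.name FROM locations p
        JOIN locations c ON c.lft > p.lft AND c.rgt < p.rgt
        WHERE p.name = ? ORDER BY c.lft""", (name,))
    return [r[0] for r in cur]
```

For example, `descendants("France")` returns Paris and Lyon without any recursive traversal.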

Step 2. Aggregate your data
We calculate aggregate data for every branch of our locations tree ahead of time. By storing aggregate data for cities, regions/states, countries, and continents, we provide an extremely fast and inexpensive method to query our locations database for any information regarding questions asked about a particular location. This data is updated every few minutes by a server-side task.

Our aggregations include:

  • Total question count for a location
  • Most popular tags for that location
  • Number of questions associated with each of those tags.
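The aggregation pass above might look like the following minimal Python sketch; the data model and field names are invented for illustration, not Travellr's schema:

```python
from collections import Counter

# Invented sample questions, each tagged with its full location path.
questions = [
    {"path": ["Oceania", "Australia", "Sydney"],    "tags": ["surfing"]},
    {"path": ["Oceania", "Australia", "Sydney"],    "tags": ["food"]},
    {"path": ["Oceania", "Australia", "Melbourne"], "tags": ["food"]},
]

# Roll question counts and tag tallies up the hierarchy, so map queries
# read precomputed totals instead of scanning raw questions.
aggregates = {}  # location name -> {"count": int, "tags": Counter}
for q in questions:
    for loc in q["path"]:  # credit the city and every ancestor
        agg = aggregates.setdefault(loc, {"count": 0, "tags": Counter()})
        agg["count"] += 1
        agg["tags"].update(q["tags"])
```

After the pass, the Australia marker can report three questions and its most popular tag without touching the question table.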

How we query our structured, aggregate data on the map
Whenever the user zooms or pans the map we fire off a query to our (unpublished ;) API with the tags they are searching for, the current zoom level, and the edge co-ordinates of the map’s bounding box. Based on the zoom level (Figure 2) we work out whether we want to display markers for continents, countries, states, or cities. We then send back the data for these markers and display them on the map.
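The query above reduces to filtering the pre-aggregated locations by tier and by the viewport's bounding box. A minimal Python sketch, with all names and data invented for illustration (Travellr's actual API is unpublished):

```python
# Invented pre-aggregated marker data: one row per location tier entry.
locations = [
    {"name": "France", "type": "country", "lat": 46.2,  "lng": 2.2},
    {"name": "Paris",  "type": "city",    "lat": 48.9,  "lng": 2.4},
    {"name": "Brazil", "type": "country", "lat": -14.2, "lng": -51.9},
]

def markers_in_view(locations, tier, bbox):
    """Markers of the given tier inside bbox = (south, west, north, east)."""
    south, west, north, east = bbox
    return [loc for loc in locations
            if loc["type"] == tier
            and south <= loc["lat"] <= north
            and west <= loc["lng"] <= east]
```

A real implementation would also handle viewports that cross the antimeridian, where west is greater than east.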

Figure 2. Clustering at different zoom levels (blue = continents and countries, pink = states and cities)

Everyone Wins
So what is the result of structuring and aggregating our data in such a way? It means that we have nicely organized, pre-clustered data that can be read from cheaply and easily. This allows us to provide a super-fast map interface for Travellr that puts minimal load on our infrastructure. Everyone is happy!

Comments or Questions?
We’d love to hear from you if you have any questions on how we did things, or suggestions or comments about Travellr’s map. This article was written by Travellr’s performance and scalability expert Michael Shaw (from Insight4) and our client-side scripting aficionado Jaidev Soin.

You can visit Travellr at www.travellr.com, or follow us on Twitter at twitter.com/travellr.

Yahoo! Geo Technologies

Monday, June 22nd, 2009

[Editor’s note: Yahoo! provides advanced mapping capabilities including GeoPlanet, a Web 2.0 gazetteer of world placenames (see also GeoNames’ post on relational ontology and the semantic web).]

Republished from Yahoo!

Yahoo! wants to connect the Web to the World; here you can access our increasing portfolio of platforms to help you geo-enrich your applications and make the Internet more location-aware:

GeoPlanet™: Provides the geographic developer community with the vocabulary and grammar to describe the world’s geography in an unequivocal, permanent, and language-neutral manner. (Blog post)

GeoPlanet Data: Tab-delimited files containing WOEIDs and the corresponding place names that underlie GeoPlanet.

Placemaker™: Identify, disambiguate, and ‘extract’ places from unstructured and structured textual content to help create local- and location-aware applications. (Blog post)

Fire Eagle™: Allows users to share their location with sites and services through the Web or a mobile device.

Maps: Embed rich and interactive maps into your web and desktop applications.

INTERACTIVE MAP: Impacted U.S. GM Plants (Kelso via Wash Post)

Wednesday, June 3rd, 2009

[Editor's note: This interactive from Tuesday shows plants set to close under G.M.'s restructuring plan in the context of all of G.M.'s manufacturing facilities. Checkboxes allow different types of facilities to be filtered, and zoom presets make it easier to zoom into clusters of markers. A reset button allows the display to return to its original state. A table below the map contains some of the same information.]

Republished from The Washington Post.

Under GM’s restructuring plan, the automaker’s manufacturing facilities will be reduced to 33 by 2012. Three distribution centers will also be eliminated.

Click on map icons for plant name, location, number of employees and more. Screenshot below.

Interact with original version at The Washington Post . . .


Mapping Foreclosures in the New York Region (NY Times)

Wednesday, June 3rd, 2009

[Editor’s note: This interactive Google Maps mashup in Flash AS3 from The New York Times shows vector overlays of choropleth mapping by census tract and, at the street level, dot distribution. As the user zooms in, the dots are revealed, as is a street map. At all levels the census-tract summary statistics are available with a mouse-over. Zooms are preset for some areas, and the user can type in their own address to zoom to that area. Multiple years add a time dimension. Spatial brushing on the map is accomplished by outlining the geography’s stroke, not changing the fill color. Thanks Laris!]

Republished from the New York Times. May 15, 2009

A New York Times analysis found that foreclosure rates in the region were highest in areas with high minority populations. Zoom in to see foreclosures at the street level. Screenshot below.

Interact with the original at New York Times . . .


MAP: Track Swine Flu Cases (Kelso via Wash Post)

Thursday, April 30th, 2009

[Editor's note: This is my mashup for The Washington Post tracking the H1N1 swine flu outbreak. It is custom built using the Google Maps for Flash API. Reporting is by Washington Post staff and is updated several times each day. Zoom in and out of the map to see more detail and different symbolization approaches.]

Screenshots below:

Interact with the original at The Washington Post . . .

Use our interactive mashup to track the distribution of swine flu (H1N1) cases around the world. Click on the map markers to learn more about cases at each location. 

SOURCES: Staff reports; Centers for Disease Control and Prevention; and World Health Organization. Interactive by Nathaniel V. Kelso; Research by Madonna Lebling, Robert Thomason, Mary Kate Cannistra and April Umminger – The Washington Post. First published April 27, 2009 at 10 p.m.

World Map of Swine Flu Outbreak


Interact with the original at The Washington Post . . .

ESRI’s ArcGIS Server Provides Foundation for Maryland’s MD iMap (ESRI)

Tuesday, April 28th, 2009


[Editor’s note: One of the more useful + powerful sites to leverage new Flash / Flex mashup capabilities of new ArcGIS 9.3 release. The site is designed both for state residents and government policy makers. Thanks Mary Kate!]

Republished from ESRI and State of Maryland. Original Feb. 11, 2009.

Authoritative Statewide Basemap and Performance Measurement Tool Serves Government and Citizens

Redlands, California—Maryland Governor Martin O’Malley recently launched the ArcGIS Server software-based MD iMap, an authoritative online basemap of Maryland that allows government and citizens to assess state, local, and municipal performance. As the portal into the state’s enterprise geographic information system (GIS), MD iMap also provides data to governments throughout the state including seamless, geocoded statewide centerlines and six-inch imagery. MD iMap embodies O’Malley’s vision of “one Maryland, one map.”

“In Maryland, GIS is vital to setting goals, tracking performance, and creating transparency,” said O’Malley. “We have been using GIS for years to increase government accountability and efficiency and to enhance transparency. With one comprehensive and interactive map for Maryland, our citizens will have access to unprecedented information online. From land conservation to public safety, the possibilities are endless when government becomes transparent and accountable to the citizens it serves.”

GreenPrint is the first GIS-based performance measurement application that is accessible via MD iMap. It is a planning tool designed to help government staff, conservation organizations, and individual citizens make good decisions about land conservation and growth. The state’s other performance measurement applications, including StateStat and BayStat, will be added soon.

To support government staff in Maryland, a secure agency login on the MD iMap Web site home page connects users to Maryland GIS Online, which is built with ArcGIS Online. On that site, staff can download data and Web services from other government entities in the state. In addition to significantly enhancing data sharing and coordination, the portal is innovative in its delivery of real-time, up-to-date statistics in one sleek, user-friendly interface.

“Governor O’Malley’s vision of one Maryland, one map, speaks to the best in government including accountability, unity, and service to citizens. It is also an outstanding example of a public and private partnership driving government forward,” said ESRI president Jack Dangermond.

Interact with the original at MDiMap . . .

Density Mapping in Google Maps with HeatMapAPI (GeoChalkboard)

Monday, April 27th, 2009

[Editor's note: Heat maps are a useful way to qualitatively represent densely clustered point locations on a map. This post from GeoChalkboard walks through how to create one using the Google Maps API and the HeatMapAPI.]

Republished from GeoChalkboard, there on March 11, 2009.

In the GIS world heat maps are a graphical representation of point data on a map through the use of colors that indicate the density of some variable such as crime incidents or traffic accidents.  Heat maps let users quickly visualize the density of locations. Being able to understand the density of point locations makes it much easier to see patterns in your data, especially when using colors. In this post we’re going to examine the HeatMapAPI, a JavaScript API for creating heat maps in Google Maps.


(Above) 2009 Starbucks Store Closures

Introduction
HeatMapAPI can be used over the Internet or as a .NET DLL that runs in a local environment and allows you to integrate heat map images into Google Maps or other GIS systems.  In this post we’re going to use HeatMapAPI to visualize the density of recent Starbucks store closures.  In a recent statement, Starbucks announced the closure of 600+ stores in the United States due to economic conditions.
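A heat map is, at bottom, a density surface. The following minimal Python sketch bins point locations into a lat/lng grid to illustrate the underlying idea; it does not reproduce HeatMapAPI's actual calls, and the one-degree cell size is an arbitrary assumption:

```python
from collections import Counter

def density_grid(points, cell=1.0):
    """Count points per grid cell of `cell` degrees.

    Returns {(row, col): count}, where row/col are floored lat/lng
    divided by the cell size. A renderer would then map counts to a
    color ramp to produce the heat-map effect.
    """
    grid = Counter()
    for lat, lng in points:
        grid[(int(lat // cell), int(lng // cell))] += 1
    return grid
```

Denser cells get hotter colors; a production heat map would additionally smooth the surface (e.g., with a kernel) rather than drawing hard cell edges.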

Continue reading at GeoChalkboard . . .

Interview with MarineMAP Mashup Developers (Kelso)

Tuesday, April 21st, 2009


[Editor’s note: MarineMAP is a cutting-edge mashup built using PostGIS, GeoDjango, Ajax, Flash, OpenLayers, GeoServer and MapServer with Google base map tiles. It assists stakeholders in the design of MPAs (Marine Protected Areas) by mapping oceanographic, biological, geological, chemical and human dimensions of the ocean and coastal areas. I talk with Will McClintock and Chad Burt of the Marine Science Institute at University of California at Santa Barbara about the technical underpinnings and development philosophy behind the project. One key to the project’s success (rolled out Dec. 2008) has been the hiring of dedicated programmers to implement design ideas and new technology to extend an earlier version’s usability and reach. Thanks Melissa and Sebastian!]

Interact with the MarineMAP at marinemap.org/marinemap.

Interactive Map Tool Objective: MarineMap is an internet-based decision support tool that provides the capacity for the SCRSG (South Coast Regional Stakeholder Group) to view data layers, create individual MPA concepts, assemble collections of individual MPA concepts into MPA arrays, receive basic feedback on how well MPA concepts and arrays meet guidelines for MPA design, and submit MPA arrays to staff as MPA proposals. This tool will be the primary way in which MLPA Initiative staff and SCRSG members capture and store information regarding MPA proposals.


(Above) Screenshot showing Marine mammal and Nearshore habitat layers on the base map with the area Measurement Tool enabled.

(Q) Kelso’s Corner: What technologies are leveraged in MarineMAP?

(A) MarineMAP: We’re not using ArcGIS at all, save for cutting map tiles (using ArcGIS Desktop and Arc2Earth) and, as a non-critical component of the system, ArcSDE / SQL Server. We’re mainly using PostGIS, GeoDjango, Ajax, Flash, OpenLayers, GeoServer and MapServer, and will soon switch to the Google Earth API.

We are using OpenLayers, rather than the Google Maps API for our “slippy map”. OpenLayers is pure javascript, as is most of the client application. We are using Flex, but only for the charting component. [Editor's note: OpenLayers is using the Google Maps tiles.]

(Q) Kelso’s Corner: How many programmers do you have on staff to deal with all the software components?

(A) MarineMAP: Currently, two of our developers work full-time on MarineMap, while our other two developers work half time. We also have several GIS analysts and a cartographer to deal with the data end of things. We are now looking for a full-time, in-house Assistant Web Developer to continue working on MarineMap. As we extend MarineMap to different geographies and planning processes, we anticipate that we’ll be looking for one or two more programmers as well.

(Q) Kelso’s Corner: What was the rationale for doing this substantial map development in house? Did you evaluate other routes, consultants, or off-the-shelf software before going this route, and why was this option preferable? Did you have a good cheat sheet for how to develop / implement this technology? Did you have to hire new staff to do the programming or did you have existing expertise to draw on?

(A) MarineMAP: We did not have a cheat sheet for how to develop / implement this technology. This was a brand new application using some new technologies, and some that we were familiar with. Of course, we had experience developing other applications and some of these technologies overlapped. But there was a significant amount of learning happening for all of our developers.

The MLPAI is an ongoing process that will terminate sometime around 2011. Until then, we need to have a highly functional and stable application that can be adapted to the changing needs of the process. It turned out to be much more cost-effective and time-efficient to hire in-house developers to work on the application year-round. Before we built our team, we spent a significant amount of time considering a host of alternatives, including trying to maintain and tweak Doris, contracting out all of the work, etc. Initially, we felt we did not have enough in-house expertise. Although we already had Chad Burt (UCSB), Jared Kibele (UCSB), Tim Welch (Ecotrust) and, now, Ken Vollmer (Ecotrust) as our in-house crew, we eventually contracted two developers from Farallon Geographics (Dennis Wuthrich and Alexei Peters) for a limited period to help with developing the database schema. This was particularly nice given that we had only six months to get the first version of MarineMap out the door. Dennis and Alexei are no longer working on the project, but I am very grateful that we had access to their time and expertise during the initial phases.

(Q) Kelso’s Corner: What was Doris?

(A) MarineMAP: At the beginning of the Marine Life Protection Act Initiative (MLPAI), staff chose to hire consultants to build an application (eventually called “Doris”) that was built on ArcGIS Server 9.1 technologies. It shared some of the features of MarineMap, including drawing MPAs and arrays, and generating reports on what was being captured inside those MPAs. Doris had a poorly designed interface and, perhaps more significantly, it was dreadfully slow. Consequently, few stakeholders used it. Furthermore, because the application was built using technologies with which we had no particular in-house expertise, and because these technologies were proprietary, we had difficulty updating the application or tweaking it on the fly. (I had been running ArcSDE / ArcIMS and ArcGIS Server applications for a couple of years but had no real development expertise in, say, ArcObjects or VB .Net.)

(Q) Kelso’s Corner: It seems there are many more RubyOnRails developers than Django. Have you found this a hindrance for hiring staff or when looking for trouble shooting advice?

(A) MarineMAP: It does seem to be a bit of a challenge finding Django developers, particularly those that can / will work locally. I have not tried to hire a RubyOnRails expert, so I have no direct means of comparison.

(Q) Kelso’s Corner: Why will you be switching to the Google Earth API? Is this only for the front end? Have you been happy with GeoDjango?

(A) MarineMAP: GeoDjango has been fantastic. Using the Google Earth API does not mean ditching GeoDjango. Rather, using the Google Earth API represents a shift away from the OpenLayers API. We’ll still be using GeoDjango extensively.

[Our lead developer] was a big proponent of RubyOnRails for quite some time, but Django has taken many of its best ideas to Python. While Ruby is aesthetically a beautiful language, Python is usually much faster and has a more mature set of modules to build on. The only thing I miss after switching over to Django is the database migrations Rails offers. Most open source GIS packages also have bindings for Python, whereas there are few similar tools for Ruby.

Switching to the Google Earth API will just mean replacing OpenLayers. OpenLayers is a very good library, but the Earth API is much faster due to the fact that it is a compiled plugin rather than being written in javascript. This allows it to display thousands of placemarks on screen at once, which is one of the primary reasons for switching. Google Earth can also display temporal and 3d data.

(Q) Kelso’s Corner: Besides the change to Google Earth API, what other changes, updates do you plan for this online map?

(A) MarineMAP: Besides switching to the Google Earth API, there is one major upcoming update to MarineMap. Specifically, we will be implementing a map-based (i.e., location-based) discussion forum. Users will be able to zoom into a location on a map and tag objects (MPAs, data, places) with a comment. Other users will see these comments (if they have comments “turned on”) as they zoom in to a location or if they load an MPA. Users can then participate in a dialog via a traditional discussion forum that is linked to the map. Furthermore, users will be able to define a geographic region and subscribe to RSS feeds (using GeoRSS) for any activity within that region. One might choose to do this, for example, to be notified by email any time somebody draws a new MPA in, or makes a comment about a data layer in, a specific region that he / she cares most about. I believe the map-based discussion forum will go a long way in facilitating discussion about MPAs, particularly outside the in-person monthly stakeholder meetings.
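The region subscription described above reduces to a point-in-region test for each new map event. A minimal sketch, assuming a simple bounding-box region; the field names are invented for illustration and are not MarineMap's API:

```python
def event_matches(subscription, event):
    """True if a map event (comment, new MPA, etc.) falls inside a
    user's subscribed region, modeled here as a lat/lng bounding box
    (south, west, north, east)."""
    south, west, north, east = subscription["bbox"]
    return south <= event["lat"] <= north and west <= event["lng"] <= east
```

Events passing this test would be appended to the user's GeoRSS feed; a polygon region would swap the box test for a point-in-polygon check.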

Conclusion: Thanks so much for the informative Q&A session. Please check out the MarineMap project at MarineMap.org/marinemap.