Archive for the ‘Best practices’ Category

Anthropogenic transformation of the terrestrial biosphere

Sunday, October 7th, 2012

[Editor’s note: The new “anthropocene” age has been detailed by the likes of The Economist, National Geographic Magazine, The New York Times, and Wired. While much has been made of miles of road, the general interconnected transportation network, population density, and other measures, I’m most captivated by this newish map showing “years of intensive use”. Get your Jared Diamond out and study this map. Thanks Hugo! More maps at Ecotope, thanks Andrew!]

Perhaps the most obvious mark we’ve made to the planet is in land-use changes. For millennia, humans have chopped down forests and moved rock and soil for agriculture and pastureland—and more recently, for construction.

Anthropogenic transformation of the terrestrial biosphere

CREDIT: ERLE ELLIS, ADAPTED FROM E. ELLIS, PROCEEDINGS OF THE ROYAL SOCIETY A, 369:1010 (2011) From the Science article “A global perspective on the anthropocene” DOI: 10.1126/science.334.6052.34

2012 Mountain Cartography Workshop presentations from New Zealand

Wednesday, September 12th, 2012

The 8th ICA Mountain Cartography Workshop was held in Tongariro National Park, New Zealand (map) on 1–5 September 2012. A couple dozen participants gathered to talk about maps and mountains, and to enjoy some outdoor recreation. Martin, Tom, and I joined from the United States; other participants ranged from Argentina to Norway, plus the usual contingent from Switzerland and Austria. We had wonderful hosts in the New Zealand organizing committee.

I’ve gathered our presentations together here as a reference; enjoy.

PDFs are linked below. Original PowerPoints over there »

Aileen Buckley: NAGI Fusion Method
Download PDF (2.6 mb) »

Benjamin Schroeter: Glacier variations – Projects at the Institute of Cartography, TU Dresden

Download PDF (22 mb) »

Martin Gamache: Cartography beyond the Planimetric
Download PDF (40 mb) »

Nathaniel V. KELSO: Cartography at Stamen Design
Download the PDF (16 mb) »

Geoff Aitken: The Tramper’s Map of the Tararua Mountain System circa 1936
Download PDF (2.7 mb) »

Dusan Petrovic: Designing photo-realistic and abstract mountain maps for a 3d mapping study
Download PDF (3.7 mb) »

Martin Gamache: Americans on Everest – 50 year anniversary, mapping with the iPad
Download PDF (28 mb) »

Roger Wheate: Visualization of changes in the alpine glaciers in western Canada
Download PDF (8 mb) »

Lorenz Hurni: Glacier DEM reconstruction based on historical maps: A semi-automated approach
Download PDF (5.6 mb) »

Sebastian Vivero: A new digital terrain model for the Tasman Glacier, New Zealand, using digital photogrammetry techniques.
Download PDF (2.7 mb) »

Karel Kriz: User interaction and design issues for the Tyrolean AWC Portal
Download PDF (1.8 mb) »

Stefan Raeber: Panoramic Maps – Evaluating the usability and effectiveness
Download PDF (2.8 mb) »

Karel Kriz: Decomposing an exhibition map
Download PDF (1.4 mb) »

Roger Smith: Texture maps with Photoshop
Download PDF (1.9 mb) »

Georg Gartner: Putting emotions in maps – Towards supporting wayfinding
Download PDF (2.7 mb) »

Karel Kriz: Needs, concepts and realization of a mountain compliant smartphone app
Download PDF (2.5 mb) »

Antoni Moore: Multi-modal exploration of rugged digital terrain on mobile devices
Download PDF (0.5 mb) »

Martin Gamache: Relief approaches at National Geographic Magazine
Download PDF (70 mb) »

Roger Smith: Maps & geomorphology
Download PDF (8.8 mb) »

Geoff Aitken: New topographic mapping
Download PDF (2 mb) »

Nathaniel Vaughn Kelso: Create your own terrain maps
Download PDF (12 mb) »

Tom Patterson: Mountains unseen – Developing a relief map of the Hawaiian seafloor
Download PDF (5.7 mb) »

Martin Gamache: Yosemite National Park and El Capitan
Download PDF (7 mb) »

Martin Gamache: Animation of projections used in the National Geographic Mapping the Oceans supplement
(view video above)

Andrew Steffert: Lidar flood mapping
Download PDF (32 mb) »

A friendlier PostGIS? Top three areas for improvement

Friday, August 10th, 2012

I prompted a flurry of PostGIS hate (and some love) on Twitter last week, documented via Storify.

I’ve been using PostGIS for around 2 years now, both at Stamen and before that at The Washington Post. Let me say upfront that PostGIS is amazing and is definitely among the top 5 best FOSS4G (free and open source software for geo) projects out there. It is a super powerful spatial data store that is free to download, install, and use (even in commercial projects!). It can also be mind-numbingly difficult to install and use.

It doesn’t matter how awesome something is if it isn’t usable. If we want the FOSS4G community to grow and be adopted by more everyday GIS users and general users with spatial data needs, we need to improve this situation. “Patches welcome” is a programmer’s crutch. Actually following up on users’ real-world issues is where it’s at.

Besides the specific issues outlined below, PostGIS lacks basic spatial analysis functions found in ArcToolbox. Those are slowly being rolled out as sidecar projects running on top of PostGIS, and CartoDB is a good case in point. But unless you’re a programmer who can roll your own (and your project budget can afford it), that’s a #fail.

@PostGIS asked me for details on how it could be friendlier and I’ve itemized around 20 below.

Top 3 areas for PostGIS improvement

1. EASIER TO INSTALL

If a project is considered core to the FOSS4G stack (e.g. Mapnik, PostGIS, etc.), it needs to act like it.

Our servers at Stamen run Ubuntu Linux and we have a variety of them, running different combinations of applications and operating systems. Our staff machines are Mac laptops. There are some pretty good installers now for Windows and Mac, it seems. But the Ubuntu support has been out of sync too often.

  • Request 1a: Core FOSS4G projects should be stable and registered in the official, maintained Ubuntu APT package list.

Distributing via private PPAs that are hard for end users to discover, and more cowboy in their robustness, is poor practice.

  • Request 1b: The APT package distribution of core FOSS4G projects should work with the last 2 versions (equivalent to 2 years) of Ubuntu LTS support releases, not just the most recent cutting edge dot release.

As of today, the latest LTS is 12.04; before that it was 10.04. We just upgraded to 12.04 and are slowly upgrading the rest of our FOSS4G stack. This type of staggered versioning is standard in production environments.

While it’s nice to have cutting edge features, we also need to acknowledge that one app’s cutting edge features & cutting edge dependencies are another end-user’s dependency hell when installed with other software in the FOSS4G stack.

What’s amazing about Ubuntu is that they tell you up-front exactly how long they plan to support a particular version, in months and years (view timeline).

There should be an overlap period between the versions distributed in major package systems and the versions supported by the developers themselves, as well as an overlap with the release cycle of a system like Ubuntu. For instance, Mapnik is thankfully now in this state, but for a long time it supported the 0.7 release inconsistently even though 0.7 was the only version widely available.

  • Request 1c: Backport key bug fixes to the prior release series.

We’ve all been burned by FOSS4G maintainers when they say they’ve fixed problems in newer versions but don’t backport those changes to point or patch releases that are still compatible with LTS. Sometimes it’s unavoidable. Most of the time it’s not.

2. EASY DATA IMPORT, EXPORT, and BACKUP

Once PostGIS is installed it should be 100% usable without learning an additional magic workflow. The existing workflow might seem normal to a Unix nerd or pro DBA, but it’s not intuitive for a new user.

2.1: IMPORT & EXPORT

I should be able to import shapefiles, the de facto geodata format, as easily as this:

shp2pgsql import.shp

Instead of:

shp2pgsql -dID -s 900913 import.shp <destination_table> | psql -U <username> -d <my_new_db_name>

How to get there is detailed below. Note that the advanced power of the import flags and even the piping of raw SQL is still there if you need it as a power user. But the basic import (and export) should be that simple.
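For the export half, PostGIS ships the companion pgsql2shp utility. A minimal sketch of a similarly simple round trip as things stand today (database, user, and table names here are hypothetical):

pgsql2shp -f export.shp -u <username> my_db my_table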

  • Request 2.1a: Include a default PostGIS spatial database as part of the basic install, called “default_postgis_db” or something similar.

This new database would be the default import location for shp2pgsql and other utilities if the user did not specify a named database. This would reduce the learning curve for new and novice users, as they wouldn’t even need to create a spatial database to get up and running.

If the user needs more than one spatial database for project management reasons, they can still create new spatial databases and import into those.

This would remove the requirement of becoming a postgres super user to create the first (and likely default) spatial database.

  • Request 2.1b: Include a default PostGIS Postgres user as part of the basic install, called “postgis_user” or something similar.
  • Request 2.1c: If I name a spatially enabled database in shp2pgsql that doesn’t yet exist, make one for me.

PostGIS should be making my life easier, not harder. If a database of that name doesn’t yet exist, ask if it should be made (y/n) and create it, then import. If a database is named but doesn’t have the spatial tables enabled, ask if they should be enabled (y/n) and do so.
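For contrast, here is a sketch of roughly what that first-run dance looks like by hand today, assuming Postgres 9.1+ with the PostGIS extension available (all names here are hypothetical):

# create the database, spatially enable it, then import
createdb my_new_db
psql -d my_new_db -c "CREATE EXTENSION postgis;"
shp2pgsql -dID -s 900913 import.shp my_table | psql -d my_new_db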

  • Request 2.1d: It’s too hard to manually set up a spatial database, with around a printed page of instructions that vary with each install. It mystifies Postgres pros as well as novices.

The support files for PostGIS’s functions and spatial reference systems have been stored in a variety of places on the file system, requiring us to remember what files to add, a search to find their location, then incantations to actually import those onto a new database to enable spatial power.

Fixed? I hear this is fixed as of Postgres 9.1 by running `CREATE EXTENSION postgis;` inside a database to spatially enable it. That’s super awesome!

  • Request 2.1e: Default destination table names in shp2pgsql.

The required destination_table should only be needed if I don’t want the shapefile’s filename used as the table name:

shp2pgsql -dID -s 900913 import.shp <destination_table> | psql -U <username> -d <my_new_db_name>

Could be:

shp2pgsql -dID -s 900913 import.shp -U <username> -d <my_db_name>

  • Request 2.1f: Automatically pipe the output to actually put the raw SQL results into PostGIS.

The | (pipe) in the shp2pgsql command workflow is confusing. Pipe it automatically for me. I know this is a Unixism, but it’s baffling to new users.

  • Request 2.1g: If my shapefile has a PRJ associated with it (as most do), auto-populate the -s <srid> option.

It’s 2012, people. Map projections are a fact of life that computers should be making easier for us, not harder. Manually setting projections and transforms should be a last resort for troubleshooting, not an everyday routine.

Right now I must manually look up what EPSG code is associated with each shapefile’s PRJ file and set that using the -s <srid> flag so that SRID is carried over to the spatial database. When this is not provided, it defaults to -1.
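As things stand, a rough way to hunt down that SRID by hand is to grep the spatial_ref_sys table for text that appears in the shapefile’s .prj (the search string below is hypothetical):

-- look up candidate SRIDs by matching projection text from the .prj
SELECT srid, auth_name, auth_srid
FROM spatial_ref_sys
WHERE srtext ILIKE '%WGS_1984%Mercator%';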
  • Related 2.1h: Projection on the fly. If you still can’t reproject data on the fly, something is wrong. If table X is in projection 1 (e.g. web Mercator) and table Y is in projection 2 (e.g. geographic), PostGIS ought to “just work”, without me resorting to a bunch of ST_Transform calls that spell out the SRIDs. The SRID bits in those functions should be optional, not required (see the sketch below).
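A sketch of the explicit dance required today, with hypothetical tables: roads stored in web Mercator (900913/3857) and parks stored in geographic WGS84 (4326):

-- today the transform must be spelled out by hand
SELECT r.name
FROM roads r
JOIN parks p
  ON ST_Intersects(ST_Transform(r.geom, 4326), p.geom);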
  • Request 2.1i: Reasonable defaults in shp2pgsql import flags.

Your mileage may vary, but everyone I know uses the following flags to import data: -dID (-d drops and recreates the table, -I builds a spatial index on the geometry column, -D uses Postgres dump format for a faster load).

Make these the default. Add warnings and confirmation prompts as appropriate.

  • Request 2.1j: Easier creation of point features from CSV or DBF.

This is a basic GIS operation. Right now I need to import the rows manually into a table and then use SQL to create the point geometries, as detailed here.
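A sketch of that manual workflow as it stands, run in psql and assuming a hypothetical places.csv with name, lon, and lat columns in WGS84:

-- load the raw rows, then build point geometries from the lon/lat columns
CREATE TABLE places (name text, lon double precision, lat double precision);
\copy places FROM 'places.csv' CSV HEADER
SELECT AddGeometryColumn('places', 'geom', 4326, 'POINT', 2);
UPDATE places SET geom = ST_SetSRID(ST_MakePoint(lon, lat), 4326);
CREATE INDEX places_geom_gist ON places USING GIST (geom);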

2.2: DATA BLESSING

If PostGIS’s claim to fame is as a spatial data store, and no more, it needs to get better at accepting all data, and releasing it to the wild again.

I often get invalid geometry reports from PostGIS on import of geodata that works perfectly fine in Mapnik, OGR, QGIS, ArcGIS, and other GIS applications. PostGIS is too obsessive.

  • See Section 3 below for more specific requests.

I’m still researching PostGIS 2.0 to understand whether this has all been fixed. It sounds like it’s been partially fixed, in that it’s now easier to “bless” the geometry into a structure PostGIS likes better, but the underlying problems seem to remain.

2.3: DATA BACKUP

  • Request 2.3a: Forward compatible pgdumps. Dumps from older PostGIS & Postgres combinations should always import into newer combinations of PostGIS and Postgres.

Data should not be trapped in PostGIS. We need an easy, transparent, forward compatible method of backing up data in one PostGIS database and restoring it into a new PostGIS database, either on the same machine, or a different machine, or the same machine with an upgraded version of PostGIS.

I should be able to back up data from PostGIS and have it restored into newer copies of PostGIS without a problem. (I constantly have this problem, especially between Postgres 8.3 and 8.4; maybe it’s fixed in Postgres 9.x and PostGIS 2.x?) I should be able to upgrade my DB and machines without it complaining.
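For reference, a sketch of the round trip that should just work, using standard Postgres tools (database names are hypothetical; -Fc is Postgres’s custom compressed dump format, and the target assumes Postgres 9.1+ with the PostGIS extension):

# dump the old database, spatially enable a fresh one, restore into it
pg_dump -Fc -f old_db.dump old_postgis_db
createdb new_postgis_db
psql -d new_postgis_db -c "CREATE EXTENSION postgis;"
pg_restore --no-owner -d new_postgis_db old_db.dump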

  • Request 2.3b: Offer an option to skip PostGIS simple feature topology checks when importing a pgdump.

PostGIS might approach this with a two-pronged system. If there’s a problem with the data, it could keep around the original version untouched alongside a cleaned-up interpretation, and be able to dump either on request. Or there could be a flag on a geometry row that specifies whether or not strict interpretation is applied. Defaulting to strict makes sense to us and maintains backwards compatibility with older versions, while the flag offers an escape hatch for data funk with topology and other PostGIS errors. This is especially troublesome for the Natural Earth data, which is slowly being edited to conform with PostGIS’s overly “right” view of the world.

3. “INVALID” GEOMETRIES, AND POINTING THE FINGER AT GEOS

Falling under the heading “Beauty is in the eye of the beholder”: real world data has self-intersections and other geometry burrs. Deal with it. I often get invalid geometry reports from PostGIS on import of geodata that works perfectly fine in Mapnik, QGIS, ArcGIS, and other GIS applications. PostGIS is too obsessive.

  • Request 3a: Topology should only be enforced as an optional add-on, even for simple Polygon geoms. OGC’s view of polygon topology for simple polygons is wrong (or at the very least too strict).

I understand that PostGIS 2.0 now ships with a clean geometry option. Woot woot. I think there are underlying issues, though.
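As I understand it, that cleanup path looks roughly like this in PostGIS 2.0 (the table name here is hypothetical): report why geometries are flagged “invalid”, then repair them in place.

-- report the reason each geometry fails validity checks
SELECT gid, ST_IsValidReason(geom)
FROM ne_admin0
WHERE NOT ST_IsValid(geom);

-- repair the offending geometries in place
UPDATE ne_admin0
SET geom = ST_MakeValid(geom)
WHERE NOT ST_IsValid(geom);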

  • Request 3b: Teach PostGIS the same winding rule that lets graphics software fill complex polygons despite self-intersections. Use that for simple point-in-polygon tests, etc. Only force me to clean the geometry for complicated map algebra.

ArcGIS will still let you join points against a polygon that has self-intersections or other topology problems. Why can’t PostGIS?
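The workaround today is to repair the polygon on the fly, for example with the zero-width buffer trick, so the spatial join succeeds anyway (the tables here are hypothetical):

-- zero-width buffer cleans up self-intersections just long enough for the join
SELECT pt.id, poly.name
FROM survey_points pt
JOIN messy_polygons poly
  ON ST_Contains(ST_Buffer(poly.geom, 0), pt.geom);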

  • Request 3c: Teach OGC a new trick about “less” simple features.

The irony of the recursive loop:

  1. PostGIS points finger at GEOS topology.
  2. GEOS (JTS) topology is based on OGC Simple Feature spec.
  3. OGC Simple Feature spec is based on an overly simplistic view of the world. It might be convenient in the expedient programming sense, but it’s not practical with real world data.
  4. Everyone at OpenGeo is happy as it’s self-consistent.
  5. It’s hard for other people with real world data to actually use the software, sad face.
  • Request 3d: Beyond the simple polygon gripe, I’d love it if GEOS / PostGIS could become a little more sophisticated. Adobe Illustrator has for several versions now allowed users to build shapes with its Shape Builder tool even where there are loops, gaps, overshoots, and other geometry burrs. It just works. Wouldn’t that be amazing? And it would be even better than ArcGIS.

When open data is not open: World Bank double speak on Google Map Maker?

Sunday, February 19th, 2012

[Editor’s note: Good news! The World Bank has listened and responded with a new blog post clarifying their stance on open data and backed away from Google Map Maker: “If the public helps to collect/create map data, the public should be able to access, use & reuse that data.”]

By Nathaniel Vaughn KELSO

In their recent op-ed in the New York Times and on the official World Bank blog, Caroline Anstey* and Soren Gigler** made a compelling case for open data and open government that is fatally flawed. Unless fixed, it has dangerous implications for the future of open data globally. The World Bank’s new policy around map data is needlessly exclusionary: public data should remain publicly accessible.

Over the past two years, the World Bank has made great strides in making its processes more transparent and rethinking the international organization as a development platform and innovation bank.

However, on January 13th, Ms. Anstey and Mr. Gigler muddied that effort by partnering with Google on map data as an end run around many of those “open” goals. Let me be very clear:

The World Bank’s new agreement with Google is a neocolonial wolf handing out shiny blue Map Maker t-shirts.

Google Map Maker expressly prohibits citizen cartographers from using/sharing the very data they add to the map in ways that can help their own development efforts. Users are locked into the Google platform: they cannot export their data or create derivative work, especially commercial projects. Nor can they share that geo data with other online mapping efforts, especially critical during disaster relief.

The World Bank has successfully partnered on map data before, most notably in Haiti with OpenStreetMap (OSM) in response to the major earthquake and the following humanitarian crises.

The Haiti experience shows that crowd-sourcing map data works. Most developing countries do not have basic local map data. Timely, accurate geo data showing roads, schools, fresh water sources, health facilities, and more helps save lives during an emergency, and in the meantime we all enjoy up-to-the-minute maps, regardless of whether you are in Washington, D.C. or Nairobi, Kenya.

I agree the Bank’s core mission is advanced by improving access to geo data for humanitarian response and development planning. This serves to make development more effective and inclusive by expanding access to basic geo information.

It is appropriate to engage citizens in the Bank’s client countries by inviting them to participate (via mapping parties and online portals) and to strengthen the capacity of civil society to “put themselves on the map”. That is an inalienable right and a noble effort by the Bank to facilitate.

But this new agreement falls down on close inspection of the Google Map Maker terms of service. The corporate legalese is contrary to another Bank core principle:

“the right to use that same open data to empower citizens in effective development.”

The Google partnership proposes a new digital serfdom. The Bank should instead embrace the OpenStreetMap model: a system of micro-data grants that empowers a self-sustaining wave of economic development as more data gets added to the map. The citizen map maker should have an ownership share.

I urge Ms. Anstey and Mr. Gigler to emphasize to the World Bank’s local offices and partner organizations (including the United Nations) that this new agreement with Google is *non-exclusive*, meaning the Bank can and must open data by sharing local geo data with other organizations, like OpenStreetMap.

When the Bank partners to allow citizens to draw their own map, the resulting map data must be free and open. Indeed, open mapping tools and civil society organizations like OpenStreetMap (who innovated first with mapping parties and their online map editor) should be leveraged and grown as much as possible.

Instead, the current agreement allows Google to use local citizens to collect information for free and make exclusive profit. The agreement, specifically the general Map Maker terms or special ODbL terms for the World Bank project, should be rewritten as Patrick Meier*** suggests, to “allow citizens themselves to use the data in whatever platform they so choose to improve citizen feedback in project planning, implementation and monitoring & evaluation.”

Terms of use like the Open Database License (ODbL) promote circulation of geo data for the most good, and Google has been receptive in the past to opening up parts of their Map Maker data for humanitarian relief. Let’s complete the circle so this type of license is a core part of a revised World Bank “open” data agreement, and have it in place before the next disaster.

The World Bank must reiterate its commitment to truly open data with due speed.

—–

The author is chief cartographer for Natural Earth, a public domain map database. His maps have been published in The Washington Post and National Geographic, and he is a design technologist at Stamen Design.

—–

* Caroline Anstey is a managing director of the World Bank.
** Soren Gigler is Senior Governance Specialist, Innovation at the World Bank.
*** Patrick Meier is director of crisis mapping at Ushahidi.

Related reading

PR: World Bank and Google Announce Map Maker Collaboration

Open Aid Partnership:

Patrick Meier, Ushahidi’s director of crisis mapping

RWW coverage

Directions Magazine coverage

Google’s Official LatLong Blog

Global Integrity

Does a Google-World Bank Deal On Crowdsourcing Ask Too Much of the Crowd? (TechPresident)

“What Gets Measured Can Be Changed”: World Bank Turns Its Data Catalog Public (TechPresident)

The World Bank Responds to the Google Map Maker Deal (Global Integrity)


7 Billion People in Kinetic Typography

Tuesday, January 4th, 2011

Very cool motion graphic promo for National Geographic’s new year-long series, via Kat and seen at BrainPickings.

Live election results map from The Washington Post

Tuesday, November 2nd, 2010

Looking for live results and post-election wrap up? Look no further than The Washington Post »


Maybe places are more about time than location: Retrofitting Geo for the 4th Dimension (Fekaylius)

Tuesday, July 27th, 2010


[Editor’s note: Thanks Sylvain!]

Republished from Fekaylius’s place.

We are in a period of mass-market place ambiguity.

Places drift, jump, and fade, physically. Some places have a much higher propensity toward noticeable drift than others, but location, in general, is not stable. The geo-web of the past few years has mostly ignored this as a low-impact edge case.

The era of the Google Maps API dramatically boosted developer productivity and interest within the geo space because it simplified and lowered the barriers to entry, while simultaneously reinforcing a few paradigms that find easy adoption within rapidly moving startups and businesses, ideas like “the perfect is the enemy of the good” and “solve for the 80% use case”. Startups are constantly faced with a to-do list that can never be 100% complete, but these catchy ideas formalize and automate the painful process of deeming some desires unworthy of your attention.

Since 80% of the places that most people are searching for, or reviewing, or visiting feel relatively immune to change (at least in the “several years” lifespan much of today’s software is being designed for), we have very quickly built up a stiff and rigid framework around these places to facilitate the steep adoption of these now ubiquitous geo-services. The rigidity is manifest in the ways that place drift isn’t handled: places are assumed to be permanent.

Continue reading at Fekaylius’ place . . .

Calculating bounding box in ArcMap

Monday, July 12th, 2010


[Editor’s note: I keep returning to this technique for calculating a feature’s extent (minimum bounding rectangle) in ArcGIS using the Field Calculator. Thanks William and Jeff!]

Republished in part from ESRI Forums.
Sample Field Calculator code for computing XMIN appears below.

Information about shape properties appears in the “Geometry Object Model” diagram (PDF). All four extent parameters are XMin, XMax, YMin, and YMax.

' Pre-Logic VBA Script Code for the ArcMap Field Calculator (advanced mode)
dim Output as double
dim pGeom as IGeometry

' [shape] is the feature's geometry; its envelope is the minimum bounding rectangle
set pGeom = [shape]
Output = pGeom.Envelope.XMin

' Type Output in the expression box; swap XMin for XMax, YMin, or YMax as needed

Deep simplicity: A personal graphics Manifesto (Alberto Cairo)

Wednesday, July 7th, 2010


[Editor’s note: Alberto Cairo picks up where he left off in March, further defining what he calls “Deep simplicity” and why news infographics should use it to counter the trend of complex visualizations that are more data explorations (dumps) than distilled presentations. “Unless you belong to the small community of specialists they are aimed at,” you won’t get those complex visualizations. Instead focus on sharing the “why” and “how” with less of the raw “what”.]

Republished from Visualopolis.

Last week I was working on a science infographic for Época with the help of my colleague Gerson Mora (3D guru) when I went back to the idea I’ve been thinking about for the past few months, and that you can see outlined in the previous article: is it possible to create graphics that are simple and deep at the same time? If it is, they probably are the ones that news magazine readers appreciate the most.

This is the graphic we worked on for a couple of days. Simple, isn’t it? Just four white 3D Poser-like heads that display different levels of anger. The story this graphic was published with deals with the outbursts of rage that many soccer players are showing during the South Africa World Cup. We wanted to explain what happens in your brain when that negative emotion overrules your conscious mechanisms, making you lose control. And why it happens. [...]

This piece illustrates a fancy concept I’ve been thinking about for future articles and books: deep simplicity. There’s a book of the same name by John Gribbin, but it has nothing to do with graphics (it’s about chaos theory and complexity). There’s also a little masterpiece by John Maeda that promotes something similar to what I propose, but applied to design in general, and at a more abstract level. I confess that some of Maeda’s ideas permeate my own reflections heavily. [...]

What does deep simplicity mean, anyway?

Continue reading at Visualopolis . . .

Knight News Challenge 2010: TileMapping wants to bring the mashup mentality to local maps

Tuesday, June 29th, 2010

[Editor's note: Hyperlocal maps mashing up with local news are about to get much more interesting. If you create city street maps, you should read this article and start researching web Mercator and how to cut your custom cartography into image tiles. Might make a good topic at this year's NACIS meeting in St. Pete.]

Republished from the Nieman Journalism Lab.
By Megan Garber
June 24, 10 a.m.

Two primary concerns when it comes to news innovation have to do with information itself: harnessing it and investing communities in it. One of this year’s Knight News Challenge winners wants to tackle both of those concerns — at the same time, through the same platform.

Tilemapping aims to empower residents of local communities to explore those communities through mapping. “A lot of great stories can be told using maps and some of the new data that’s become available,” says Eric Gunderson, the project’s coordinator. And the Tilemapping project wants to leverage the narrative power of new technologies to help media — community media, in particular — create hyper-local, data-filled maps that can be easily embedded and shared. The tool is aimed at both journalists and community members more broadly; the idea is to help anyone with investment in a given location “tell more textured stories” about that location — and to help visualize (and discover) connections that might not otherwise be clear.

Tilemapping does what its name suggests: It provides “a tool that basically glues together a bunch of tiles,” Gunderson says, to create a layered map. (Map tiles are the small, square images that comprise maps — think of the squares you see when zooming in on a Google Map.) The project works through TileMill, a MapBox tool that, in turn, “glues together a bunch of other open-source tools to make it easier to generate map tiles.” Users customize both their data and the particular style of their map — and TileMill generates a custom, composite rendering, hosted on Amazon’s Elastic Compute Cloud (EC2). Essentially, the platform is a modular system that allows users to customize the data they want to represent — and to layer them upon other representations to create targeted, contextual maps.

Continue reading at Nieman Journalism Lab . . .