Azavea Atlas

Maps, geography and the web

Geoprocessing and the Esri GeoServices REST API

In my previous article, I wrote about the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard and how it can be used to enable different geographic data processing capabilities to work together.  In this article, I’m going to discuss a second example that has been under development by Esri for a few years, but was just released as a published specification.  At this summer’s International User Conference, Jack Dangermond announced that Esri would be publishing a REST API as a new standard.  A couple of weeks ago, Esri made good on that promise and released the GeoServices REST Specification.

What the heck is the GeoServices REST Specification?

While I’ll admit that I have not read the entire 220-page specification document, I’ll try to summarize the salient points.  First, I should note that while I’m pairing this blog post with a related one on WPS, I do not see the GeoServices REST spec as an alternative to WPS.  It’s actually much more broad.  And, unlike WPS, one could probably make the case that it’s already in fairly wide use by a large community.  The spec hews closely to the ArcGIS Server REST API that is already supported by Esri’s entire client product line, including the Flex, JavaScript, Silverlight, iOS and Android APIs as well as the ArcGIS Desktop, Engine and Server products.  Anyone who elects to implement this new GeoServices REST spec will basically have a huge built-in client base that can take advantage of their services.

Rather than an alternative to WPS alone, one might actually see this as an alternative to the WMS, WCS, WFS, WPS and Catalog standards taken together, while also providing services for which there are no existing OGC standards, such as geocoding.  The REST-based specification supports JSON, HTML and KMZ responses, with JSON being the default format.  The full list of service categories includes:

  • Catalog Service – a list of available services.
  • Map Service – make maps as well as query, ID and other map functions; much like WMS, though with more functionality.
  • Geocode Service – turn addresses, intersections and place names into map coordinates; also includes reverse geocoding.
  • Geoprocessing Service – you can probably guess that this is my favorite service; both synchronous and asynchronous execution of tasks; this is the service that most closely resembles WPS.
  • Geometry Service – utility functions for commonly used vector geometry operations such as reprojection, simplify and densify, buffer, area/length calculation, label points for polygons, distance calculation, generalize, trim/extend, convex hull, cut, difference, intersect, union and reshape; these could also be implemented as WPS services (or through the Geoprocessing Service) but these are provided as a lighter weight, easy-to-use set of utilities; there’s a lot of overlap here with JTS and NTS and one could imagine a rapid implementation of this service using these toolkits plus a projection engine.
  • Image Service – provide access to existing imagery, in particular raster catalogs and mosaicked images; this service also includes local and neighborhood transformations of the imagery, such as recolor, hillshade, slope, aspect, NDVI, statistics, stretch and identify functions.
  • Feature Service – provides functions for querying and editing vector features stored in a geodatabase; the closest OGC equivalent is WFS.
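To give a flavor of the request style, here is a rough sketch of a Map Service layer query in the spec’s REST/JSON idiom.  The server URL, layer index, field names and sample response are invented for illustration; the parameter names (`where`, `outFields`, `f=json`) follow the published ArcGIS Server REST conventions:

```python
import json
from urllib.parse import urlencode

# Hypothetical service endpoint; real servers follow the same URL pattern:
#   http://<host>/<instance>/rest/services/<folder>/<service>/MapServer
BASE = "http://example.com/arcgis/rest/services/Demo/MapServer"

# A layer query: SQL-style `where` filter, requested fields, JSON output.
params = {
    "where": "POP2000 > 100000",
    "outFields": "CITY_NAME,POP2000",
    "returnGeometry": "true",
    "f": "json",          # JSON is the default response format in the spec
}
query_url = f"{BASE}/0/query?{urlencode(params)}"
print(query_url)

# A trimmed-down response has this general shape; consuming it is plain
# JSON work, which is a large part of the spec's appeal for web clients.
sample_response = """
{
  "fields": [{"name": "CITY_NAME", "type": "esriFieldTypeString"}],
  "features": [
    {"attributes": {"CITY_NAME": "Philadelphia", "POP2000": 1517550},
     "geometry": {"x": -75.16, "y": 39.95}}
  ]
}
"""
data = json.loads(sample_response)
for feat in data["features"]:
    attrs = feat["attributes"]
    print(attrs["CITY_NAME"], attrs["POP2000"])
```

Nothing here requires an Esri library on the client side, which is exactly why implementing the server side of the spec buys instant compatibility with so many existing clients.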

What will this mean?

On its own, the GeoServices REST spec does not mean much.  It will need a community of developers that are willing to implement the specification.  That will mean building back-end server processes that will respond to requests made according to the specification.  The open question is whether developers will embrace the standard and whether it will catch on in the marketplace.  That’s obviously impossible to answer right now, but some of the potential can be seen in Brian Flood’s work on the Arc2Cloud product.  Brian and his brother have got to be feeling pretty smug at this point.  By implementing many parts of the ArcGIS Server REST API, his Arc2Cloud product already supports the majority of the GeoServices REST specification with the server processes running in the Google App Engine cloud computing infrastructure.  This is a very compelling concept – build geoprocessing services that operate against cloud infrastructure but enable many, many people to use them by doing so on top of an established standard.

For Esri, this is a risky move.  Similar to the risks ERDAS faces by embracing WPS, Esri is creating a specification that, if broadly adopted, will make it easier for some people to not use their flagship ArcGIS Server product.  On the other hand, by demonstrating leadership in the geoprocessing market, they will both encourage the growth of that market and their broad product line puts them in a good position to capitalize on the larger marketplace.  I see this as a smart move by a company that feels sufficiently self-confident in its spatial analysis, geoprocessing and data management capabilities that it can invite both partners and competitors to the table.

There has also been some early criticism of the GeoServices specification.  Some pundits have remarked that this is not really an open standard since it hasn’t been submitted to an independent standards organization and is not open for public comment and changes.  Browsing the specification, one of my colleagues also remarked on the extensive use of the “esri” prefix in things like enumerations.  That’s something we would generally not see in an open standard, and it suggests that this isn’t really intended as something to be used outside the Esri ecosystem.

On the other hand, the new specification is being made available under the Open Web Foundation agreement, which should make the spec free of copyright and patent claims as well as enable others to revise, share and implement as they see fit.  Further, there are many paths for specifications and standards as they evolve.  As the OGC has amply shown, submission to a standards body does not guarantee usefulness.  While the OGC has several standards that are in broad use (Simple Features, WPS, WMS, WFS, KML and WCS), it also has a bunch of “standards” that have been submitted for narrow, commercial purposes and have failed to gain broad market support.  As the longevity of the shapefile has demonstrated, open publication of protocols can have a significant positive impact on interoperability, even if a protocol is not managed by a standards body.  Further, as Google showed with KML, commercial shepherding of a protocol for a few years can be a precursor to later submission to a standards organization.

Geoprocessing and WPS

As you may have gathered from our recent newsletter article and other announcements about the CommonSpace project we’ve been developing in Philadelphia, we are in the middle of redesigning the geoprocessing engine that underpins our DecisionTree product.  The new engine, code-named “Trellis”, is leveraging our experience implementing high performance raster processing operations.  We are taking the lessons we learned with DecisionTree – distributed and parallel processing, binary messaging, caching, pyramiding, etc. – to create a more generic processing framework that will support a broader array of geoprocessing operations than was the case for the original single-purpose design created for the DecisionTree application.

We’ll unveil more details of the Trellis work as it evolves over the next few months, but as part of our design research, we’ve been looking at a number of the existing technologies related to server-based geoprocessing.  This first article will focus on the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard.

WPS is one of an alphabet soup of geographic data and mapping standards overseen by a non-profit standards organization called the Open Geospatial Consortium (OGC).  It’s a particularly interesting one for Azavea because it is concerned with making geographic data processing available across a network – essentially enabling us to move geo-computation and spatial analysis from a desktop GIS to the server and enabling this type of analysis to be provided as a service over the web or even in a mobile application.  We think that this is a really important capability for two reasons:  a) it will allow sophisticated analysis that has previously required a GIS specialist and complex software to be made available in simple applications on the Internet; b) we think this will result in faster, more responsive applications that can serve more people at lower cost.

OGC standards like WPS have developed over the course of many years and have arisen in order to support interoperability across diverse platforms.  The OGC standards that Azavea has found most useful for its web and mobile applications are of two basic types: services that return some kind of geographic data; and formats for organizing and transporting that data across a network.

  • WMS and WMTS (web map service and web map tile service) – service that provides map images for display in a web browser
  • WFS (web feature service) – service to request and filter vector feature data in a geographic database
  • WCS (web coverage service) – service that provides raster data (aerial and satellite imagery, for example)
  • GML (geography markup language) – this is an XML protocol for encoding geographic data
  • KML (keyhole markup language) – developed by Keyhole and later purchased by Google as part of the software that would become Google Earth, KML was submitted to the OGC after it had undergone a fair amount of development; it does not fit neatly alongside the other standards, but it’s broadly used for combining geographic data with styling
  • SLD (styled layer descriptor) – a way to describe how to apply color and other styling on a map

What the heck is WPS?

So if WMS is for getting map images, and WFS and WCS are for requesting vector and raster data, and data can be transferred using KML and GML and styled using SLD, that’s a lot of what is done in the web mapping world.  What do we need WPS for?  WPS provides a way to request transformations of existing geographic data.  While much of contemporary web mapping remains a matter of simply displaying data on a base map and asking some basic questions about that data, the utility of geographic analysis goes beyond display of information on a map.  For example:

  • in a flood scenario, we might want to know which properties are located within 100 meters of a flood plain boundary
  • to find the perfect site for a school, we might want to consider several geographic map layers, apply weights to them and generate a heat map
  • for crime analysis, we might want to create a density map based on crime locations

In each of these cases, we need to transform one or more geographic data sets.  To answer the first question, for example, we would need to buffer the flood plain polygons by 100 meters to create a new layer and then select records that fall within the new polygon.  For the second scenario, we need to read each of the relevant map layers, convert them to a common format and scale, apply weights and then create and apply colors to a map of the results.  WPS is a standard that supports requests for these types of geographic data transformations (processing) in a common way.
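The flood scenario can be sketched in a few lines of code.  The coordinates below and the flat-plane distance math are illustrative stand-ins for what a real buffer-and-select chain would do with projected data:

```python
import math

def point_segment_distance(px, py, ax, ay, bx, by):
    """Distance from point (px, py) to the segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project the point onto the segment, clamped to the endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def within_buffer(point, polygon, distance):
    """True if `point` lies within `distance` of the polygon's boundary."""
    n = len(polygon)
    return any(
        point_segment_distance(*point, *polygon[i], *polygon[(i + 1) % n]) <= distance
        for i in range(n)
    )

# Invented flood-plain boundary and property points, in projected meters.
flood_plain = [(0, 0), (500, 0), (500, 300), (0, 300)]
properties = {"parcel_a": (520, 150), "parcel_b": (900, 150)}

at_risk = [pid for pid, pt in properties.items()
           if within_buffer(pt, flood_plain, 100)]
print(at_risk)  # parcel_a is 20 m from the boundary; parcel_b is 400 m away
```

The point of WPS is that a client never writes this logic itself; it simply asks a server that advertises a buffer-style process to do the work and return the selected features.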

Like many of the OGC services standards, WPS is conceptually simple.  It supports only three functions:

  1. GetCapabilities – returns information about the available processing features
  2. DescribeProcess – returns metadata specific to each available processing function
  3. Execute – runs a process based on a series of inputs

It’s important to note that WPS does not actually do anything.  Like other OGC services, it is simply a lingua franca for asking for work to be done.  If you host a WPS service, you still have to have software that executes the processing tasks.  But WPS defines a common protocol for making requests for almost any kind of geoprocessing task.
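To make the three operations concrete, here is a minimal sketch of their KVP (key-value pair) GET form against a hypothetical endpoint, plus parsing of a trimmed-down capabilities response.  The endpoint and process identifiers are invented, and real responses carry the ows:/wps: XML namespaces that are omitted here for brevity:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

ENDPOINT = "http://example.com/wps"  # hypothetical WPS server

def wps_url(request, **extra):
    """Build a KVP-style GET URL for one of the three WPS operations."""
    params = {"service": "WPS", "version": "1.0.0", "request": request}
    params.update(extra)
    return f"{ENDPOINT}?{urlencode(params)}"

print(wps_url("GetCapabilities"))
print(wps_url("DescribeProcess", identifier="buffer"))
print(wps_url("Execute", identifier="buffer",
              datainputs="distance=100;layer=floodplain"))

# A trimmed GetCapabilities response; real ones are namespaced.
sample = """
<Capabilities>
  <ProcessOfferings>
    <Process><Identifier>buffer</Identifier></Process>
    <Process><Identifier>intersect</Identifier></Process>
  </ProcessOfferings>
</Capabilities>
"""
root = ET.fromstring(sample)
offered = [p.findtext("Identifier") for p in root.iter("Process")]
print(offered)
```

A client would typically call GetCapabilities once, call DescribeProcess for the process it cares about to learn the input and output types, and then issue Execute requests; the standard also supports XML POST bodies for Execute when inputs are too complex for a query string.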

Who’s Using WPS?

Unlike WMS, WFS and WCS and a few other standards, WPS is a relatively new standard that was only finalized in late 2007.  There are not a lot of examples yet, but the reference implementation is a WPS server and sample clients being developed by 52° North, a non-profit based in Germany.  The software project is being led by Bastian Schaeffer and Theodor Foerster and is a Java-based implementation that is available under an open source license.  The project has an ambitious roadmap and a rapidly growing community.  There is also ongoing work to create a connector to the geoprocessing capabilities in Esri’s ArcGIS Server as well as distributed (or “grid”) computing.

A second open source implementation is PyWPS, created for people using Python.  Its primary purpose is to make GRASS-based processing available to web clients.

In 2009, ERDAS released its own WPS-based web geoprocessing service.  ERDAS refers to its WPS implementation as an internet Spatial Modeling Service (iSMS).  The WPS interface supports access to the IMAGINE Spatial Modeling Engine by making calls over the internet.  The capabilities are integrated into the ERDAS APOLLO server product line under the APOLLO Professional package.  And, of course, ERDAS has enabled its client technology, the IMAGINE and TITAN products, to consume WPS-compliant services.

What’s the point?

So why is WPS important?  As I mentioned above, like most information technology standards, the purpose of WPS is interoperability.  If a standard becomes broadly adopted in many software packages, it becomes easier to mix-and-match components for a particular purpose.  By enabling their APOLLO server to speak WPS, ERDAS is enabling any software that can make WPS requests to use the server, even if they are not ERDAS products.  So I can use the uDig WPS plugin and make requests to spatial models defined and run in an ERDAS APOLLO server and display the results in uDig.  For a commercial company like ERDAS, this is a double-edged sword.  By supporting WPS, they are also saying that you don’t need to buy an ERDAS client software package in order to use an APOLLO server.  But, by the same token, it also means that now many more people will be able to make requests to APOLLO servers, and this will grow the ERDAS ecosystem and may result in higher sales of APOLLO and IMAGINE licenses.

In a second article, I’ll focus on an even newer standard just published by Esri, the GeoServices REST API specification.

Dana Tomlin to be Inducted into GIS Hall of Fame

Tomlin photos

Azavea does a lot of work with geoprocessing and spatial analysis.  When we work with raster data in particular, we are often leveraging what has become known as “Map Algebra,” a mechanism for combining and transforming map layers in a logical and organized manner.  The concepts of Map Algebra were developed in the doctoral thesis of C. Dana Tomlin in the 1980s and later published in book form in Geographic Information Systems and Cartographic Modeling.  Perhaps more importantly, though, Dr. Tomlin developed and then openly shared the software code that demonstrated the concepts behind his ideas.  Before the term “open source” entered the common parlance, he gave away both code and documentation to anyone with serious interest.  As a consequence, Map Algebra concepts as well as Tomlin’s source code and algorithms underpin many of the commercial and open source implementations of GIS software used today for raster analysis.
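The core idea is easy to sketch in code.  Below is a toy “local” operation, one of Map Algebra’s basic operation classes, computing a weighted overlay of two raster layers represented as nested lists; the scores and weights are invented for the example:

```python
def local_op(func, *layers):
    """Apply `func` cell by cell across same-shaped raster layers,
    the 'local' class of Map Algebra operations."""
    return [[func(*cells) for cells in zip(*rows)]
            for rows in zip(*layers)]

# Two invented 2x3 suitability layers, scored 0-10.
slope_score = [[2, 5, 9],
               [4, 7, 1]]
access_score = [[8, 6, 3],
                [5, 5, 9]]

# Weighted overlay: 70% slope, 30% access, a common local operation.
suitability = local_op(lambda s, a: 0.7 * s + 0.3 * a,
                       slope_score, access_score)
print(suitability)
```

Tomlin’s full taxonomy also covers focal operations (each cell computed from its neighborhood, e.g. slope or density) and zonal operations (statistics within regions), which is what makes the framework expressive enough to underpin tools like the heat-map and density analyses mentioned above.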

As a landscape architecture and regional planning student at the University of Pennsylvania School of Design, I got my introduction to GIS through a class with Dana.  Though it was only a brief plenary session, I was instantly hooked and went on to take every GIS class offered over the subsequent 18 months.  I finished my MLA, but never practiced in the design professions and instead pursued a career in GIS and technology.  I’m sure I’m not the only one influenced in this way – he’s a masterful teacher, igniting both curiosity and imagination in his students.  Over the years, Dana has been an important mentor and friend for me.  More recently, we’ve been working with him on both Azavea’s GPU research (NSF grant IIP-0945742) and a watershed modeling project we are developing with the Stroud Water Research Center and the Cartographic Modeling Lab, a research lab Dana founded at Penn.  It’s been a thrill to have an opportunity to work together.

The GIS community has gained a great deal from Dr. Tomlin’s contributions, so it was with great pleasure that I recently learned URISA will induct him into the GIS Hall of Fame at this year’s URISA conference in Orlando.  It’s a well-deserved honor, and he will join some illustrious past inductees, including Jack Dangermond, Ian McHarg, Michael Goodchild, Roger Tomlinson, Nancy Tosta and the Harvard Lab.  If you are attending, the awards breakfast will be on Thursday morning, Sept 30.  Perhaps I’ll see you there.

Four New GNSS Satellites and an Augmented Reality iPhone App Launched

Russia’s GLONASS constellation moved a step closer to full global coverage with the launch of three new satellites on September 2, 2010.  At present, twenty-one GLONASS satellites are operational, and two others are considered spares.  Three additional satellites are scheduled for launch in November, and the first in a series of GLONASS-K satellites is scheduled to launch in December.  The new GLONASS-K series will feature a longer lifespan of up to ten years and additional signal capacity.  With a full constellation expected to be complete by the end of the year, Russia is currently promoting its GNSS technology to both foreign and domestic manufacturers of navigational receivers and related products.

On September 11, 2010, Japan launched the first in a series of three satellites that will provide enhanced navigation signals for Japan and portions of the surrounding Asia-Pacific region.  The Quasi-Zenith Satellite System (QZSS) is named for the asymmetrical figure-8 orbit that will keep at least one satellite almost directly overhead – at the zenith – at all times.  For high-accuracy positioning, the ideal satellite geometry is to have one satellite at the zenith and three others broadly scattered around it.  The new satellite, also known as “Michibiki,” will send signals that are interchangeable with those of the United States’ GPS constellation, thus allowing the QZSS to augment the eight to eleven GPS satellites that are normally available over Japan at any given time.  When fully operational in 2013, the three QZSS satellites will reduce ranging errors and increase positioning accuracy even in areas of Japan where urban canyons or mountainous terrain have previously been an issue.

Even before Michibiki was launched, it had its own iPhone/iTouch application. QZ-Finder allows users to keep track of QZSS and GPS satellite positions overhead with a compass-like skyplot view as well as a world map view that shows how the satellites are distributed around the globe and even tracks the QZSS orbit trajectory.  The new app also features an augmented reality view of the satellites that can be accessed through the user’s iPhone camera and even incorporated into a photograph for an image that is truly “out of this world.”

Nonprofit Tech Conference 2011 Sessions: Gap Analysis & Vendor Tips

Washington, DC here we come — well, in a few months, but we’re excited to start thinking about the 2011 Nonprofit Technology Conference.  The conference is the hub of activity for nonprofit technology and we’d encourage all technology vendors that deal with nonprofits to attend.  We’ve met some great people and always come home energized to continue to serve the nonprofit sector.  For this year’s conference we’ve proposed two sessions.  We’d appreciate your votes (and feedback):
  • Geographic Gap Analysis: Leveraging Census, Open, and Proprietary Datasets for Fundraising
  • Behind the Vendor Curtain: Technology Tips and Lessons Learned from Socially Minded Companies