FOSS4G 2014 runs from Sept 8th-12th in Portland, Oregon this year. The conference brings together developers and others working in geospatial open source. This is an opportunity to participate in the rapidly growing open source geospatial community. Last year's FOSS4G 2013 in Nottingham attracted 28 workshops, 180 presentations and 833 delegates.
At the India Geospatial Forum in Hyderabad, I moderated a session on electric power. It turned out to be an absolutely fascinating conversation with a wide range of speakers representing different aspects of the Indian power industry.
Alekhya Datta, a Research Associate at the Centre for Distributed Generation, part of The Energy and Resources Institute (TERI) in New Delhi, gave a presentation about a web-based open source geospatial application that has been developed to estimate the solar power generation potential of a city.
TERI promotes renewable energy and is working with the Ministry of Power to develop a roadmap for renewable energy for India. The objective is to install 20 GW of solar PV by 2020. As part of this effort 51 cities have been designated as "Solar Cities" and TERI is working with state governments to develop a program to encourage the installation of solar PV in these cities.
Chandigarh has been chosen as the first solar city. Chandigarh was the first planned city in post-independence India and is located in the north of the country. Its population is about a million.
One of the projects supporting the Solar City initiative is the development of a high-performance, flexible Web-GIS tool to estimate the rooftop solar power potential of a city. It is intended to be low cost, accessible to the public, and to use widely available data sources.
Satellite imagery (Pleiades-1A) is used for digitizing building rooftops. This is augmented by a digital surface model and meteorological information so that the tool can account for shadowing and cloud cover and include the heights of buildings, trees and utility poles. Rooftop solar analysis is simpler in Chandigarh than in North American and European cities because roofs there are flat.
The total number of rooftops that were digitized is about 110,500. In addition about 14,000 buildings were surveyed, and of these 50-60 were used for ground-based solar radiation measurement. For a subsample of 5-10 of these, solar PV panels were actually installed on the roofs and the generated power measured.
Estimating the solar potential of cities
The web-based tool was developed using open source geospatial libraries including OpenLayers, GeoServer, PostGIS and TileServer and relies on OGC open standards. It is intended to be extended so that it can be used in the other 50 solar cities, including Delhi. It enables users to estimate the rooftop solar power potential of a selected area for different types of PV technologies such as crystalline and thin-film. It also provides tools for estimating the potential greenhouse gas mitigation from rooftop solar PV systems in a specific area. The tool is available here (or here) and Alekhya Datta encourages people to try it.
It is envisioned as a decision support system for performing pre-feasibility assessments of the viability of developing a rooftop PV program for a particular region or area of a city, in the context of the business models and financial schemes available in that locality.
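To give a feel for the kind of calculation such a tool performs, here is a minimal sketch of a rooftop estimate. The irradiance figure, performance ratio, usable-area fraction and the module efficiencies for crystalline and thin-film PV are all my own illustrative assumptions, not values from the TERI tool.

```python
# Illustrative rooftop solar estimate (all parameters are assumptions,
# not values from the TERI Web-GIS tool)

ANNUAL_IRRADIANCE_KWH_PER_M2 = 1900   # assumed rough figure for northern India
PERFORMANCE_RATIO = 0.75              # assumed losses: temperature, wiring, inverter
EFFICIENCY = {"crystalline": 0.15, "thin_film": 0.10}  # assumed module efficiencies

def annual_generation_kwh(roof_area_m2, technology, usable_fraction=0.7):
    """Estimate annual PV output (kWh) for a flat rooftop."""
    usable_area = roof_area_m2 * usable_fraction
    return (usable_area * ANNUAL_IRRADIANCE_KWH_PER_M2
            * EFFICIENCY[technology] * PERFORMANCE_RATIO)

# A 100 m2 flat rooftop with crystalline panels: roughly 15,000 kWh/year
print(annual_generation_kwh(100, "crystalline"))
```

The real tool refines this with per-rooftop shadowing from the digital surface model and local meteorological data, but the core area-times-irradiance-times-efficiency logic is the same.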
The Climate Policy Initiative (CPI) has conducted a study of the impact of near-real-time monitoring using satellite imagery of illegal deforestation in Brazil's Amazon. In Brazil, 80% of the Amazon, which originally covered over four million sq km, remains under its original vegetation. Amazon deforestation accelerated in the early 2000s, reaching a peak of over 27,000 sq km in 2004. By 2011 the deforestation rate had fallen dramatically to 5,000 sq km. Changes in Brazilian conservation policy, specifically the Action Plan for the Prevention and Control of Deforestation in the Legal Amazon (PPCDAm), have been credited with significantly reducing the rate of deforestation. The objective of the CPI study was to identify which specific policy efforts contributed most to the reduction in Amazon deforestation.

The PPCDAm was the key conservation policy effort of the 2000s. One of the main changes it introduced was DETER, a satellite-based system that captures and processes georeferenced imagery of forest cover at 15-day intervals. It was developed by Brazil's National Institute for Space Research (INPE) for the Brazilian Institute for the Environment and Renewable Natural Resources (Ibama), the national environmental police and law enforcement authority. DETER is capable of detecting deforested areas larger than 25 hectares that are not covered by clouds.
The image illustrates how deforestation is monitored by DETER. Deforested areas are shown in purple and forest areas in green. For any given location, DETER compares recent images with older ones to identify changes in forest cover. DETER identifies deforestation hot spots and generates alerts for areas in need of immediate attention by Ibama enforcement personnel. Prior to DETER, deforestation monitoring depended on voluntary and anonymous reports of illegal activity. DETER enabled Ibama to monitor and quickly respond to illegal deforestation activity in near-real-time.
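The core of this kind of change detection is simple to sketch. The toy example below compares two forest-cover masks and raises an alert when the cleared area reaches DETER's 25-hectare threshold; it is illustrative only, not INPE's algorithm, and the assumed cell size of 6.25 ha corresponds to a 250 m pixel.

```python
import numpy as np

# Toy change detection in the spirit of DETER (illustrative, not INPE's code).
# Two boolean forest-cover masks from consecutive composites; each cell is
# assumed to cover 6.25 ha (a 250 m pixel), so a 4-cell patch reaches 25 ha.

CELL_HA = 6.25          # assumed area per pixel
MIN_ALERT_HA = 25.0     # DETER's minimum detectable deforested area

def deforestation_alert(older_mask, recent_mask):
    """Return the cleared-cell mask and whether it exceeds the alert threshold."""
    cleared = older_mask & ~recent_mask          # was forest, now is not
    cleared_ha = cleared.sum() * CELL_HA
    return cleared, cleared_ha >= MIN_ALERT_HA

older = np.array([[1, 1, 1], [1, 1, 1]], dtype=bool)
recent = np.array([[1, 0, 0], [1, 0, 0]], dtype=bool)
cleared, alert = deforestation_alert(older, recent)
print(cleared.sum(), alert)   # 4 cleared cells, 25 ha, so an alert is raised
```

The operational system of course works on georeferenced imagery and must also mask clouds, which is exactly the limitation the CPI study exploited.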
Impact of satellite imagery supporting near-real-time monitoring
The CPI study took advantage of the fact that cloud cover inhibited monitoring illegal logging in certain areas of the Amazon. The CPI analysis is based on comparing enforcement activities and deforestation rates in areas with greater cloud cover with areas with less cloud cover. It concluded that Ibama is systematically less present in municipalities with greater cloud cover in any given year and consequently these municipalities exhibit higher deforestation the following year.
The CPI analysis found that the presence of Ibama enforcement personnel (assumed to be proportional to the number of environmental fines applied in a municipality in a given year) significantly reduced deforestation the following year, showing that effective monitoring and law enforcement reduce deforestation.
To quantify this effect, CPI performed two simulations. In the first scenario, it was assumed that the annual number of fines in each municipality from 2007 through 2011 was equal to that observed in 2003, the year before the PPCDAm program was launched. This allowed CPI to compare deforestation rates in the absence of PPCDAm with what was actually observed, which reflects the PPCDAm program. This simulation suggests that without PPCDAm, the Amazon would have lost over 101,000 sq km to illegal deforestation from 2007 through 2011. The actual deforestation observed during this period was 41,500 sq km. This suggests that the PPCDAm policies conserved 59,500 sq km of Amazon forest.
In the second simulation, CPI assumed a scenario with no monitoring and enforcement; in other words, no fines were applied in any Amazon municipality from 2007 through 2011. It concluded that in the absence of any monitoring and law enforcement, over 164,200 sq km of forest would have been lost from 2007 through 2011, and that monitoring and enforcement therefore saved more than 122,700 sq km of the Amazon.
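The arithmetic behind the two counterfactuals can be checked directly (the figures are those reported by the study):

```python
# Check the arithmetic of CPI's two counterfactual scenarios
# (all figures in sq km, as reported in the CPI study)

observed = 41_500          # actually deforested, 2007 through 2011

# Scenario 1: fines frozen at 2003 (pre-PPCDAm) levels
no_ppcdam = 101_000
saved_by_ppcdam = no_ppcdam - observed
print(saved_by_ppcdam)     # area conserved by PPCDAm policies

# Scenario 2: no monitoring or enforcement at all
no_enforcement = 164_200
saved_by_enforcement = no_enforcement - observed
print(saved_by_enforcement)  # area conserved by monitoring and enforcement
```

Both differences match the conserved areas the study reports: 59,500 sq km attributable to PPCDAm overall and more than 122,700 sq km attributable to monitoring and enforcement.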
CPI makes the case that the adoption of a satellite-based system for real-time monitoring of deforestation and effective targeting of law enforcement activities reduced deforestation in the Brazilian Amazon. The sheer magnitude of the forest area that was preserved indicates that the relative impact of DETER-based monitoring and law enforcement was far greater than that of other conservation policies implemented under the PPCDAm framework.
CPI also concluded that the policy change had no effect on agricultural production supporting the conclusion that both preservation and economic growth can happen simultaneously.
Among other things, the MOU between the International Cartographic Association (ICA) and OSGeo provides for the establishment of Open Source Geospatial Laboratories and Research Centers (ICA-OSGeo Labs) across the world to support the development of open source geospatial software technologies, training and expertise.
The motto of the ICA-OSGeo Lab initiative is "Geo For All". It is intended that the combination of e-learning tools and open source GIS will increase access to GIS education. Free and open GIS software helps make geospatial education available to students from economically poor backgrounds and students in developing and poor countries.
At this year's GIS in the Rockies, James Fee gave the keynote, targeted at GIS professionals, on how to remain relevant in an age of change. He concluded that a GIS professional is a programmer, and he recommended learning not only how to use but also how to develop applications using the latest open source geospatial software.
To help people get started in open source geospatial software development, a fully funded OSGeo-India FOSS GIS 3-week Winter School at IIIT Hyderabad, India has just been announced.
Geospatial technology has made rapid strides in the last decade, but a lack of people with software development skills and access to source code has led to uses that are limited by the tools rather than by the needs of the application domains. For example, in urban planning, although the development of master plans can exploit these tools for macro-level planning, their integration with the meso- and micro-level needs of government/utility agencies and communities is hindered by the absence of appropriate GIS-based applications. Developing these requires not only knowing how to use geospatial software but also the ability to modify existing modules and create new ones for specific domains. This in turn requires access to source code and an understanding of the software development process. Combined with knowledge of geospatial science, this can lead to innovation and the development of new advanced technology for solving problems in specific domains.
To contribute to creating a community of developers with the necessary software development skills, a 3-week Winter School on “Open Source Development of Geospatial Technology – Community level planning” is being organised from 27th January to 15th February 2014. It is organized by the Open Source Geospatial Foundation - India (OSGEO-India) and the International Institute of Information Technology, Hyderabad (IIIT-H). The last date for applying for the course is 29th December, 2013.
The objectives of the course are:
To impart knowledge about the current open source software development practices, with a geospatial focus
To educate about geospatial science concepts and on how to use them for problem solving
To encourage course participants to think ‘outside-of-the-box’ by providing a platform for assessing and implementing innovative ideas.
The focus of the current winter school will be on integrating geospatial concepts into community level planning.
The organizers see this as the first initiative of its kind in India to encourage people to understand and modify a software tool, rather than be passive users of the toolset.
One of the organizers of the course is K.S. Rajan, Head of the Laboratory for Spatial Informatics at IIIT-H in Hyderabad. Dr. Rajan has just received the Indian National Geospatial Award 2013 from the Indian Society of Remote Sensing in recognition of his contribution to the field of geo-spatial science, technology and applications in India.
I blogged earlier about Google joining LocationTech as a strategic member (together with IBM, Oracle, Boundless (formerly OpenGeo), and Actuate).
After lining up key strategic members, the next step was to attract open source geospatial projects that wanted to take advantage of the governance and intellectual property services, including commercially friendly licensing, that LocationTech/Eclipse provides. LocationTech has recently received a number of project proposals that will put it on the leading edge of open source geospatial development.
SpatialHadoop
Hadoop provides scalable processing of huge datasets by using MapReduce, a programming paradigm for distributed processing, to build an efficient framework for processing large-scale data. SpatialHadoop is an extension to Hadoop that allows efficient processing of "big spatial data".
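The MapReduce pattern that SpatialHadoop builds on is easy to illustrate with a toy example. The sketch below bins points into grid cells using separate map and reduce phases in plain Python; no Hadoop is involved, and the point data is made up.

```python
from collections import defaultdict

# Toy illustration of the MapReduce pattern SpatialHadoop builds on:
# map each point to a grid-cell key, then reduce by summing counts per cell.
# Pure Python stand-in, not SpatialHadoop's actual API.

def map_phase(points, cell_size=1.0):
    """Emit (cell, 1) pairs - analogous to a Hadoop mapper."""
    for x, y in points:
        yield (int(x // cell_size), int(y // cell_size)), 1

def reduce_phase(pairs):
    """Sum counts per cell - analogous to a Hadoop reducer."""
    counts = defaultdict(int)
    for cell, n in pairs:
        counts[cell] += n
    return dict(counts)

points = [(0.2, 0.7), (0.9, 0.1), (1.5, 0.5), (1.1, 1.9)]
print(reduce_phase(map_phase(points)))
# {(0, 0): 2, (1, 0): 1, (1, 1): 1}
```

In a real Hadoop deployment the map and reduce phases run in parallel across a cluster, and SpatialHadoop adds spatial indexes and spatial operators on top of this machinery.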
GeoMesa
Accumulo is a robust, scalable, high-performance data storage and retrieval system based on Google's BigTable design and built on Hadoop. GeoMesa provides a foundation for storing, querying, and transforming spatio-temporal data in Accumulo. It implements GeoTools interfaces that enable GeoServer and other GeoTools-based projects to use Accumulo as a data store.
GeoTrellis
GeoTrellis is a general framework for low-latency geospatial data processing developed using Scala and Akka. GeoTrellis was designed to solve three core problems, with an initial focus on raster processing: scalable, high-performance geoprocessing web services; distributed geoprocessing services for large data sets; and parallelizing geoprocessing operations to take full advantage of multi-core architectures.
GeoGit
I blogged about GeoGit recently because it is a project to develop distributed long transactions for geospatial data. GeoGit is a distributed version control system specially designed to handle geospatial data efficiently. It takes inspiration from the source code versioning system Git, but has an approach suited to spatial data (currently Shapefiles, PostGIS, MS SQL Server, or SpatiaLite). GeoGit efficiently handles very large binary data and is optimized for spatial data using a spatial index.
JTS Topology Suite
JTS Topology Suite (JTS) is an open source Java software library that provides an object model for planar geometry together with a set of fundamental geometric functions. JTS conforms to the Simple Features Specification for SQL published by the Open GIS Consortium. JTS provides the core component for vector-based geomatics software and can also be used as a general-purpose computational geometry library.
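To give a flavour of the "fundamental geometric functions" a library like JTS provides, here is a minimal pure-Python version of one of them, the point-in-polygon test. The function name and the ray-casting approach are my own sketch; JTS's Java implementation is far more general and numerically robust.

```python
# A minimal ray-casting point-in-polygon test, the kind of fundamental
# geometric function JTS provides (JTS's Java version is far more robust).

def point_in_polygon(px, py, ring):
    """Return True if (px, py) lies inside the polygon given as (x, y) vertices."""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        # Does a horizontal ray from (px, py) cross the edge (x1,y1)-(x2,y2)?
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon(2, 2, square))   # True
print(point_in_polygon(5, 2, square))   # False
```

JTS bundles dozens of such primitives (intersection, buffer, union, spatial predicates) behind a consistent geometry object model, which is why so much vector GIS software builds on it.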
Geoff (Geo Fast Forward)
Geoff is intended to enable existing Eclipse Rich Client Platform (RCP) applications to integrate with the geospatial domain, using lightweight components to visualize geospatial information on a geographical map embedded in the RCP application.
At this year's GIS in the Rockies, James Fee gave an enthralling keynote targeted at GIS professionals on the topic of how to remain relevant in the age of change (or What GIS Pros Can Do to Keep Their Skills in Demand).
He went through his own discovery of maps and technology, beginning with atlases and continuing with many of the key technologies of the past decades, most of which I can remember, such as personal computers (the Atari 800), the Logo programming language, Turbo Pascal, HyperCard, Freehand, the Hamster Dance (which I don't remember), the first releases of Arc/Info, Arc Macro Language (AML), spatial databases, the Internet, Sun SPARCstations and Solaris, ArcView and the Avenue scripting language, mobile devices, big data, real-time, Perl, Python, spatial analytics, UAVs and 3D cities. His main point was that geospatial folks have always been at the forefront in adopting new technologies and that GIS has a history of pushing the envelope. His recommendation for GIS professionals is Python
("Python is the bee's knees"). James is a major baseball fan and he asserted that if you like GIS, you like statistics, and furthermore, if you like statistics, you like baseball. He gave a very simple example of using Python and several Python mathematical libraries to generate an interesting perspective on current baseball standings.
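That kind of example is easy to recreate. The snippet below is my own illustration, not James's actual code: it uses Python to rank teams by Bill James's classic Pythagorean win expectation, computed from made-up season run totals.

```python
# Pythagorean win expectation, a classic sabermetric formula, as an
# illustration of Python for baseball statistics (not James Fee's actual demo).

def pythagorean_win_pct(runs_scored, runs_allowed, exponent=2):
    """Expected winning percentage from runs scored and runs allowed."""
    return runs_scored ** exponent / (
        runs_scored ** exponent + runs_allowed ** exponent)

# Hypothetical season totals: (runs scored, runs allowed)
teams = {"Team A": (750, 650), "Team B": (700, 700), "Team C": (600, 720)}

# Rank teams by expected winning percentage, best first
for name, (rs, ra) in sorted(teams.items(),
                             key=lambda kv: -pythagorean_win_pct(*kv[1])):
    print(f"{name}: {pythagorean_win_pct(rs, ra):.3f}")
```

Pulling real standings into a structure like `teams` and adding a plotting library is all it takes to turn this into the kind of "interesting perspective" James showed.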
To answer the title question of what a GIS professional should do to stay relevant, James asserted that if you
Put points on a map and throw up a scale bar
Perform geoprocessing without Python or Model Builder
Have a job description of “Plotter Operator”
Have no idea what “fuzzy tolerance” is
you will soon be out of a job. On the other hand if you
Embrace Python as your GIS tool of choice.
Use Model Builder to automate your work flows.
Learn new tools such as TileMill/Mapnik/PostGIS
(all open source geospatial, I would add) your future is bright.
One of the key enabling technologies introduced in the early days of AM/FM/GIS systems was long transactions, also known as data versioning. Long transactions make it possible to store spatial design data in the same database as as-built data, enabling design engineers to work on new designs at the same time as operations staff are relying on as-built information to operate and maintain active utility networks.
The first commercial systems implementing this technology in the early 1990s were IBM's GFIS, which ran on DB2, Intergraph FRAMME, and VISION*, which managed versioned spatial data in an Oracle RDBMS (this was long before Oracle Spatial and Workspace Manager). As I remember, San Diego Gas & Electric and BC Hydro were GFIS sites. The largest FRAMME site was Ameritech, now part of AT&T. The largest VISION* sites, which are still running and supporting about a thousand concurrent designers, are US West, now CenturyLink, and Telesp, now Telefonica Sao Paulo.
Smallworld implemented long transactions in 1996. Since then both Oracle (Workspace Manager) and ArcSDE (now ArcGIS) have implemented support for long transactions.
Each of these early systems implemented long transactions in a quite different way. For example, in some the number of versions at any given time is assumed to be small, but the number of objects in each version is large. In others the number of versions can be large, and the number of objects per version is small.
Recently there has been renewed interest in spatial long transactions, with a focus on managing them in a distributed data environment. It is being proposed that the underlying version management platform be based on the same principles as Git, which has become the de facto standard in the open source community. Git was originally developed by Linus Torvalds for source code management for Linux. Google, Facebook, Microsoft, Twitter, LinkedIn, Qt, Gnome, and Android are some of the folks using Git. Millions of open source projects are hosted on GitHub. The GeoGit project is an attempt to apply a Git-inspired version management system to spatial data in a distributed environment. Currently, users can import raw geospatial data as Shapefiles, PostGIS or SpatiaLite into a repository where any changes to the data are tracked. These changes can be viewed in a history, reverted to older versions, branched into sandboxed areas, merged back in, and pushed to remote repositories. GeoGit, still in early development or alpha mode, is written in Java and is available under the BSD License.
One of the use cases GeoGit is intended to support is a hybrid data management process involving an authoritative database that can be updated with crowdsourced data. It is being proposed that with distributed versioning, it is possible to have the best of the crowdsourced (e.g., OpenStreetMap) and authoritative (e.g., TIGER) worlds. "Data stewards could maintain their own data repositories for their particular areas of interest. They could also apply the same quality assurance checks as the central authority, which would enable their data to be easily pulled in to the central database. However, the central authority need not be involved in order for partnering stewards to exchange updates with one another. These inter-steward exchanges would also facilitate quicker updating to the central authority, because there would be fewer conflicts during the merging of the updates the various stewards submit."
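Git's core trick, which GeoGit borrows, is content addressing: every object is stored under the hash of its own content, so unchanged features are shared between versions and any edit produces a new object identity. The toy sketch below shows the idea; it is my own illustration, not GeoGit's actual storage format.

```python
import hashlib
import json

# Toy content-addressed store in the Git/GeoGit spirit: features are stored
# under the SHA-1 of their canonical serialization, so identical content
# always maps to the same object id and any edit yields a new one.
# (Illustrative only; not GeoGit's actual object model.)

store = {}

def put(feature):
    """Store a feature dict under the hash of its canonical JSON form."""
    blob = json.dumps(feature, sort_keys=True).encode()
    oid = hashlib.sha1(blob).hexdigest()
    store[oid] = feature
    return oid

road_v1 = {"type": "road", "name": "Main St", "geom": [[0, 0], [1, 1]]}
road_v2 = {"type": "road", "name": "Main Street", "geom": [[0, 0], [1, 1]]}

id1, id2 = put(road_v1), put(road_v2)
print(id1 != id2)          # the edit produced a new object id
print(put(road_v1) == id1) # identical content hashes to the same id
```

This is what makes the steward workflow above practical: two stewards who have applied the same change hold identical object ids, so their repositories can be compared and merged without the central authority mediating.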
At this year’s AGI GeoCommunity '13 Conference in Nottingham, Iain Langlands, GIS Manager, Glasgow City Council gave an overview of the £24 million Future City / Glasgow program, which will demonstrate how technology can make life in the city smarter, safer and more sustainable.
Glasgow had to compete with 29 other cities to win the Technology Strategy Board's ‘Future Cities Demonstrator'. "The city will demonstrate how providing new integrated services across health, transport, energy and public safety can improve the local economy and increase the quality of life of Glasgow's citizens, and will allow UK businesses to test new solutions that can be exported around the globe." The Technology Strategy Board is the innovation agency of the UK national government.
The city has started a program that will build a new operations centre, establish a city technology platform and develop innovative demonstrator projects across four themes: public safety, transport, health and energy.
For example, on the theme of travel/transport, there are initiatives aimed at reducing congestion through route optimization and other technologies. On the energy theme, there is an initiative focussed on intelligent buildings, smart buildings that can adjust their own lighting and heating based on analytics using a variety of sensor data. In the area of health there is a program to improve social mobility; 70% of Glasgow's neighbourhoods are below the national norm for social deprivation and this program is designed to help address this issue. In the area of public safety, there are initiatives involving closed circuit TV for monitoring streets and public areas and intelligent street lighting.
One of the "motherhoods" of the Glasgow smart city initiative is that data collected by the city will be open. This will enable intelligent operations and real-time analysis, not only by the city but also by developers and entrepreneurs, and is intended to promote research and the development of new businesses. The data will be available through a public portal.
Citizens are viewed as mobile sensors, so a lot of the data is crowdsourced. OpenStreetMap is the source for much of the street data. An example of an app that will be important in empowering citizens is the MyGlasgow app, which allows citizens to report problems such as potholes. A unique aspect of Glasgow's app is that it asks the citizen reporting the pothole to assess the risk the pothole represents. Rather surprisingly to some, based on the feedback received so far, the man in the street's assessment of risk is not very different from a professional road inspector's.
Empowering citizens to put all residents at the forefront of technology integration and application is key to the Future City Demonstrator, as well as collaboration between stakeholders. A small team of specialists are working on projects designed to show how technology can improve life in the city, helping to inform and connect its citizens, and help move towards a more self-reliant and sustainable society. Geospatial technology will be central to the success of the program. The geospatial platform for the city's technology appears to be primarily open source and includes MapBox, TileMill, and Quantum GIS.
I blogged previously about the OGC GeoPackage specification being available for public comment. Chris Holmes of OpenGeo, who has been a major contributor to the open source geospatial community since joining the GeoServer team as lead developer in 2002, has posted an illuminating blog post about his experience in working with the Open Geospatial Consortium (OGC) standards specification process on the GeoPackage specification.
Back in February Chris was unhappy with the GeoPackage specification as it then stood, so much so that he decided to join the GeoPackage Standards Working Group (SWG), "participating in weekly (and then twice a week) calls, and trying to work with the OGC workflow of wikis and massive Word documents." One of his goals was to learn how the OGC process actually works and to be able to "offer some suggestions for improvement from my open source software experience."
This is very important from an OGC perspective for two reasons. First, open source geospatial is an important and growing segment of the geospatial software community; secondly, "open source loves standards", and historically the first implementations of a standard and the first adopters have often come from the open source community.
In his blog post Chris reported that the OGC staff has been "great" about being open to new ways of working. In particular he was very excited that the SWG had achieved an OGC first by putting the GeoPackage specification out on GitHub, which is expected to make it much more accessible to the open source community than if it had been made available in the traditional OGC way (wikis and Word docs).
Secondly he is happy with the specification that the OGC standards process has generated. In his own words he believes that the specification is ‘pretty good’ as it is right now, but that he expects that the GeoPackage specification will improve as a result of real implementations. He says he would like to see three full implementations before the GeoPackage V 1.0 standard is fully adopted by the OGC.
For the OGC putting the specification on GitHub is an experiment, so Chris has asked the community, especially the open source folks familiar with working with GitHub, to become involved in helping to improve the specification. For folks without GitHub experience the SWG has written some tips on how to contribute to the GeoPackage specification without having to learn git.
In summary, this sounds very positive, particularly because it comes from an open source geospatial perspective. I think it bodes well for more direct involvement of the open source geospatial community in OGC standards specification processes in the future. This would be a win-win for the OGC and the open source geospatial community.