USGIF GotGeoint Blog USGIF promotes geospatial intelligence tradecraft and a stronger community of interest between government, industry, academia, professional organizations and individuals focused on the development and application of geospatial intelligence to address national security objectives.
Canada signed the Copenhagen Accord in December 2009 and committed to reduce its greenhouse gas emissions to 612 Mt, or 17 percent below 2005 levels (737 Mt), by 2020.
Canada’s total greenhouse gas (GHG) emissions in 2011 were 702 megatonnes (Mt) of carbon dioxide equivalent (CO2 eq). Canada’s emissions growth between 1990 and 2011 was driven primarily by increased emissions from the fossil fuel industries and transportation. Emission reductions from 2005 to 2011 were driven primarily by reduced emissions from electricity generation and manufacturing.
Based on 2011 emissions and existing regulations, the Government of Canada estimates the country will produce 734 Mt of greenhouse gases in 2020, or 122 Mt higher than its commitment target in Copenhagen.
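The Copenhagen arithmetic above can be checked directly; a quick sketch using only the figures quoted in the text (737 Mt baseline, 17% reduction, 734 Mt projection):

```python
# Canada's Copenhagen commitment: 17% below 2005 emissions (737 Mt) by 2020.
base_2005 = 737.0                    # Mt CO2 eq, 2005 baseline
target_2020 = round(base_2005 * (1 - 0.17))
print(target_2020)                   # 612 Mt, matching the stated target

projected_2020 = 734.0               # Mt, government projection from 2011 data
gap = projected_2020 - target_2020
print(gap)                           # 122 Mt above the commitment
```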
However, the contribution from different sectors to the total has changed. Changes by sector include:
Energy - stationary combustion, excluding electricity: +1.6%
Energy - fugitive emissions: +12.7%
Industrial processes: -0.4%
National inventory total: 0.0%
Emissions in the electricity generation sector dropped by 6.1% due to continued declines in electricity demand and changes in the generation fuel mix. This was offset by increases in fugitive emissions (12.7%) resulting from increased production activity in the coal mining and natural gas sub-sectors and increases in stationary energy excluding electricity (1.6%), transport (3.1%) and agriculture (3.0%).
The U.S. Environmental Protection Agency (EPA) has released annual greenhouse gas emissions data for 2012 collected under the Greenhouse Gas Reporting Program (GHGRP). The program collects annual greenhouse gas
information (CO2, CH4, N2O, SF6, NF3, HFCs, PFCs) from facilities in the largest emitting
industries, including power plants, oil and gas production and refining,
iron and steel mills, and landfills. For 2012, 7,809 facilities in nine industry sectors reported direct emissions to the atmosphere.
Total emissions were 3.13 billion tonnes of carbon dioxide equivalent (CO2e), about half of total U.S. greenhouse gas emissions. This represents a 4.5% decrease from the emissions reported in 2011, even as U.S. GDP grew by 2.8% in 2012.
Power plants are represented by 1611 facilities in the database. Together they emitted 2.090 billion tonnes of CO2e in 2012, about 40 percent of
total U.S. carbon pollution. The 2012 GHGRP data show that power plant emissions decreased by 6.3% from 2011 to 2012 or about 10% since 2010 when reporting began. The EPA ascribes the emissions drop primarily to the switch from coal to natural gas for electricity generation.
Transparency about the state of the environment in Brazil is a key goal of IBAMA. As an example, IBAMA teamed with the National Institute for Space Research (INPE) to use satellite imagery to monitor, make public and control deforestation in the Amazon. When the program was initiated in 2004, illegal deforestation was increasing dramatically. The goal of the program was to reduce illegal deforestation by 80%. The program has been successful beyond expectations and is now six years ahead of schedule in reducing illegal deforestation. In the graph, the green line is actual deforestation and the yellow line represents the objective.
A related program monitors an embargo on products made from illegally cut timber, which has also contributed to reducing illegal logging.
The Air Resources Board (ARB) held its fourth auction of greenhouse gas allowances on August 16, 2013. The auction included a Current Auction of 2013 allowances and an Advance Auction of 2016 allowances.
The Auction Administrator (AA) reported that the 2013 auction clearing price was $12.22 per allowance, with 13,865,422 total 2013 allowances sold. The AA reported that the 2016 auction clearing price was $11.10 per allowance, with 9,560,000 total 2016 allowances sold.
According to a report in the press, about 350 businesses and cities took part. Allowances for 2013 emissions, valued at $170 million, sold out. Additionally, more than $100 million in 2016 emission credits were sold this quarter. California's cap-and-trade program came into effect in January. The first auction was held in November of last year.
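The press figures are consistent with the clearing prices and volumes reported by the Auction Administrator; a quick consistency check:

```python
# Revenue = allowances sold x clearing price, for each auction.
current_2013 = 13_865_422 * 12.22    # 2013 vintage allowances
advance_2016 = 9_560_000 * 11.10     # 2016 vintage allowances
print(round(current_2013 / 1e6, 1))  # ~169.4, i.e. about $170 million
print(round(advance_2016 / 1e6, 1))  # ~106.1, i.e. more than $100 million
```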
AB32 is the legislation that mandated carbon trading for about 350
companies in California. In the first two years 90 percent of the
credits are free. In other words, based on their emissions history
companies get 90% of their emission allowances for free from the
state. But by 2016, all allowances will be sold. Participants have to buy credits if they emit over 25,000 tons of carbon dioxide per year. Between 2013 and 2020, the number of greenhouse gas credits offered will decrease by 2-3% per year, which is expected to drive up the price of the allowances.
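A rough sketch of how that declining cap compounds, assuming for illustration a constant 2% or 3% annual reduction over the seven annual steps from 2013 to 2020:

```python
# Cumulative reduction in allowances offered, 2013-2020,
# for a constant annual decline of 2% and 3%.
for rate in (0.02, 0.03):
    remaining = (1 - rate) ** 7          # seven annual steps
    print(rate, round(1 - remaining, 2)) # cumulative reduction
```

So the cap falls roughly 13% to 19% cumulatively over the period, which is the mechanism expected to push allowance prices up.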
The program is designed to be compatible with cap-and-trade programs in
the Western Climate Initiative (WCI) which includes six U.S. states
(California, Montana, New Mexico, Oregon, Utah and Washington) and four
Canadian provinces (British Columbia, Manitoba, Ontario and Quebec).
Four major independent datasets show 2012 was among the 10 warmest years on record.
Minimum Arctic sea ice extent in September and Northern Hemisphere snow cover extent in June each reached new record lows. Arctic sea ice minimum extent (1.32 million square miles, September 16) was the lowest of the satellite era and 18 percent lower than the previous record low extent of 1.61 million square miles that occurred in 2007.
A new melt extent record occurred July 11-12 on the Greenland ice sheet when 97 percent of the ice sheet showed some form of melt, four times greater than the average melt for this time of year.
The Antarctic maximum sea ice extent reached a record high of 7.51 million square miles on September 26. This is 0.5 percent higher than the previous record high extent of 7.47 million square miles that occurred in 2006.
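Both percent figures follow directly from the extents quoted above; a quick check:

```python
# Percent change relative to the previous record extent, for each pole.
arctic = (1.61 - 1.32) / 1.61       # 2012 minimum vs 2007 record low
antarctic = (7.51 - 7.47) / 7.47    # 2012 maximum vs 2006 record high
print(round(arctic * 100))          # ~18 percent below the 2007 record
print(round(antarctic * 100, 1))    # ~0.5 percent above the 2006 record
```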
Four independent datasets indicate that the globally averaged sea surface temperature for 2012 was among the 11 warmest on record.
Heat content in the upper 2,300 feet, or a little less than one-half mile, of the ocean remained near record high levels in 2012. Overall increases from 2011 to 2012 occurred between depths of 2,300 to 6,600 feet and even in the deep ocean.
Sea level reached a record high: Globally, sea level has been increasing at an average rate of 3.2 ± 0.4 mm per year over the past two decades.
Continuing a trend that began in 2004, oceans were saltier than average in areas of high evaporation, including the central tropical North Pacific, and fresher than average in areas of high precipitation, including the north central Indian Ocean.
Global tropical cyclone activity during 2012 was near average, with a total of 84 storms, compared with the 1981–2010 average of 89. Similar to 2010 and 2011, the North Atlantic was the only hurricane basin that experienced above-normal activity.
Major greenhouse gas concentrations, including carbon dioxide, methane, and nitrous oxide, continued to rise during 2012. Global CO2 emissions from fossil fuel combustion and cement production reached a record high in 2011 of 9.5 ± 0.5 petagrams of carbon. A new record of 9.7 ± 0.5 petagrams of carbon is estimated for 2012.
Atmospheric CO2 concentrations increased by 2.1 ppm in 2012, reaching a global average of 392.6 ppm for the year. In spring 2012, for the first time, the atmospheric CO2 concentration exceeded 400 ppm at several Arctic observational sites.
Cool temperature trends continue in Earth's lower stratosphere: The average lower stratospheric temperature, about six to ten miles above the Earth's surface, was record to near-record cold in 2012. (Increasing greenhouse gases and declining stratospheric ozone tend to cool the stratosphere while warming the planet's near-surface layers.)
Impact of humans
A draft of the United Nations IPCC Fifth Assessment Report (AR5) leaked to the New York Times says that it is "extremely likely" that humans caused "more than half of the observed increase in global average surface temperature from 1951 to 2010." The draft report predicts that sea levels will rise by between 29 and 82 cm by the end of the century, with scientists "fairly confident" that sea level will be closer to the upper limit. The report is being circulated for final review and will be released in stages between September 2013 and November 2014.
According to the International Energy Agency (IEA), the energy sector accounts for two-thirds of greenhouse gas emissions. Energy-related CO2 emissions grew by 1.4% to reach a record high of 31.6 Gt in 2012.
CO2 emissions by fuel type
According to the IEA, in 2010, 43% of CO2 emissions from fuel combustion were produced from coal, 36% from oil and 20% from gas. Between 2009 and 2010, CO2 emissions from the combustion of coal increased by 4.9% and represented 13.1 GtCO2. Without additional abatement measures, the IEA projects that emissions from coal will grow to 15.3 GtCO2 in 2035.
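The projected growth from 13.1 to 15.3 GtCO2 over 2010-2035 implies only a modest compound annual rate; a quick derivation from the figures in the text:

```python
# Compound annual growth rate implied by the IEA coal projection,
# from 13.1 GtCO2 in 2010 to 15.3 GtCO2 in 2035 (25 years).
cagr = (15.3 / 13.1) ** (1 / 25) - 1
print(round(cagr * 100, 2))   # ~0.62 percent per year
```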
CO2 emissions by sector
Generation of electricity and heat was by far the largest producer of CO2 emissions and was responsible for 41% of world CO2 emissions in 2010. Worldwide, this sector relies heavily on coal, the most carbon-intensive of fossil fuels, amplifying its share in global emissions. Countries such as Australia, China, India, Poland and South Africa produce between 68% and 94% of their electricity and heat through the combustion of coal.
Between 2009 and 2010, total CO2 emissions from the generation of electricity and heat increased by 5.6%, while the fuel mix remained unchanged. CO2 emissions from coal increased by 4.7% and from natural gas by 9.5%.
Among the five largest emitters of CO2 in 2010, China, the Russian Federation and the United States reduced their CO2 emissions per unit of GDP between 1990 and 2010.
Emissions per capita
Among the five largest emitters in 2010, the levels of per-capita emissions ranged from 1 tCO2 per capita for India to 5 tCO2 per capita for China to 17 tCO2 per capita for the United States. In 2010, the United States alone generated 18% of world CO2 emissions with a population of less than 5% of the global total. China's share of world emissions (24%) was roughly in line with its share of world population (20%). India, with 17% of the world's population, contributed more than 5% of CO2 emissions.
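The ratio of a country's emissions share to its population share sketches how far it sits from the world per-capita average (illustrative only; the shares are those quoted above, and a ratio above 1 means above-average per-capita emissions):

```python
# (emissions share %, population share %) for the three countries above.
shares = {"United States": (18, 5), "China": (24, 20), "India": (5, 17)}
for country, (em, pop) in shares.items():
    print(country, round(em / pop, 1))
# United States ~3.6x the world average, China ~1.2x, India ~0.3x,
# consistent with the 17 / 5 / 1 tCO2 per-capita ordering in the text.
```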
TransnetBW, a grid operator in Baden-Württemberg, has begun operating an environmentally friendly transformer that uses vegetable oil as a coolant. The device will link the 380-kV ultra-high voltage level
with the 110-kV grid in the Bruchsal-Kändelweg substation plant near
Karlsruhe. The transformer weighs just under 340 tons and contains 100 tons of vegetable oil.
Transformers are usually cooled and insulated with mineral or silicone oil. These oils are harmful to the environment and highly flammable. Vegetable oils are less flammable and environmentally friendlier. The vegetable oils used (rapeseed, soy or sunflower) are biodegradable and have a much higher flashpoint. This means that a vegetable oil transformer can be operated without additional protective equipment such as collecting tanks, even in zones with strict environmental requirements. Vegetable oil transformers are also much safer in densely populated residential areas.
According to Jack Dangermond, who led the plenary sessions on the first day, 12,000 people from 130 countries were in San Diego this year for the 33rd Esri International User Conference.
I gravitated primarily to the utility sessions. According to Bill Meehan, who is responsible for ESRI's utility business, there were 750 utility folks at this year's UC.
GIS is everywhere
GIS is being used in diverse areas including environmental monitoring, climate change, permafrost melting, sea level rise, agriculture, water resources, aquaculture, pollution remediation, energy resources management, land information systems, urban design, utility network management, wasteshed modeling, utility pole inspection, sewer rehabilitation, port management, indoor GIS, insurance risk management, health-related mapping, law enforcement, mapping where space debris might fall, flood risk mapping, storm tracking and damage assessment, geodesign, crowdsourcing for collecting a variety of data, interoperability, citizen engagement, government transparency, cartography, geology and mining, automatic generalization of different scale maps, story maps, portals for infrastructure, and open data, to mention just a few of the things that were singled out in the plenary session.
Some of the areas that were singled out for special recognition were:
The Special Achievement in GIS award went to the Dartmouth Atlas for showing geographically how outcomes and the cost of medical procedures vary depending on location across the U.S.
The Hong Kong Lands Department received the Enterprise GIS award for its land management system, which it started in the 1990s and is now in its second generation.
The President's Award went to Direct Relief, which distributes different types of equipment to people in need around the globe with 99% efficiency, according to Forbes.
The theme of this UC was GIS transforming our world, not only changing the physical world as we know it, but also changing our perception of the world and how GIS has helped and will fundamentally help us collectively build a better future. A classic example that has impacted everyone on the planet is GPS or, more generally, GNSS. As a number of speakers including Dangermond said, we are never lost any more, and that is a fundamental change in how we perceive the world.
Some of the major themes that emerged at this UC are common to many
software and service providers in the IT sector, but some of these are
new or unique to ESRI.
We are facing serious global challenges, in particular climate change, which both at the Geodesign Summit and at this UC appears to be a personal challenge for Dangermond, who sees GIS as contributing in a major way to creating a more sustainable future. He sees GIS helping to change how we think and act and how we do basic things such as design.
GIS across the organization
A theme that was pervasive throughout the conference including in the utility sessions was the need to make GIS pervasive across the organization because 90% of the data that organizations are collecting and managing has location. This was also true in utility sessions I attended, where the thrust is to expand the use of GIS outside of the traditional area of records management into other departments.
Vertical industry "templates"
In the utility and telecommunications sectors there is a major effort underway to create preconfigured "templates" for specific vertical industries. In the utility sector, one was announced yesterday for the electric power industry, and others are underway for gas and telecommunications. These templates are a collection of data and tools for vertical industries and are available as open source on GitHub.
The web was pervasive. It was hard to find a session where the theme that desktop GIS is being transformed into web GIS did not come up. This doesn't mean that the desktop is going to go away in the near future, but sharing data, applications, and apps is recognized as a common goal, and web GIS makes this very easy to do. This includes web portals for accessing enterprise data over the web.
A lot of the traditional desktop GIS capabilities are now being made available in the cloud, so that all you need is an iOS or Android device or a web browser. This is also available to large enterprises that need to keep their data on their own servers for security reasons. Cindy Salas from CenterPoint Energy demonstrated ESRI's cloud capability running "on premise" completely within CenterPoint's firewall.
Being able to integrate location and the spatial dimension into enterprise applications such as ERP (SAP), business intelligence (IBM Cognos), Salesforce, MS Dynamics, and Excel spreadsheets is a new focus for ESRI. They have already created plugins for several well-known applications.
In the past when you talked about mobile in the context of utilities, you were normally dealing with Panasonic Toughbooks, which could cost up to $6,000 each. At this conference just about every application was able to run on a variety of low-cost mobile devices, including iOS and Android devices. One of the sessions I attended was about the National Broadband rollout in Australia, where the 500 field folks doing 300,000 pit inspections were all equipped with iPads, which you might have thought were too fragile for pretty rough field work. This project has been underway for something like 18 months, and to date only two of the iPads have been damaged.
Spatial data volumes, whether from sensors at utilities, from LIDAR scanning of transmission lines, or from earth observation satellites, are now reckoned in terabytes. There were sessions on spatially enabling Hadoop, SAP HANA and other big data management systems for managing these data volumes.
In the utility track, managing real-time data from sensors on transformers, smart meters, phasors, and other devices and enabling near real-time decision making was a common theme. A general event processing capability that can be triggered by rules has been added to the quiver to make this possible for any data stream.
In the area of imagery, Digital Globe imagery can be accessed within 4-5 hours of acquisition.
Big data and real-time means that we need to be able to understand what these huge volumes of data are telling us and then enable decision-making, automated or human mediated, as rapidly as possible. If a transformer is running hot, we can't wait a week until the next service check to reconfigure the network, because by then we may have reduced its projected lifetime by 10%. There were "location analytics" sessions in most of the vertical industry sessions, including utilities.
Software companies like ESRI and Hexagon are now in the content business. And with the data being made available in the cloud, you can access it through web GIS or desktop GIS. The data includes traditional topographic maps, digital terrain models, imagery from earth observation satellites from Digital Globe and others, some of it near real-time (for example, from Digital Globe within 4-5 hours of acquisition), demographics, and more.
Dangermond's term for this is the "living atlas" because this body of content is intended to be dynamic, real-time and comprehensive. The goal seems to be all the data you would need to manage the planet.
Some of the content explicitly mentioned includes:
30 cm coverage of the US
60 cm coverage of Western Europe
imagery for the Middle East and Asia, in total covering 2/3 of the world
demographics, at the zip code, county, state, and national levels
scientific maps including environmental, energy, infrastructure, and terrain models
This data is accessible to web and desktop applications and makes it very easy to do standard spatial analytical things like multi-criteria suitability or site selection mapping. Users can also publish their own data to the cloud.
This has always been a strong focus area for ESRI and from Dangermond's perspective with the challenges the world is facing, GIS professionals will be even more critical in the future. There is focus on advancing spatial literacy to a broader audience than in the past.
Laser scanning is a major source of important data for the construction industry (right of way determination and construction progress monitoring), utilities (transmission vegetation management), and other industries. ESRI has made this a focus area and there were a number of sessions devoted to managing and analyzing point cloud data.
Dangermond specifically singled out 3D as a major capability area for ESRI. There are specialized 3D products, but 3D is also being built into ESRI main stream GIS and spatial analytical products.
3D city models
At the plenary there was a demonstration of how to quickly build a 3D city model from 2D data, including extruding buildings from a 3D building footprint and height, adding textures depending on building types, and then tools for analysis such as zoning and shadow analysis and visualization including flythroughs. These models are sharable on the cloud as web scenes requiring only a web browser (with no plugins) or iOS mobile devices.
At the UC ESRI is demonstrating a prototype of a product designed specifically for geodesign.
Dangermond specifically mentioned the newest national open data initiative in Peru.
There are sessions specifically on editing and manipulating OpenStreetMap data.
At the INSPIRE conference in Florence, Ewa Klien, Joachim Rix and Didier Vancutsen gave an overview of an interesting EU project called Plan4business. Plan4business is designed to address the challenges faced by urban and regional planners today. Datasets are not easy to use for business issues such as comparative analysis, monitoring and analysing urban statistics, or developing urban inquiries and projects. Private and public sector researchers, planners and professionals from the real estate world, insurance, energy, agriculture, and others, as well as investors, realize that they need to develop the ability to use spatial datasets from many sources to perform complex analysis and then visualize the results for non-technical decision makers. The Plan4business project is aiming at developing a web-based platform which will offer urban and regional planning data users a full range of compatible planning data and services such as transport and energy infrastructure, regional, environmental, and zoning planning. The Plan4business project started in April 2012 and will continue through March 2014.
The Plan4business platform is intended to be competitive from a business perspective. It is intended that the platform will provide content as well as analysis and visualisation services via both an Application Programming Interface (API) for application developers and an interactive web frontend for end users. Functions offered will have to range from simple statistical analysis to complex trend detection and to 2D/3D representations. The Plan4business platform is intended to provide opportunities for planners, private consultants, public and private data providers, and researchers.
To provide some idea of the type of planning projects that are envisaged as using this platform, a number of operational use cases have been defined for both the private and the public sectors.
Spatial Planners, Planning engineers
Banks and Insurance Services
Energy and environmental services
Tourism and Travel Services
Transport and Logistics Services
Spatial Planning Authorities
Regional Development Agencies
Other Public Services
Public researchers / Universities
Energy and environmental services
For example, the Energy and environmental services use case involves an environmental agency that needs to design and propose a number of policies for managing and moderating noise levels in the urban realm, in both residential settlements and productive/industrial areas. Policy designers have to assess criteria, together with competent local authorities, to limit or contain the noise levels to which humans are exposed, particularly in built-up areas, in public parks or other quiet areas in an agglomeration, in quiet areas in open country, and near schools, hospitals and other noise-sensitive buildings and areas.
The world is moving toward ubiquitous real-time automated monitoring of environmental parameters such as air quality and water level.
The Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) is applicable to use cases in which sensor data needs to be managed in an interoperable way. This standard defines a Web service interface which allows querying observations, sensor metadata, and representations of observed features.
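As a minimal illustration of the kind of query SOS supports, the sketch below builds a key-value pair (KVP) GetObservation request URL. The endpoint, offering and observed-property names are hypothetical placeholders, and real services may require additional parameters:

```python
from urllib.parse import urlencode

# Hypothetical SOS endpoint; replace with a real service URL.
ENDPOINT = "http://example.org/sos"

# Core KVP parameters for a SOS 2.0 GetObservation request.
params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "water_level_network",   # hypothetical offering id
    "observedProperty": "WaterLevel",    # hypothetical property id
}

url = ENDPOINT + "?" + urlencode(params)
print(url)
```

An HTTP GET on a URL of this shape would return an encoded set of observations, which is the interaction the INSPIRE compliance work discussed below builds on.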
The INSPIRE equivalent is part of the INSPIRE Regulation on Network Services, covering the operation of a Download Service.
Today at the INSPIRE conference, Arne Bröring of 52°North gave an overview of an effort to make SOS compliant with INSPIRE. It appears that this will be feasible without any change to the existing SOS and INSPIRE standards. It is planned that an INSPIRE Technical Guidance document will be extended to include SOS compliance. In addition, an open source implementation is planned.