USGIF GotGeoint Blog
The UK Government, as part of its building information modeling (BIM) initiative, has said repeatedly that it expects the big payoff of a digital model, estimated at more than 40% savings, to come during operations and maintenance, which typically represent 80% of the total cost of a facility. Companies such as BAM that do Design, Build, Finance and Maintain (DBFM) projects report significant benefits from full-lifecycle BIM + geospatial, but there is little if any quantitative evidence supporting this conjecture. I have asked people from Finland familiar with the very early BIM developments in that country whether there were studies of the benefits of BIM for operations and maintenance, but apparently the BIM focus there has been entirely on design and build.
Crossrail, with a budget of £14.8 billion, is the biggest engineering project in Europe. It involves 42 km of tunnels beneath one of the most densely populated parts of Europe. Its tunnels are wider and the platforms at its 40 stations longer than those of the Tube. Crossrail trains are expected to start running next year, and the full network should be open by 2019.
But the most interesting aspect of the Crossrail project is a 3D digital model with associated asset data that has not only been used during design and construction, but is intended to be used for operations and maintenance. Crossrail appears to be the first major project that may be able to provide support for the conjecture that the biggest benefits of BIM are for operations and maintenance.
The Crossrail model comprises spatial and non-spatial data with links between the two. The spatial data is made up of more than 250,000 3D BIM models as well as as-builts, together amounting to a few terabytes. As construction of each facility is completed, as-builts are collected by point-cloud survey using laser scanners. The point clouds captured in the survey are compared to the design, and divergences that need resolving are recorded for fixing. The detailed asset data and documentation add a further 5 terabytes. This represents one of the world's largest BIM models. A critical aspect of the spatial database is that all assets are geolocated, so that workers can query a particular location in London on a map and then navigate to the Crossrail assets there.
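The scan-versus-design comparison described above can be sketched as a nearest-neighbour deviation check. This is a toy illustration, not Crossrail's actual workflow: the point arrays, the brute-force search, and the 5 cm tolerance are all assumptions for the example.

```python
import numpy as np

def flag_divergences(scan_points, design_points, tolerance=0.05):
    """Flag scanned points lying farther than `tolerance` (metres)
    from the nearest design point (brute-force nearest neighbour)."""
    # Pairwise distances between every scan point and every design point
    d = np.linalg.norm(scan_points[:, None, :] - design_points[None, :, :], axis=2)
    nearest = d.min(axis=1)
    return scan_points[nearest > tolerance], nearest

# Illustrative data: points sampled on a design wall face (y = 0),
# plus a scan in which the second point diverges by 0.3 m
design = np.array([[x, 0.0, z] for x in np.linspace(0, 2, 21)
                               for z in np.linspace(0, 3, 31)])
scan = np.array([[1.0, 0.01, 1.5],
                 [1.0, 0.30, 1.5]])
outliers, dist = flag_divergences(scan, design, tolerance=0.05)
print(len(outliers))  # 1 point exceeds tolerance and would be recorded for fixing
```

A production pipeline would use a spatial index (e.g. a k-d tree) rather than a full distance matrix, but the divergence-recording logic is the same.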
The model is intended to become a crucial tool for monitoring, operating and maintaining Crossrail's systems once the railway is running. Sensors monitor various aspects of the railway's operation, and remote-controlled devices can change operating parameters from a central control room or from a handheld device. Managers can view this information within the 3D model and can zoom in on an area that needs attention. Crossrail is testing low-power wireless smart sensors called Utterberries that can monitor strain, temperature, humidity, acceleration, and other aspects of a facility. Utterberries weigh 15 grams and are smart: they have an on-board ARM processor and can operate for more than a year on one charge. One of the coolest capabilities of the digital infrastructure is an augmented-reality interface that allows workers to hold an iPad up to a wall or floor and see a view of the infrastructure (electricity, water, and communications) under the floor or behind the wall.
Harsha's blog post provides insightful speculation as to Google's reasons for terminating these products.
Dale Lutz points out that ESRI appears to be working with Google to take on Google's enterprise customers. According to ESRI's Google relationship page: "Google and Esri are working closely together to provide replacement software and training to all of Google's enterprise customers and partners that have implemented Google Earth Enterprise and Google Maps Engine technology. Esri will be providing the new 10.3 version of ArcGIS for Server and related client/app technology to all Google Earth Enterprise and Google Maps Engine customers and partners.”
INSPIRE-Geospatial World Forum 2015, a joint conference organized by the European Commission and Geospatial Media and Communications, has made available its full conference program. Almost 500 presentations are scheduled on topics including building information modelling (BIM), open data, big data analytics, open standards, linked data, cloud computing, crowdsourcing, Earth observation, indoor positioning, land information systems for smart cities, urban resilience and sustainability, health, agriculture and others. Some 2000 delegates are expected to attend from more than 80 countries. Top sponsors include Trimble, Topcon, ESRI, Digital Globe, Oracle and Bentley.
The theme of the conference is CONVERGENCE: Policies + Practices + Processes via PPP with a focus on improving coordination among policy-makers, technology providers and users. Including geospatial data and technology in construction, agriculture, health and other industry workflows is an enabler for more successful public–private partnerships (PPP) by facilitating more informed decision making among the stakeholders.
Michael Byrne, who was Geographic Information Officer (GIO) of the Federal Communications Commission (FCC) and led the development of the National Broadband Map (NBM), has just been awarded the Citizen Services Medal for creating a series of online maps (the National Broadband Map and derived maps) and geospatial visualizations that helped people make informed decisions about the country's communications systems. The map was developed in seven months from a standing start, launched on February 17, 2011, and received half a million hits in the first ten hours it was up.
The Service to America Medals are awarded every year to federal employees who achieve amazing results and epitomize public service.
The National Broadband Map allows users to use a web tool to search broadband availability across the United States and compare real download speeds to advertised broadband performance. All of the data is available for download. The map was part of an FCC stimulus (American Recovery and Reinvestment Act of 2009) initiative to improve broadband coverage/speed across the U.S.
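The advertised-versus-actual comparison the map supports boils down to a simple ratio per record. The provider records and field names below are hypothetical, purely to illustrate the comparison:

```python
# Hypothetical provider records: advertised vs measured download speed (Mbps)
records = [
    {"provider": "A", "advertised_mbps": 50.0, "measured_mbps": 42.0},
    {"provider": "B", "advertised_mbps": 25.0, "measured_mbps": 26.5},
]

def speed_ratio(rec):
    """Fraction of the advertised download speed actually delivered."""
    return rec["measured_mbps"] / rec["advertised_mbps"]

for rec in records:
    print(rec["provider"], round(speed_ratio(rec), 2))
# Provider A delivers 84% of its advertised speed; B slightly exceeds it
```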
The development of the National Broadband Map is a fascinating story. The business objectives of the map, which were mandated by Congress, included searchability, interactivity, responsive online access and a very tight development timeframe.
The data collection part of the story was a challenge because the FCC had to rely on the states to collect the data. The FCC ended up with data on 3500 broadband providers, submitted by 50 states, 2 territories and the District of Columbia. Michael's team had to take all this data in many different formats and load it into the NBM database. The end result was a database covering 1650 broadband providers with 25 million records showing where broadband Internet service is available, the technology used to provide the service, the maximum advertised speeds of the service, and the names of the broadband providers.
Michael and his team's biggest technical challenge was the tight development timeline. The second major challenge was ensuring fast user response time.
Seven months is an incredibly tight development timeline for any enterprise system. Early in the development cycle the technical team, comprising only eight people including Juan Marín Otero, the lead geospatial architect, decided that a RESTful architecture and open source geospatial software were the only way they were going to complete the project within the timeline: they recognized that they had to communicate directly with the developers if they were going to resolve issues in days rather than months.
After deciding on the open source software and open data stack, the system was developed in four and a half months. It survived a deluge of traffic in its first hours and days. The first day the map went live, the site received 158 million hits.
The open source software and open data stack used for the NBM includes
The objective of the remarkable four-year Stonehenge project was to create a detailed archaeological map of Stonehenge and its surroundings based on a synthesis of remote sensing and subterranean geophysical data. The result is a digital model of the Stonehenge landscape that ties surface features together with underground features and structures in a seamless map, and that shows that Stonehenge proper is only a small piece of a much larger, more complex ancient structure.
The Stonehenge Hidden Landscapes Project is led by the University of Birmingham in conjunction with the Ludwig Boltzmann Institute for Archaeological Prospection and Virtual Archaeology and is a collaboration with the University of Bradford, the University of St Andrews, and the 'ORBit' Research Group of the Department of Soil Management at the University of Ghent, Belgium.
The data collection technologies include surface remote sensing methods such as aerial photography, laser scanning and airborne imaging spectroscopy. The geophysical prospection methods used are ground-penetrating radar (GPR), electromagnetic induction, and magnetometry. Under suitable ground conditions GPR surveys provide detailed three-dimensional information about the depth, shape and location of archaeological structures at high spatial resolution. GPR can be used to detect stone structures, interfaces caused by pits and trenches, cavities, and differences in soil humidity. New multichannel GPR arrays permit considerably increased spatial coverage with greatly improved resolution and can generate detailed 3D images of subsurface structures. Magnetometry is most suitable for mapping archaeological structures that cause anomalies in the Earth's magnetic field, such as prehistoric pits, trenches, postholes, walls, fireplaces and kilns. Magnetometer surveys produce a 2D map without direct information about the depth of the buried structures. Electromagnetic induction measurements can be used to efficiently and non-invasively map the physical properties of the soil and buried objects.
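The complementary nature of the methods, magnetometry locating anomalies in plan and GPR adding depth, can be illustrated with a toy fusion step. The gridded arrays, the injected anomaly, and the thresholds here are all invented for the sketch:

```python
import numpy as np

# Hypothetical gridded surveys over the same 10 m x 10 m area (1 m cells)
rng = np.random.default_rng(0)
magnetometry = rng.normal(0.0, 1.0, (10, 10))   # nT anomaly map, 2D (no depth)
gpr = rng.normal(0.0, 1.0, (10, 10, 5))         # reflection strength, 5 depth slices

# Simulate a buried pit: strong magnetic anomaly plus a GPR reflector at slice 2
magnetometry[4, 6] += 10.0
gpr[4, 6, 2] += 10.0

# Flag cells anomalous in magnetometry, then use GPR to estimate depth there
mag_hits = np.argwhere(np.abs(magnetometry) > 5.0)
for y, x in mag_hits:
    depth_slice = int(np.argmax(np.abs(gpr[y, x])))
    print(f"anomaly at ({y}, {x}), strongest GPR return in depth slice {depth_slice}")
```

In the real project this integration happens inside the GIS-based archaeological information system described below, against georeferenced survey grids rather than toy arrays.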
The main platform for integrating the remotely sensed and the geophysical measurements is a GIS-based archaeological information system, with additional tools for dynamic visualization and spatial analysis. The huge amount of data generated by the remote sensing and geophysical prospection measurements requires appropriate data processing, analytical and 3D visualization tools.
The remote sensing techniques and underground geophysical surveys have discovered hundreds of new features, which now form part of the most detailed archaeological digital map of the Stonehenge landscape ever produced. The results of the survey include 17 previously unknown ritual monuments dating to the same period as Stonehenge. A massive timber building, probably used for the ritual inhumation of the dead and finally covered by an earthen mound, has been mapped. The immense Durrington Walls 'super henge', situated a short distance from Stonehenge, was also mapped and found to have a circumference of more than 1.5 kilometers. The results will be featured in a new BBC Two series titled Operation Stonehenge: What Lies Beneath.
According to Gigaom, in 2012 a Florida-based patent troll sued both Apple and Google because their products included a view of street-level images next to or inserted in a map. The troll had filed patents on this type of combined imagery (see image for an example) in 2003. Apple and Google joined forces to challenge the troll based on prior art, using a relatively new Patent Office proceeding known as inter partes review: a trial conducted at the Patent Trial and Appeal Board to review the patentability of a claim in a patent based only on prior art consisting of patents or printed publications. It came into effect in September 2012 under the America Invents Act and allows third parties to challenge patents before administrative patent judges. Last week the appeals board of the U.S. Patent and Trademark Office ruled that all 27 claims contained in the troll's patent are invalid.
Markets and Markets has released a report that analyzes the LiDAR (Light Detection and Ranging) market by product type (airborne, mobile, terrestrial and short range) and application (government, civil engineering, military/defence/aerospace, corridor mapping, topographic surveying, and volumetric mapping) and provides global forecasts for the period 2013–2018.
LiDAR has been traditionally applied to produce 3D digital elevation and terrain models typically for forestry and civil engineering applications. But corridor mapping (for example, for transmission lines), which uses mobile LiDAR systems as well as airborne LiDAR, is projected to be the application with the highest annual growth rate from 2013 to 2018.
The report estimates that the global LiDAR market was worth $218.9 million in 2012. Civil engineering was the most important application. Most of the revenue was derived from airborne LiDAR systems, partly because they are relatively expensive. But this is changing. The report forecasts that the global LiDAR market will grow by more than 15% annually over the next five years, reaching $551.3 million in 2018. It predicts that airborne LiDAR systems will have the lowest compound annual growth rate among the four types of LiDAR systems, and projects that the revenue associated with terrestrial and mobile LiDAR systems will surpass that of airborne systems by 2018.
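The growth rate implied by the report's own endpoints can be checked directly. Taking 2012 as the base year gives six compounding years to 2018:

```python
# Market size endpoints from the report (USD millions)
v2012, v2018 = 218.9, 551.3
years = 2018 - 2012

# Compound annual growth rate over the six-year span
cagr = (v2018 / v2012) ** (1 / years) - 1
print(f"CAGR 2012-2018: {cagr:.1%}")  # roughly 16.6%, consistent with "more than 15%"
```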
The report also forecasts that low-cost LiDAR systems could revolutionize the surveying industry in the next five years. I blogged recently about a lightweight UAV-mounted LiDAR platform weighing less than 10 kg which combines UAV, LiDAR and GNSS technology and is designed as a micro-mapping solution.
In December 2013 the Energy Information Administration (EIA) began reporting a detailed accounting of generator additions and retirements, as well as their effects on total generation capacity by fuel and technology type, on a state-by-state basis. Power plant owners and operators report plans to modify or retire existing generators, or build new generators, every year as part of EIA's Annual Electric Generator Report. All generators at utility, industrial and commercial plants with at least one megawatt (MW) of capacity are included. Smaller installations such as residential rooftop solar PV are not broken out in these reports.
The state-level detail in the tables can be represented using maps like the one included here. Maps make it easier to see how different fuel types are reflected in different regions, for example, new solar generators in the Southwest, natural gas generators in Texas and along the Atlantic Coast, and wind generators in the Plains region that are expected to come online between December 2013 and November 2014.
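The tabulation behind such a map reduces to a group-by of planned additions over state and fuel type. The generator records below are hypothetical, standing in for rows of the EIA tables:

```python
from collections import defaultdict

# Hypothetical planned generator additions: (state, fuel type, capacity in MW)
additions = [
    ("TX", "natural gas", 600.0),
    ("AZ", "solar", 150.0),
    ("TX", "wind", 200.0),
    ("AZ", "solar", 90.0),
]

# Total planned capacity by (state, fuel), as in the EIA state-level tables;
# each total would then drive a symbol on the map
totals = defaultdict(float)
for state, fuel, mw in additions:
    totals[(state, fuel)] += mw

for key in sorted(totals):
    print(key, totals[key])
```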
According to Jack Dangermond, who led the plenary sessions on the first day, this year 12,000 people from 130 countries were in San Diego for the 33rd Esri International User Conference.
I gravitated primarily to the utility sessions. According to Bill Meehan, who is responsible for ESRI's utility business, there were 750 utility folks at this year's UC.
GIS is everywhere
GIS is being used in diverse areas including environmental monitoring, climate change, permafrost melting, sea level rise, agriculture, water resources, aquaculture, pollution remediation, energy resources management, land information systems, urban design, utility network management, wasteshed modeling, utility pole inspection, sewer rehabilitation, port management, indoor GIS, insurance risk management, health-related mapping, law enforcement, mapping where space debris might fall, flood risk mapping, storm tracking and damage assessment, geodesign, crowdsourcing for collecting a variety of data, interoperability, citizen engagement, government transparency, cartography, geology and mining, automatic generalization of different scale maps, story maps, portals for infrastructure, and open data, to mention just a few of the things that were singled out in the plenary session.
Some of the areas that were singled out for special recognition were
The Special Achievements in GIS award went to the Dartmouth Atlas for showing geographically how the outcomes and cost of medical procedures vary depending on location across the U.S.
The Hong Kong Lands Department received the Enterprise GIS award for their land management system, which they started in the 1990s and which is now in its second generation.
The President's Award went to Direct Relief, which distributes equipment to people in need around the globe with 99% efficiency, according to Forbes.
The theme of this UC was GIS transforming our world: not only changing the physical world as we know it, but also changing our perception of the world, and how GIS has helped and will fundamentally help us collectively build a better future. A classic example that has impacted everyone on the planet is GPS, or more generally GNSS. As a number of speakers including Dangermond said, we are never lost any more, and that is a fundamental change in how we perceive the world.
Some of the major themes that emerged at this UC are common to many software and service providers in the IT sector, but some are new or unique to ESRI.
We are facing serious global challenges, in particular climate change, which both at the Geodesign Summit and at this UC appears to be a personal challenge for Dangermond, who sees GIS as contributing in a major way to creating a more sustainable future. He sees GIS helping to change how we think and act, and how we do basic things such as design.
GIS across the organization
A theme that was pervasive throughout the conference was the need to make GIS pervasive across the organization, because 90% of the data that organizations collect and manage has location. This was true in the utility sessions I attended, where the thrust is to expand the use of GIS outside the traditional area of records management into other departments.
Vertical industry "templates"
In the utility and telecommunications sectors there is a major effort underway to create preconfigured "templates" for specific vertical industries. In the utility sector one was announced yesterday for the electric power industry, and others are underway for gas and telecommunications. These templates are collections of data and tools for vertical industries and are available as open source on GitHub.
The web was pervasive. It was hard to find a session where the theme of desktop GIS being transformed into web GIS did not come up. This doesn't mean that the desktop is going to go away in the near future, but sharing data, applications, and apps is being recognized as a common goal, and web GIS makes this very easy to do. This includes web portals for accessing enterprise data over the web.
A lot of traditional desktop GIS capabilities are now being made available in the cloud, so that all you need is an iOS or Android device or a web browser. This is also available to large enterprises that need to keep their data on their own servers for security reasons. Cindy Salas from CenterPoint Energy demonstrated ESRI's cloud capability running "on premise", completely within CenterPoint's firewall.
Being able to integrate location and the spatial dimension into enterprise applications such as ERP (SAP), business intelligence (IBM Cognos), Salesforce, MS Dynamics, and Excel spreadsheets is a new focus for ESRI. They have already created plugins for several well-known applications.
In the past, when you talked about mobile in the context of utilities you were normally dealing with Panasonic Toughbooks, which could cost up to $6,000 each. At this conference just about every application was able to run on a variety of low-cost mobile devices, including iOS and Android devices. One of the sessions I attended was about the National Broadband rollout in Australia, where the 500 field folks doing 300,000 pit inspections were all equipped with iPads, which you might have thought were too fragile for pretty rough field work. The project has been underway for something like 18 months, and to date only two of the iPads have been damaged.
Spatial data volumes, whether from sensors at utilities, from LiDAR scanning of transmission lines, or from earth observation satellites, are now reckoned in terabytes. There were sessions on spatially enabling Hadoop, SAP HANA and other big data management systems for managing these data volumes.
In the utility track, managing real-time data from sensors on transformers, smart meters, phasors, and other devices and enabling near real-time decision making was a common theme. A general event processing capability that can be triggered by rules has been added to the quiver to make this possible for any data stream.
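A rule-triggered event processor of this kind can be sketched minimally. This is not ESRI's actual product; the sensor stream, thresholds, and alert messages below are all hypothetical, just to show the pattern of rules evaluated against a data stream:

```python
# Minimal rule-based event processing over a sensor stream (hypothetical data)
rules = [
    # (sensor type, predicate on the reading, alert message)
    ("transformer_temp", lambda c: c > 95.0, "transformer running hot"),
    ("smart_meter_voltage", lambda v: v < 110.0, "undervoltage at meter"),
]

def process(stream):
    """Yield (device, message) for every reading that fires a rule."""
    for device, sensor, value in stream:
        for rule_sensor, predicate, message in rules:
            if sensor == rule_sensor and predicate(value):
                yield device, message

stream = [
    ("XFMR-12", "transformer_temp", 101.3),   # hot transformer: fires a rule
    ("MTR-889", "smart_meter_voltage", 118.0),  # normal voltage: no alert
]
alerts = list(process(stream))
print(alerts)  # [('XFMR-12', 'transformer running hot')]
```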
In the area of imagery, Digital Globe imagery can be accessed within 4-5 hours of acquisition.
Big data and real-time mean that we need to be able to understand what these huge volumes of data are telling us and then enable decision-making, automated or human-mediated, as rapidly as possible. If a transformer is running hot, we can't wait a week until the next service check to reconfigure the network, because by then we may have reduced its projected lifetime by 10%. There were "location analytics" sessions in most of the vertical industry tracks, including utilities.
Software companies like ESRI and Hexagon are now in the content business. And with the data being made available in the cloud, you can access it through web GIS or desktop GIS. The data includes traditional topographic maps, digital terrain models, imagery from earth observation satellites from Digital Globe and others, some of it near real-time (for example, from Digital Globe within 4-5 hours of acquisition), demographics, and more.
Dangermond's term for this is the "living atlas", because this body of content is intended to be dynamic, real-time and comprehensive. The goal seems to be all the data you would need to manage the planet.
Some of the content explicitly mentioned includes
30 cm coverage of the US
60 cm coverage of Western Europe
imagery for the Middle East and Asia, in total covering 2/3 of the world
demographics, at the zip code, county, state, and national levels
scientific maps including environmental, energy, infrastructure, and terrain models
This data is accessible to web and desktop applications and makes it very easy to do standard spatial analysis such as multi-criteria suitability or site selection mapping. Users can also publish their own data to the cloud.
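Multi-criteria suitability analysis of the kind mentioned reduces to a weighted raster overlay. The criterion grids, scores, and weights below are invented for a toy 3x3 study area; a real analysis would run over the cloud-hosted rasters just described:

```python
import numpy as np

# Hypothetical criterion rasters scored 0-1 over the same 3x3 study area
slope_score = np.array([[1.0, 0.8, 0.2],
                        [0.9, 0.7, 0.1],
                        [0.5, 0.4, 0.0]])
road_access = np.array([[0.2, 0.6, 1.0],
                        [0.3, 0.7, 0.9],
                        [0.1, 0.5, 0.8]])

# Weighted overlay: weights reflect the relative importance of each criterion
weights = {"slope": 0.6, "roads": 0.4}
suitability = weights["slope"] * slope_score + weights["roads"] * road_access

# The best site is simply the cell with the highest combined score
best = np.unravel_index(np.argmax(suitability), suitability.shape)
print(best, round(float(suitability[best]), 2))  # (0, 1) 0.72
```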
This has always been a strong focus area for ESRI, and from Dangermond's perspective, given the challenges the world is facing, GIS professionals will be even more critical in the future. There is a focus on advancing spatial literacy to a broader audience than in the past.
Laser scanning is a major source of important data for the construction industry (right of way determination and construction progress monitoring), utilities (transmission vegetation management), and other industries. ESRI has made this a focus area and there were a number of sessions devoted to managing and analyzing point cloud data.
Dangermond specifically singled out 3D as a major capability area for ESRI. There are specialized 3D products, but 3D is also being built into ESRI main stream GIS and spatial analytical products.
3D city models
At the plenary there was a demonstration of how to quickly build a 3D city model from 2D data, including extruding buildings from a 3D building footprint and height, adding textures depending on building types, and then tools for analysis such as zoning and shadow analysis and visualization including flythroughs. These models are sharable on the cloud as web scenes requiring only a web browser (with no plugins) or iOS mobile devices.
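The footprint-extrusion step demonstrated at the plenary can be sketched in plain Python. This is a minimal illustration of the geometry, not ESRI's implementation: it assumes a footprint polygon plus a height attribute and builds a bottom ring, a top ring, and one quad per wall.

```python
def extrude_footprint(footprint, height):
    """Turn a 2D building footprint into a simple 3D solid:
    bottom ring at z=0, top ring at z=height, and one quad per wall."""
    bottom = [(x, y, 0.0) for x, y in footprint]
    top = [(x, y, height) for x, y in footprint]
    n = len(footprint)
    # Each wall joins consecutive footprint vertices at both elevations
    walls = [(bottom[i], bottom[(i + 1) % n], top[(i + 1) % n], top[i])
             for i in range(n)]
    return bottom, top, walls

# A 10 m x 6 m rectangular footprint extruded to a 25 m building
footprint = [(0, 0), (10, 0), (10, 6), (0, 6)]
bottom, top, walls = extrude_footprint(footprint, 25.0)
print(len(walls), top[0])  # 4 wall quads; first top vertex at (0, 0, 25.0)
```

Texturing by building type and analyses such as shadow casting would then operate on the resulting faces.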
At the UC ESRI is demonstrating a prototype of a product designed specifically for geodesign.
Dangermond specifically mentioned the newest national open data initiative in Peru.
There are sessions specifically on editing and manipulating OpenStreetMap data.