Yesterday at Distributech 2018 in San Antonio I listened to a talk by a consulting company contracted by an electric power utility to clean up GIS data originally sourced from CAD drawings. It reminded me once again that this is a challenge that telecommunications and cable companies and electric and water utilities have been wrestling with since at least the early 1990s, and for some even earlier. Telecom, cable, and leading electric power utilities resolved this problem at least a decade ago, but for some electric power utilities, and even more water utilities, it remains a problem that will hold them back just when they should be accelerating their adoption of digital technologies.
I began working in the utility and telecom sector in 1993, specifically developing software for records management (called network documentation outside of North America), and at that time the magnitude of the AEC+geospatial cultural mismatch was just beginning to be appreciated by utilities and telecoms. In utilities and telecoms, planners tend to use GIS tools, engineers and designers use CAD tools, construction folks in the field use paper CAD drawings, and asset managers use GIS integrated with their FM tools. Fashioning AEC and geospatial data into an efficient data flow from planning through design and construction to operations and maintenance was a challenge then and remains one for utilities today. The Between The Poles blog is over ten years old, and one of its persistent themes from the very beginning in 2006 has been the challenge of integrating CAD and GIS data in a common workflow (some examples: 2006, 2007, 2008).
A workflow that is found at every utility and telecom involves the design-build-maintain information flow. The workflow begins with an engineer and drafter producing a design, typically a CAD drawing, which is printed for the construction contractor to use in building a facility. The contractor uses the drawing in the field and then returns it as an "as-built" to the utility's records management group, who enter it, traditionally by manually redigitizing it from the paper drawing, into a GIS that is intended to contain the digital twin of the network. The data in the GIS is used for network maintenance, outage management and other functions. The reason for the manual redigitization step is the "errors" that CAD drawings inevitably contain unless drafters are forced to adhere to "geospatial rules". This entire process is slow and error-prone and backlogs of as-builts waiting to be entered into the GIS stretch into months and even years. The result is that utility GIS data is more often than not incomplete, out-of-date, and unreliable. For example, at the beginning of the broadband revolution, telecoms needed to be able to tell customers whether they were close enough to a terminal and fiber to be eligible for ADSL. In one of the largest telecoms in the U.S. the GIS records were so incomplete and unreliable that survey teams had to be sent out at considerable cost to resurvey the location of their terminals and fiber cables.
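To make the "geospatial rules" idea concrete, here is a minimal sketch, using the open source shapely library, of the kind of check a records management group might apply to a drafted CAD cable before loading it into the GIS: the geometry must be valid and its endpoints must snap to known structure locations within a tolerance. The sample coordinates, tolerance, and function name are invented for illustration, not taken from any particular utility's rule set.

```python
# Illustrative sketch of simple "geospatial rules" checks on a drafted CAD cable:
# valid, non-self-intersecting geometry, and endpoints snapped to structures
# already recorded in the GIS. Sample data and tolerance are hypothetical.
from shapely.geometry import LineString, Point

SNAP_TOLERANCE_M = 0.05  # assumed 5 cm connectivity tolerance

# Structure locations already in the GIS (e.g. poles, handholes); hypothetical coordinates
structures = [Point(1000.0, 2000.0), Point(1050.0, 2000.0), Point(1100.0, 2000.0)]

# A "cable" as drafted in CAD; its second endpoint misses the structure by 30 cm
cable = LineString([(1000.0, 2000.0), (1050.3, 2000.0)])

def check_cable(cable: LineString, structures: list[Point]) -> list[str]:
    """Return a list of rule violations for one drafted cable segment."""
    problems = []
    if not cable.is_valid or not cable.is_simple:
        problems.append("geometry is invalid or self-intersecting")
    for end in (Point(cable.coords[0]), Point(cable.coords[-1])):
        if min(end.distance(s) for s in structures) > SNAP_TOLERANCE_M:
            problems.append(f"endpoint {end.coords[0]} is not snapped to a structure")
    return problems

print(check_cable(cable, structures))
# -> ['endpoint (1050.3, 2000.0) is not snapped to a structure']
```

When checks like these are not enforced at drafting time, the errors are only caught, if at all, when the as-built is manually redigitized, which is exactly the slow step in the workflow described above.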
Because there was no standards-based solution to the CAD+GIS problem, every utility and telecom has had to find its own. The problem has persisted since the 1990s: as I said, the presentation I heard yesterday wasn't much different from presentations I and others gave on the same topic a quarter of a century ago.
Over the years poor CAD/GIS interoperability has cost the utility and telecom sectors considerable amounts of money. Large telecoms have spent hundreds of millions of dollars trying to clean up their GIS data and streamline their CAD+GIS workflows. A rough back-of-the-envelope calculation suggests this problem has cost, directly and indirectly, trillions of dollars over the past 20 years. In addition, the unreliable and out-of-date network infrastructure data that results from this problem has safety implications.
The juncture between two distinct sectors, each with its own technologies and business models, AEC (Autodesk, Bentley, and others) and geospatial (ESRI, Hexagon, and others), is analogous to the clash of tectonic plates and the incessant seismic events that accompany it. In the mid-1990s pressure from customers motivated Autodesk and ESRI, 800 lb gorillas in their respective areas of CAD and GIS, to attempt to collaborate on the CAD+GIS problem. The agreement between Jack Dangermond and Carol Bartz at the time resulted in an unsuccessful product called ArcCAD, which satisfied neither CAD nor GIS users.
Averting a repeat of the tectonic clash: BIM+geospatial interoperability
The prospect that, by analogy with the CAD+GIS problem, we could still be individually finding solutions to BIM/geospatial integration 20 years from now is discouraging. While there are parallels between CAD/GIS and BIM/geospatial, there are grounds for optimism that this time around we may find ways to address the problem in a general way. Firstly, in the mid-1990s there were virtually no geospatial or 3D AEC data standards. The Open Geospatial Consortium (OGC) and buildingSMART were both only founded in 1994, and it took quite a while for the OGC geospatial standards and buildingSMART's IFC standard to broadly penetrate industry and government. Nowadays there are many widely adopted geospatial standards: Simple Features for SQL, WMS, WFS, GML, KML, InfraGML, and CityGML, to name just a few from the OGC. buildingSMART's IFC standard is supported by all major BIM design tools including the market leader Revit.
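As a small illustration of how routine these standards have become, the sketch below requests features from an OGC WFS 2.0 endpoint using nothing more than an HTTP GET with standard query parameters. The endpoint URL, feature type name, and bounding box are hypothetical placeholders, not references to any real server.

```python
# Minimal sketch of an OGC WFS 2.0 GetFeature request against a (hypothetical)
# utility GIS server, asking for features in a bounding box as GeoJSON.
import requests

WFS_URL = "https://gis.example-utility.com/wfs"  # hypothetical endpoint

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "network:fiber_cable",             # hypothetical feature type
    "bbox": "-98.55,29.40,-98.45,29.50,EPSG:4326",  # area of interest
    "outputFormat": "application/json",             # GeoJSON, if the server supports it
    "count": 100,                                   # limit the response size
}

response = requests.get(WFS_URL, params=params, timeout=30)
response.raise_for_status()
features = response.json()["features"]
print(f"Retrieved {len(features)} fiber cable features")
```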
Perhaps most importantly, there has been significant progress in developing common conceptual models for geospatial and AEC views of city infrastructure. The major breakthrough in bringing the architectural and geospatial views onto a common footing is the OGC LandInfra Conceptual Model, developed by the OGC in cooperation with buildingSMART International and approved as an OGC standard in August 2016. The LandInfra conceptual model was developed by contributors including Bentley Systems, Leica Geosystems, Trimble, the Australian Government Department of Communications, Autodesk, Vianova Systems AS, and buildingSMART International, and it provides a unifying basis for land and civil engineering standards including the OGC's InfraGML and buildingSMART International's IFC for infrastructure standards. InfraGML is the OGC's application schema supporting land development and civil engineering infrastructure facilities, and it supports a subset of the existing LandXML standard. buildingSMART International's IFC-Alignment project uses this common conceptual model of alignments for roads, railways, tunnels and bridges. The objective of the IFC-Alignment project is to enable the exchange of alignment information through the full infrastructure lifecycle, from planning through design and construction to asset management.
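To give a feel for what a shared alignment concept means in practice, here is a deliberately simplified sketch of a horizontal alignment as a sequence of elements with stationing. This is not the actual LandInfra or IFC-Alignment schema; the class and field names are invented purely to illustrate the idea that design (BIM) and geospatial tools can exchange the same stationed alignment.

```python
# Simplified, illustrative model of a horizontal alignment with stationing,
# in the spirit of the shared alignment concept in LandInfra / IFC-Alignment.
# NOT the actual standard schema; names and data are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class AlignmentElement:
    kind: str        # e.g. "line", "circular_arc", "clothoid"
    length: float    # element length in metres

@dataclass
class Alignment:
    name: str
    start_station: float
    elements: List[AlignmentElement]

    def station_at_end(self) -> float:
        """Station value at the end of the alignment."""
        return self.start_station + sum(e.length for e in self.elements)

road = Alignment(
    name="Main St realignment",
    start_station=0.0,
    elements=[
        AlignmentElement("line", 120.0),
        AlignmentElement("circular_arc", 85.5),
        AlignmentElement("line", 240.0),
    ],
)
print(f"{road.name} ends at station {road.station_at_end():.1f} m")
```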
Secondly, in the 1990s and early 2000s there were few open source geospatial projects. The Open Source Geospatial Foundation (OSGeo) was only founded in 2006, but since then open source geospatial projects have developed rapidly. There is enough open source geospatial code now that at the GEO|Design+BIM conference in Amsterdam, Sanghee Shin described a web-based geo-platform integrating massive, complex BIM models using open source geospatial libraries and tools including PostGIS, GeoServer, and the Cesium and NASA WorldWind virtual globes. The importance of this project is that it shows it is possible, using the available open source geospatial APIs and libraries together with some genuinely innovative development, to create an open, non-proprietary, 3D geospatial platform for integrating geospatial and BIM data. Given the critical importance of addressing the cultural and technical divide between the AEC and geospatial worlds, many clearly believe that a viable open source alternative will be essential to developing innovative solutions for interfacing the two worlds.
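A platform built on PostGIS and GeoServer leans on exactly the kind of spatial query sketched below, which finds building footprints within 50 m of a network asset. The connection string, table names, column names, and asset identifier are hypothetical placeholders, included only to show how little code such a query requires.

```python
# Minimal sketch of a PostGIS proximity query a BIM+GIS integration platform
# might run: building footprints within 50 m of a given network asset.
# Connection details, tables, columns, and the asset id are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=assets user=gis password=secret host=localhost")

sql = """
    SELECT b.building_id, ST_AsGeoJSON(b.footprint) AS footprint_geojson
    FROM building_footprints AS b
    JOIN network_assets AS a ON a.asset_id = %s
    WHERE ST_DWithin(b.footprint::geography, a.geom::geography, 50)
"""

with conn, conn.cursor() as cur:
    cur.execute(sql, ("XFMR-1042",))  # hypothetical transformer asset id
    for building_id, footprint in cur.fetchall():
        print(building_id, footprint[:60], "...")

conn.close()
```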
Finally, major players in the AEC and geospatial industries, Autodesk and ESRI, have again agreed to collaborate, and this time around they are probably more aware of the technical issues hindering BIM+geospatial integration. Both have adopted open standards when it made business sense, often under pressure from government clients, and both have also adopted open source technology where it suited them. The combination of major players supporting the effort, standards organizations like the OGC and buildingSMART already working to address the problem, and a broad range of open source code backed by a vibrant developer community may be enough to address the technical problems. If building and infrastructure owners, facilities managers including government, and regulators become aware of the risk of not solving the problem, a solution may be found, and we will not be having the same BIM+geospatial integration discussion a quarter of a century from now.