In jurisdictions worldwide, the low quality of underground location data has been identified as a leading cause of underground utility damage. Existing underground utility data, most of which is stored in network owners' records, is often inaccurate, out of date, or simply missing. In North America, a survey of 402 locate professionals found nearly unanimous agreement that better facility maps are a top strategy for improving the efficiency and quality of locates. Singapore's Digital Underground project is unequivocal about the economic and social impact of low-quality underground location data, stating that "virtually all stakeholders are aware that much of the available information is unreliable and that this has repeatedly led to losses of time, money and opportunities."
Challenges facing the utility mapping industry
The root cause of the low quality of underground utility data is how utility mapping is structured as an industry. The industry's primary activities are collecting data and keeping and sharing that data with other parties.
Data collection
The way we collect data is slowly changing, but we still rely predominantly on traditional survey and locate methods. The survey methodology in use hasn't changed in a hundred years: it is a manual, analog, time-and-materials process. Surveys are slow and expensive to execute, require specialized equipment and skill sets, and raise significant safety concerns.
Locating underground utilities is one of the biggest causes of delays in construction schedules; service disruptions are a common occurrence, safety is always a major concern, and locate operations are slow to execute. Accuracy and timeliness are major issues. Locate operations require specialized equipment and skill sets, which restricts locating to a small number of industry insiders. A major shortfall is that underground utility location data is rarely available during the crucial planning and design stages of civil engineering projects.
Democratization of utility mapping
The utility mapping industry needs some fundamental changes. We need to move away from the monopoly that exists right now, where just a handful of companies are able to go out and collect underground utility data. We need to democratize data collection so that many more people can collect this data, targeting a crowdsourced approach that enables multiple creators or collectors to provide it. This also distributes responsibility for the data: the more distributed the responsibility, the lower the risk and the lower the cost of failures.
We also need to change the skill level required. With current technology, we should be able to automate much of routine locating and thus enable many more people to do it. Using inexpensive technology will lower the cost of collecting and aggregating locate data. As data becomes cheaper to capture, data retention will become less expensive, which will encourage people to retain the location data they acquire rather than simply painting or marking the ground without capturing the data digitally.
Democratization is the real future of 3D infrastructure mapping. Right now, data is still very siloed. We'll always have the traditional mapping companies, survey companies, line locating companies, and asset owners going out and collecting their own data. But crowdsourcing data from routine day-to-day locating and mapping, infrastructure projects, and other sources, then aggregating that data and distributing access to it, is how utility mapping will really come to the forefront. It should be easy to upload data into a central register where it can be made accessible to multiple stakeholders. Compliance with Open Geospatial Consortium (OGC) standards provides a basis for full interoperability, enabling open access to every single conduit, cable, pipe, or other piece of infrastructure in real time from your phone, tablet, laptop, or other applications.
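As a rough illustration of what OGC-standard access could look like, the sketch below builds a standard OGC API - Features items request with a bounding-box filter. The endpoint URL and collection name are hypothetical assumptions, not a real service; only the request shape (`/collections/{id}/items` with `bbox` and `limit` parameters) comes from the OGC API - Features core specification.

```python
from urllib.parse import urlencode

# Hypothetical central-register endpoint -- illustrative assumption only.
REGISTER_URL = "https://register.example.org/ogcapi"

def features_request(collection, bbox, limit=100):
    """Build an OGC API - Features items request with a bounding-box filter.

    `bbox` is (min_lon, min_lat, max_lon, max_lat) in WGS84, the default
    CRS in the OGC API - Features core specification.
    """
    params = urlencode({
        "bbox": ",".join(str(v) for v in bbox),
        "limit": limit,
        "f": "json",  # ask for GeoJSON output
    })
    return f"{REGISTER_URL}/collections/{collection}/items?{params}"

# Request all records in a hypothetical "water-mains" collection
# within a small area of interest.
url = features_request("water-mains", (-114.08, 51.04, -114.06, 51.06))
print(url)
```

Because the request is just a standard URL, any OGC-compliant client, from a phone app to a desktop GIS, could issue the same query against the same register.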
Infrastructure digital twin
We think the best paradigm for producing and capturing asset data is collecting it into an infrastructure digital twin. Digital twins have made a lot of progress in a variety of industries; most importantly for the utility mapping industry, creating a digital twin is much easier than it was even just a few years ago. We think the time is right for an infrastructure digital twin. It can be just at the asset level for one particular company, but we believe it should be at a regional level. Cities, municipalities, and regional governments around the world are beginning to develop comprehensive digital twins of all of their physical infrastructure: buildings, roads, trees, power lines, and subsurface utilities, and they're pouring resources into it.
We think Canada and the United States should start doing the same thing. For us in the utility industry, data collection costs are lower than they have ever been. There's clearly interest among stakeholders and governments in tackling infrastructure and utility mapping problems, particularly to lower the number and cost of strikes, injuries, and loss of life. We now know a lot more about how to derive data from other data sets, using different types of analysis and artificial intelligence to piece disparate data sets together. Using AI and machine learning in the cloud is much less expensive than just a few years ago. And one of the things that now makes infrastructure digital twins readily accessible to the utility industry is that storing huge volumes of data in the cloud has become very inexpensive.
What is an infrastructure digital twin? First of all, a digital twin needs to result in effective action. It's not enough just to say we have a digital twin philosophy; the purpose of having a digital twin is to drive effective action that improves your business. It needs to use real-time and historical data, which is paramount for us in the utility mapping industry, and it needs to be motivated by outcomes. For the utility industry, that outcome is to reduce and even eliminate utility strikes during construction and the associated risks of injuries and fatalities.
The infrastructure digital twin has to be stored and managed in a relational database so it can be queried, analyzed, improved over time, and related to other data sets. Processes for collecting and maintaining data must be simple, so that underground utility mapping is no longer restricted to a specialized group of people. If we truly want to democratize and compile data from disparate data sets, we need very simple processes to get data into the database, analyzed, and maintained.
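A minimal sketch of what "queryable through time" means in practice is shown below, using an in-memory SQLite table. The table and column names are illustrative assumptions; a production register would use a spatial database (e.g. PostGIS) with true geometry types rather than WKT text.

```python
import sqlite3

# Illustrative asset-register schema -- names and fields are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE utility_asset (
        id          INTEGER PRIMARY KEY,
        asset_type  TEXT NOT NULL,   -- e.g. 'gas main', 'fibre duct'
        owner       TEXT NOT NULL,
        geometry    TEXT NOT NULL,   -- WKT, 3D linestring (x y z vertices)
        accuracy_m  REAL,            -- estimated positional accuracy
        surveyed_on TEXT             -- ISO 8601 date of capture
    )
""")
conn.execute(
    "INSERT INTO utility_asset "
    "(asset_type, owner, geometry, accuracy_m, surveyed_on) "
    "VALUES (?, ?, ?, ?, ?)",
    ("gas main", "ExampleCo",
     "LINESTRING Z (0 0 -1.2, 10 0 -1.3)", 0.05, "2022-06-01"),
)

# Because the data is relational, maintenance questions become queries:
# which records are older than a cutoff date, or coarser than a tolerance?
stale = conn.execute(
    "SELECT id FROM utility_asset "
    "WHERE surveyed_on < '2023-01-01' OR accuracy_m > 0.5"
).fetchall()
print(len(stale))  # the single 2022 record is flagged for review
```

The point of the sketch is the last query: once records carry capture dates and accuracy estimates, identifying data that needs re-verification is a one-line question rather than a field campaign.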
An infrastructure digital twin should be captured and maintained in 3D, even though most buried utilities lie within just a couple of meters of the ground surface. The 3D component is vital because much of the ancillary data is 3D, such as overhead power lines that come down the side of a pole into conduits below the ground.
Imagery taken looking straight down at the ground, like traditional satellite imagery, can be visualized on a map and incorporated directly into a GIS database. But other types of imagery, such as oblique, 360-degree, street-side, and inspection imagery, need to be treated differently. Oriented imagery has a range of possible applications. For example, a construction company might use it to review asset inventory photos taken by field workers at a construction site. A telecom company or electric utility could use it to examine close-range inspection images of electrical and telecom towers from many different angles. Oriented imagery applications let you see exactly where the image is on the map as you pan or zoom in the image. There are spatial navigation tools designed to work specifically with oriented imagery, and you can query oriented images using spatial and temporal filters.
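The spatial and temporal filters mentioned above can be sketched very simply. In the toy example below, each image record carries a camera position, heading, and capture date; the field names (`lon`, `lat`, `heading_deg`, `taken`) are assumptions for illustration, not any particular product's schema.

```python
from datetime import date

# Toy oriented-imagery records -- field names are illustrative assumptions.
images = [
    {"id": "img-1", "lon": -114.07, "lat": 51.05, "heading_deg": 90.0,
     "taken": date(2023, 5, 2)},
    {"id": "img-2", "lon": -114.50, "lat": 51.20, "heading_deg": 180.0,
     "taken": date(2023, 5, 3)},
    {"id": "img-3", "lon": -114.07, "lat": 51.05, "heading_deg": 270.0,
     "taken": date(2021, 1, 9)},
]

def query_images(records, bbox, start, end):
    """Return images whose camera position falls inside bbox and whose
    capture date falls in [start, end] -- the two filters described above."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return [r for r in records
            if min_lon <= r["lon"] <= max_lon
            and min_lat <= r["lat"] <= max_lat
            and start <= r["taken"] <= end]

hits = query_images(images, (-114.1, 51.0, -114.0, 51.1),
                    date(2023, 1, 1), date(2023, 12, 31))
print([r["id"] for r in hits])  # -> ['img-1']: in the box AND in the window
```

A real oriented-imagery system would additionally use the heading and camera model to project the image footprint onto the map; the filtering logic, though, reduces to exactly this kind of spatial-plus-temporal predicate.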
Finally, it is very important to have standardized formats, including storage formats and cloud formats. Standardization improves our ability to inexpensively use machine learning for associative analyses of disparate data sets, producing larger, more conflated data sets.
Automated feature extraction
Maintaining accurate, up-to-date, comprehensive data about assets, above ground and below ground, has been a challenge for utilities for decades. Now new technologies including machine learning are being applied to automate real-time maintenance of utility network information and provide a foundation for infrastructure digital twins.
Automated feature extraction has advanced dramatically in the last few years. In one example, high-density LiDAR and ultra-high-resolution imagery were used on a small-diameter gas distribution install in an open trench, showing the ease with which LiDAR produces a 3D representation of all of the local utility infrastructure.
Much of the advancement in automated 3D mapping has derived from the autonomous vehicle industry and has trickled down to other parts of the GIS industry. The hardware and software are vastly easier to use and produce vastly better data. In the ortho images collected from our system, with the LiDAR overlaid on them, feature extraction was used to automatically define the top of pipe, the top of ditch, and the actual joints in the pipe. Automated classification of LiDAR removes the need for someone to go out with a pogo stick and find the top of pipe. It makes it possible to simply drive alongside an open trench containing a newly installed pipe and scan it with a mobile LiDAR system, producing a detailed 3D model with centimetre accuracy without having a person present. It is much safer - no boots on the pavement - and it drastically reduces the cost.
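To give a feel for one step of this pipeline, the toy sketch below estimates top-of-pipe elevation from a handful of synthetic trench points. Real workflows run trained classifiers over dense mobile-LiDAR point clouds; here a simple elevation band stands in for the classifier, and every number is made up for illustration.

```python
from statistics import median

# Synthetic trench cross-section: (station_m, elevation_m) points.
# Real data would be millions of LiDAR returns, not seven hand-picked pairs.
trench_points = [
    (0.0, -1.21), (0.5, -1.19), (1.0, -1.20),  # returns off the pipe crown
    (0.2, -1.80), (0.7, -1.78),                # trench bottom beside the pipe
    (0.3, -0.10), (0.8, -0.05),                # ground surface at trench edge
]

def top_of_pipe(points, crown_band=(-1.5, -1.0)):
    """Keep points whose elevation falls in the expected crown band and
    return their median -- a crude stand-in for automated classification."""
    lo, hi = crown_band
    crown = [z for _, z in points if lo <= z <= hi]
    return median(crown)

print(round(top_of_pipe(trench_points), 2))  # -> -1.2
```

The real gain of automated classification is that this separation of crown, trench bottom, and ground surface happens for every scan automatically, replacing the person with the pogo stick.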
We prefer mobile mapping systems over handhelds because mobile mapping systems produce far more data, and their registration tends to be vastly superior. They are also much safer because they eliminate the need for someone to get out of a vehicle - no boots on the pavement. Any time you're still getting out, whether you're using a handheld laser mapping system or a pogo stick, you have a person outside the vehicle walking around a street. With a mobile system you eliminate that risk, and you also drastically decrease the time required to do the collection. In another example, automated feature extraction was used to identify an electrofusion coupling, along with the butt joints, the tees, and so forth. All of this can now be done very simply with automated feature extraction.
Data registers
While surveyors may have experience with a provincial or other survey register, utility location data identified during routine locate (one call) operations is generally used only to mark the ground and is rarely recorded, digitally or otherwise. In practice, this means locators return to the same location they visited just a few weeks earlier to repeat the locate operation for another project. People in the locate industry are very familiar with this and with the schedule and cost expense it causes. Sending people out to relocate or remap an area that was done not long ago doubles their risk. Secondly, because the data is not retained or kept in easily accessible ways, we can't analyze it. For example, we might want to confirm that we correctly identified the type of utility, check the accuracy of its location, or determine whether its location, type, or status has changed since we last detected it. And because the data is not being retained and centrally managed at a global or large scale, there's no real opportunity for organic data enhancement or improvement.
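The change-detection question above only becomes answerable once locates are retained. A minimal sketch, with assumed record fields and tolerances, might compare two locates of the same asset and flag a shift larger than their combined positional accuracy:

```python
import math

# Has a utility's recorded position moved between two locates by more than
# the combined accuracy of the two measurements? Fields are illustrative.
def position_changed(old, new):
    shift = math.hypot(new["x"] - old["x"], new["y"] - old["y"])
    tolerance = old["accuracy_m"] + new["accuracy_m"]
    return shift > tolerance

locate_2021 = {"x": 100.00, "y": 250.00, "accuracy_m": 0.10}
locate_2023 = {"x": 100.05, "y": 250.02, "accuracy_m": 0.05}
print(position_changed(locate_2021, locate_2023))  # -> False: within tolerance
```

Without a register holding `locate_2021`, this comparison is impossible: the earlier locate exists only as faded paint on the pavement.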
The CGA Next Practices Report to Industry has made the case that a comprehensive national GIS map of buried infrastructure is among the opportunities for systemic improvement with the greatest ROI for industry. The benefits of a national register include greater safety; enhanced construction efficiency; reduced design, construction, and maintenance costs; greater regulatory compliance; improved locate efficiency; fewer project delays and budget overruns; fewer service disruptions; and fewer traffic disruptions, losses of business, and other inconveniences for the public.
Sharing maps of underground infrastructure is not totally new to Canada. For example, the City of Calgary's Joint Utility Mapping Project (JUMP) system, the ICI Society in British Columbia, the City of Toronto's Digital Map Owners Group (DMOG) and others have been sharing underground utility information for years.
The Canadian Underground Infrastructure Register (CUIR) is a new national initiative started at the beginning of the year. The CUIR initiative is intended to enhance current practices in a way that lets all stakeholder groups benefit: construction contractors, network owners, government transportation agencies, design engineers, subsurface utility engineering (SUE) engineers, surveyors, locators, workers in the field, and the public. The CUIR initiative was inspired by the National Underground Asset Register (NUAR) project currently being implemented in the UK. The objective of NUAR is to develop a register of location data for all underground infrastructure in England, Wales, and Northern Ireland that is accessible to all network owners and government agencies.
This post is based on Joseph Hlady's (Lux Modus) talk at Subsurface Utility Mapping Strategy Forum (SUMSF).