Maintaining accurate, up-to-date, comprehensive data about assets, above ground and below ground, has been a challenge for utility and communications firms for decades. Now new technologies, including machine learning and computer vision-based feature extraction and text recognition, together with high-accuracy location capture using GPS and augmented reality clouds, are being applied to automate the real-time capture of utility network information in the field during construction, when new utilities are installed and when existing utilities are exposed and relocated. When fully implemented, this will automate the real-time maintenance and update of utility network information and form the foundation for utility digital twins.
Background
The decades-old as-builting process remains mired in manual, paper-based workflows. The resulting inefficiency causes critical processes such as outage restoration to take longer than they should, and the poor data quality leads to incomplete, out-of-date and inaccurate underground utility records that are responsible for frequent construction delays, injuries and fatalities among workers and the public, and a drag of billions of dollars on national economies.
One of the persistent themes from the very beginning of the Between The Poles blog has been the challenge of integrating CAD and GIS data in a common workflow (some examples: 2006, 2007, 2008). Critical metrics for utility readiness for the smart grid are the as-built backlog, the field update backlog and the location accuracy of assets in utility GISs. As-built backlogs typically stretch into months, updates from the field take months to be entered into the records database (if they are entered at all), and the recorded location of assets in the GIS can be meters and even tens of meters from their actual location. The challenge is to replace conventional paper-based workflows with real-time digital workflows that can provide the basis for utility digital twins.
Real-time reality capture using machine learning and augmented reality clouds
New technologies developed for consumer applications, including machine learning for image and text recognition and, more recently, augmented reality clouds for recording accurate relative locations and distance measurements, are now being applied to capturing utility network information. This week at the Mirrored World: Role of Digital Twins in the Utilities Industry virtual event, Skye Perry and Peter Batty of SSP Innovations discussed and demonstrated how these technologies are being applied to the efficient, real-time reality capture of information about utility network facilities in the field during construction.
Peter Batty demonstrated functionality being developed for a new product, SSP Vision, which supports the semi-automated capture of information about newly installed electric network facilities. At an actual construction site, Peter demonstrated automated feature recognition of electric and telecom pedestals, automated capture of attribute information from cabinets, scanning of bar codes, QR codes and text for the work order number, serial number, manufacturer and other attribute information, and the use of augmented reality plus GPS to record the location of equipment and of linear features such as a cable from the pedestal to the transformer. All of this information was captured and pushed up in real time, and was available in seconds in the utility GIS.
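SSP has not published the data interfaces behind this demonstration, but the following sketch illustrates the general idea of pushing a captured feature to a GIS in real time. The endpoint URL, layer and attribute names are hypothetical, assuming only that the receiving system accepts a GeoJSON-style payload over HTTPS.

```python
import json
import urllib.request

# Hypothetical example of the kind of record a field-capture app might push
# to a utility GIS in real time. The endpoint and attribute names are
# assumptions for illustration, not SSP Vision's actual interface.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [-104.9903, 39.7392]},  # lon, lat from GPS + AR
    "properties": {
        "feature_type": "electric_pedestal",   # from automated feature recognition
        "work_order": "WO-12345",              # scanned from a bar code
        "serial_number": "SN-98765",           # scanned from a QR code
        "manufacturer": "ExampleCo",           # read via text recognition
        "captured_at": "2020-06-18T15:04:05Z",
    },
}

# POST the feature to a hypothetical real-time ingestion endpoint.
req = urllib.request.Request(
    "https://gis.example.com/api/features",    # placeholder URL
    data=json.dumps(feature).encode("utf-8"),
    headers={"Content-Type": "application/geo+json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print("GIS ingestion status:", resp.status)
```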
The key features being developed for SSP Vision are automatic recognition of equipment such as pedestals and cabinets, and automatic detection and capture of attributes from bar codes, QR codes and printed text using computer vision and machine learning. Accurate locations are recorded using augmented reality clouds together with GPS; augmented reality clouds add much greater relative accuracy, including distance measurements, and work both outdoors and indoors. At this point properties such as feature type can be specified manually if automated feature recognition is unable to detect the type of feature, or if a new feature is being entered as part of a new design carried out using AR in the field. AR also enables capturing linear features, with support for snapping to features such as pedestals and cabinets.
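SSP has not described how SSP Vision is implemented internally, but as a rough sketch of the general technique, the steps can be thought of as: classify the equipment in a camera frame, decode any QR code, read printed text with OCR, and convert an AR-measured offset from the phone's GPS fix into coordinates. The detect_equipment stub and the attribute names below are assumptions; the QR and OCR calls use OpenCV and Tesseract, which are common open-source choices rather than necessarily what SSP uses.

```python
import math
import cv2                 # OpenCV: QR code detection and decoding
import pytesseract         # Tesseract OCR: printed text on cabinets and nameplates

def detect_equipment(frame):
    """Placeholder for a trained object-detection model (e.g. a CNN trained on
    images of pedestals, cabinets and transformers). Returns an equipment label."""
    return "electric_pedestal"  # stubbed result for illustration

def read_qr(frame):
    """Decode a QR code (e.g. containing a serial number) from the frame."""
    data, _, _ = cv2.QRCodeDetector().detectAndDecode(frame)
    return data or None

def read_text(frame):
    """OCR any printed text, e.g. manufacturer and work order number."""
    return pytesseract.image_to_string(frame).strip()

def offset_position(lat, lon, east_m, north_m):
    """Apply an AR-cloud relative offset (metres east/north of the GPS fix)
    using a simple local approximation, adequate at centimetre-to-metre scale."""
    dlat = north_m / 111_320.0
    dlon = east_m / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

def capture_feature(frame, gps_lat, gps_lon, ar_east_m, ar_north_m):
    """Combine recognition, attribute extraction and location into one record."""
    lat, lon = offset_position(gps_lat, gps_lon, ar_east_m, ar_north_m)
    return {
        "feature_type": detect_equipment(frame),
        "serial_number": read_qr(frame),
        "label_text": read_text(frame),
        "location": {"lat": lat, "lon": lon},
    }

if __name__ == "__main__":
    frame = cv2.imread("pedestal.jpg")  # a photo taken by the field crew
    print(capture_feature(frame, 39.7392, -104.9903, ar_east_m=2.5, ar_north_m=-1.0))
```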
Foundation for keeping a utility digital twin up to date
This is truly ground-breaking and shows how these technologies can be applied now, in a semi-automated fashion, to dramatically improve the efficiency of capturing location and attribute information about utility facilities during construction. The longer-term vision is that this capture could be fully automated by simply mounting a camera on a construction crew member's hard hat, with feature recognition and attribute identification and capture requiring no manual input from the crew. This would automate the real-time maintenance and update of utility network information and form the foundation for utility digital twins.