I was fortunate to be invited to moderate at the 2nd Annual Asset Management for Energy and Utilities Conference in Toronto last week. This conference is one of the most interesting I have participated in over the last year because it focused on what I think is the central issue facing the utility industry in the new era of the smart grid: optimizing business processes for data quality.
Ivano Labricciosa of Toronto Hydro, one of the two largest utilities in Ontario, focused his presentation on the issues that keep utility executives up at night. The two at the top of his list are
- Exponentially increasing volume of data
- Data quality
He put this in the context of the smart grid, where networks are becoming more complex and the number of asset classes or equipment types a utility has to manage is going up exponentially. The different types of data he identified include asset registry (financial tables), maintenance and inspection data and reports, records (often maintained in a GIS), power usage data from smart meters, data relating to the automation of the distribution network (self-healing), and live condition monitoring of equipment (sensors).
From his perspective the key business drivers pushing utilities to focus on data quality are
- Regulation (Ontario Energy Board in Ontario)
- Customer satisfaction
- Industry best practices
In North America, as the ASCE scorecard has been very effective in making many people aware, we have not invested in our infrastructure for the past couple of decades. As a result, much of our infrastructure is beyond its life expectancy, outage rates are increasing, and the cost of maintenance is going up.
The key challenges relating to data quality facing utilities that Mr. Labricciosa sees are
- Paper-based processes
- Aging workforce
- Field force participation
- Manual processes
- Sustainable data quality
All of these are certainly on my list of data quality-related issues. His recommendation is that utilities need to focus on developing the right business processes to sustain data quality. For example, he described a data quality dashboard that allows management to see at a glance when there are data quality issues such as missing or incomplete maintenance reports. The good news is that right now "energy is sexy" and is getting a lot of attention and funding, so he is optimistic that these problems are going to get the attention they require.
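To make the dashboard idea concrete, here is a minimal sketch of the kind of completeness check that could feed such a dashboard. This is my own illustration, not Toronto Hydro's implementation; the record structure, field names, and 95% threshold are all assumptions.

```python
# Minimal sketch of a data quality metric feeding a dashboard.
# The record structure and completeness rule are illustrative assumptions,
# not an actual utility schema.

REQUIRED_FIELDS = ("asset_id", "inspection_date", "condition_rating", "inspector")

def completeness(reports):
    """Return the fraction of maintenance reports with all required fields filled in."""
    if not reports:
        return 0.0
    complete = sum(
        all(r.get(f) not in (None, "") for f in REQUIRED_FIELDS) for r in reports
    )
    return complete / len(reports)

reports = [
    {"asset_id": "TX-1041", "inspection_date": "2010-02-11",
     "condition_rating": 3, "inspector": "J. Smith"},
    {"asset_id": "TX-1042", "inspection_date": None,  # incomplete report
     "condition_rating": 2, "inspector": "J. Smith"},
]

score = completeness(reports)
print(f"Maintenance report completeness: {score:.0%}")
if score < 0.95:  # assumed management target
    print("ALERT: data quality below target")
```

A real dashboard would compute metrics like this per region, crew, or asset class, which is what lets management see at a glance where the process is breaking down.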
The As-built Problem
Alan Saunders talked about some of the data management issues that contribute to poor data quality. In particular he discussed the as-built problem. He cited the results of a survey conducted by Sierra Energy on the length of as-built backlogs, which is typically 6 months but can stretch to two or more years. One of the contributors to as-built backlogs is paper-based processes. Engineering design drawings, which are created with a CAD desktop application such as AutoCAD and printed for use by the construction team in the field, are returned to Records as marked-up paper as-builts. What then happens in many utilities is rework: the design data (which already exists in electronic form as a DWG or DGN file) is redigitized by Records from the paper as-built into a GIS, a process that introduces errors, consumes resources that utilities can't spare, and is responsible for the as-built backlog found at virtually all utilities.
Condition-based Maintenance and the Aging Workforce
The opening address by Professor Andrew Jardine was focused on condition-based maintenance. To replace the traditional calendar-based maintenance approach, utilities need to develop a hazard model for each type of equipment that allows them to identify when to replace or conduct preventative maintenance on each piece of equipment. When reliable historical data for a piece of equipment, as well as the variables that determine its life expectancy, are available, it is possible to create a hazard model to compute the probability of failure for the device. However, in many cases historical data is missing or limited, in which case a process of knowledge elicitation (capturing tacit knowledge), which Dr. Ali Zuashkiani, also of the University of Toronto, has used at several utilities, can be applied to develop a hazard model. The method used by Dr. Zuashkiani relies on interviewing experts familiar with each type of equipment and involves pairwise comparison of alternative scenarios. Professor Jardine reported on a research project in which the results of a knowledge elicitation approach were compared with hazard models developed from reliable historical data. Hazard models developed by Dr. Zuashkiani using a knowledge elicitation process were found to be close enough approximations to hazard models developed using historical data. This is an important result because it means there is a potentially reliable alternative for developing hazard models when historical data is unavailable or spotty. Utilities interested in taking advantage of this approach should do so sooner rather than later, because of the aging workforce problem at many utilities around the world: many of the experts are retiring.
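As a rough illustration of what a hazard model does, the sketch below uses a two-parameter Weibull hazard, a common choice in equipment reliability work; the talks did not specify which model family was used, and the parameter values here are made up for illustration. In practice they would be fitted from historical failure data or elicited from experts.

```python
import math

# Two-parameter Weibull hazard model (illustrative parameters only).
# beta > 1 means the failure rate increases with age (wear-out);
# eta is the characteristic life in years.
BETA, ETA = 2.5, 40.0  # assumed values, normally fitted or elicited

def hazard(t):
    """Instantaneous failure rate at age t (failures per year)."""
    return (BETA / ETA) * (t / ETA) ** (BETA - 1)

def prob_failure_within(t, horizon):
    """Probability a unit of age t fails within the next `horizon` years,
    given it has survived to age t (from the Weibull survival function)."""
    def surv(x):
        return math.exp(-((x / ETA) ** BETA))
    return 1.0 - surv(t + horizon) / surv(t)

for age in (10, 30, 50):
    print(f"age {age:>2} yr: hazard = {hazard(age):.4f}/yr, "
          f"P(fail within 5 yr) = {prob_failure_within(age, 5):.1%}")
```

Whether the parameters come from failure records or from expert elicitation, the output is the same kind of curve: a failure probability that rises with age, which is what lets a utility decide when replacement or preventative maintenance is worthwhile.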
Yury Tsimberg and Stephen Cress of Kinectrics emphasized that reducing stress on equipment can extend its life. One of the promises of the smart grid is to manage stress better. For example, spreading the load evenly between two transformers is better than stressing one while the other idles. An important benefit of the smart grid could be longer life expectancies for some types of equipment.
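To see why balanced loading matters so much, consider transformer insulation aging, which is roughly exponential in hot-spot temperature (in the spirit of the IEEE C57.91 loading guide). The speakers did not present this calculation; the temperatures below are assumed values chosen only to illustrate the effect.

```python
import math

def aging_acceleration(hot_spot_c):
    """IEEE C57.91-style aging acceleration factor for thermally upgraded
    paper insulation, relative to the 110 C reference hot-spot temperature."""
    return math.exp(15000 / 383.0 - 15000 / (hot_spot_c + 273.0))

# Assumed hot-spot temperatures: one heavily loaded transformer plus one
# idling, versus two sharing the load evenly (illustrative, not measured).
stressed, idle = 120.0, 60.0
balanced = 95.0

unbalanced_wear = aging_acceleration(stressed) + aging_acceleration(idle)
balanced_wear = 2 * aging_acceleration(balanced)
print(f"unbalanced pair wear rate: {unbalanced_wear:.2f}x reference")
print(f"balanced pair wear rate:   {balanced_wear:.2f}x reference")
```

Because aging is exponential in temperature, the hot transformer dominates: the stressed/idle pair wears several times faster than the balanced pair, even though both pairs carry the same total load.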
Having moved from a utility that redraws all of its plans to one that does not, I was quite amazed at the difference in backlogs: weeks rather than years.
Posted by: Ben Reilly | March 03, 2010 at 01:14 PM