At DistribuTECH, Nancy Bui-Thompson, President of the Sacramento Municipal Utility District (SMUD), gave an insightful presentation about her utility from the perspective of an elected official. I believe Nancy is one of the first elected officials to present at DistribuTECH. SMUD is, and has been for some time, one of the most forward-looking utilities in North America. It was the first California utility to reach 20% renewable power and the first to commit to 33% renewables. It is also one of the most energy efficient utilities in California.
It is governed by an elected board of directors, who are responsible for policy and strategy, and perhaps this is the reason it is very customer focussed and has consistently maintained high customer satisfaction ratings. For example, during the rollout of the smart meter program it maintained an impressive 95% customer satisfaction rating. The major customer drivers are green, renewable and reliable power, and the need for choice. Nancy emphasized that SMUD's unique governance model, reserving policy for the board of directors, allows SMUD's management and employees the freedom to implement without interference.
SMUD is moving from a centralized utility with a business model based on selling electricity to a distributed utility providing localized grid services. This means SMUD is getting out of the business of selling electricity, and into the business of selling grid services.
At the policy level the focus areas are customer analytics, changing the rate structure and grid analytics.
Customer analytics includes collecting operational data on customer behaviour to help identify and provide new services. For example, these statistics help identify early adopters, customers who are interested in renewable energy or energy efficiency in the home and who help drive programs focussed on these areas. Customer data and analytics also help with segmentation, defining the different market segments that require different types and levels of grid services.
Grid analytics is helping SMUD better manage outages. Collecting operational data and analyzing it to predict outages has reduced the number of outages by 20% and the average duration of outages by 28%.
One of the most interesting innovations at SMUD is in the area of rate structure. SMUD and other utilities are in the unusual position, for a retailer, of trying to sell less of their product, in this case electricity. People are using less electricity as a result of personal interest as well as SMUD's own energy efficiency programs. But the money has to come from somewhere. SMUD has introduced a flat infrastructure fee. Currently every customer pays $18/month for the grid, independent of how much power they consume. SMUD has determined that $28 is the breakeven point, where the cost of maintaining the grid would be covered by infrastructure fees, and is moving toward that monthly charge.
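To make the rate arithmetic concrete, here is a minimal sketch of how a fixed infrastructure fee interacts with a volumetric energy charge. The $18 and $28 figures come from the presentation; the $0.12/kWh energy rate is my own assumption for illustration, not SMUD's actual tariff.

```python
# Two-part rate: fixed grid fee plus volumetric energy charge.
# The $18 current fee and $28 breakeven fee are from the presentation;
# the $0.12/kWh rate is an illustrative assumption.
VOLUMETRIC_RATE = 0.12  # $/kWh -- assumed

def monthly_bill(kwh_used: float, infrastructure_fee: float) -> float:
    """A customer's monthly bill under a two-part rate."""
    return infrastructure_fee + kwh_used * VOLUMETRIC_RATE

for fee in (18.0, 28.0):
    print(f"fee ${fee:.0f}/month: "
          f"100 kWh -> ${monthly_bill(100, fee):.2f}, "
          f"500 kWh -> ${monthly_bill(500, fee):.2f}")
```

The higher the fixed fee, the less the utility's grid revenue depends on how much electricity customers actually consume, which is the point of moving toward the $28 breakeven charge.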
Another interesting innovation at SMUD (which I have seen elsewhere) is its solar power program. Customers can buy into solar power without the bother of having to install solar panels on their roofs. The first solar program was a 1 MW solar PV program designed for residential customers. The next will be an 11 MW program for commercial customers.
I am at DistribuTECH this week in Orlando. This is North America's largest electric power utility event. I don't know the number of attendees this year, but last year it was close to 10,000.
The first talk I heard was a riveting one that hit on a key theme of this year's conference, big data and analytics. The talk was given by Lee Krevat of Sempra US Gas and Power and Tim Fairchild of SAS and was entitled "What can a regulated electric utility learn from Moneyball?" Moneyball is a book by Michael Lewis about how the Oakland A's applied big data and analytics to become one of the top baseball teams without the deep pockets of the richest teams.
Moneyball for transformers
As an example of the relevance of Moneyball, they described a utility that took every bit of data that could potentially be relevant to the lifecycle of a transformer, some 80 variables in all, and ran correlations with transformer failure data for 1500 transformers. It found that just 10 of the variables predicted 91% of transformer failures. This compares with the utility's traditional way of forecasting failure, which predicted only 15% of the failures. If you ask an experienced power engineer what the most important factor in determining the lifetime of a transformer is, the historical load, and especially the overload, on the transformer would be at the top of the list. The correlation analysis, however, showed that load was only the eighth most important factor. The most important factor was the number of meters attached to the transformer - which probably wouldn't have been on anyone's list.
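To illustrate the kind of screening analysis this describes, here is a minimal sketch that ranks candidate variables by the strength of their correlation with a binary failure flag. The data and column names are synthetic stand-ins; the utility's actual 80 variables were not published.

```python
# Rank candidate variables by correlation with transformer failure.
# Synthetic data: the failure flag is deliberately driven by the number
# of attached meters, echoing the finding described above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 1500  # transformers, as in the example above

df = pd.DataFrame({
    "meters_attached": rng.integers(1, 40, n),
    "historical_load": rng.uniform(0.2, 1.3, n),
    "age_years":       rng.integers(1, 50, n),
    "overload_events": rng.poisson(2, n),
})
p_fail = 1 / (1 + np.exp(-(0.15 * df["meters_attached"] - 3)))
df["failed"] = rng.random(n) < p_fail

# Point-biserial correlation of each variable with the failure flag.
ranking = df.drop(columns="failed").corrwith(df["failed"].astype(float)).abs()
print(ranking.sort_values(ascending=False))
```

Ranking by simple correlation is only a first screen; a real study would validate the top predictors against held-out failure records before using them operationally.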
Lee and Tim presented other analogies where analytics play a key role in both baseball and electric power.
The good face
Baseball scouts often profile players based on the look of their face and body, a "baseball look", rather than on their actual statistics. Similarly, utilities often profile circuits to prioritize capacity investment decisions based on "gut" feeling. Now they are able to leverage data to make better investments.
Age before beauty
Scouts show a preference for younger players, even though older players with better statistics are often available without long-term contracts and at significantly lower salaries. By analogy, large utility transformers have traditionally been replaced based on age. But now statistics on many transformers are being monitored and the transformers are only replaced if their statistics are in decline and predict failure.
Lee and Tim suggested other practical applications of this approach which include using machine learning to predict failures of wind turbines and to determine why solar farms often generate less electricity than expected.
Worldwide there is increasing demand from building and infrastructure owners for service provision throughout the entire life cycle of a building or infrastructure. This represents a distinct break with the design/build tradition that has dominated construction for years. At the Year in Infrastructure conference in London, a dominant theme was the growing recognition of the importance of full lifecycle management of infrastructure. I found it symptomatic of the direction of the construction industry that fully one third of the 54 finalists for the annual Be Inspired Awards involved mapping, rehabbing, retrofitting, replacing and managing existing infrastructure. This is my classification of these Be Inspired finalists:
Patrick MacLeamy, Chairman of buildingSMART and CEO of HOK, has been pushing a very simple message about the U.S. construction industry for years. Buildings are too expensive, are too inefficient to operate and maintain, and don't last long. As a result the U.S. construction industry is falling behind the Nordic countries, the U.K. and Singapore. His solution is a full life cycle approach to construction. Information has to be shared between owners, designers, contractors, operations and facilities management over the entire life cycle of the building or infrastructure.
Over 50% of the cost of owning a building goes to operations and maintenance, which comprises administration, maintenance and repairs, and restoration projects. In several countries BIM has become essential for design and construction. But many, including the UK government, believe that the full value of BIM can only be realized during the operational life of the building, where the majority of the life cycle costs occur. The UK government has said that "the 20% saving refers to CapEx cost savings however we know that the largest prize for BIM lies in the operational stages of the project life-cycle".
Road and highway infrastructure
Highway construction is being transformed, due in part to the arrival of autonomous vehicles. I've blogged about the startling (at least to traditional construction contractors) vision of the future of highway construction from the Chief of Surveys at the Oregon Department of Transportation (DoT), which targets the full lifecycle of highway assets from planning through design, construction, operation and maintenance. Some large construction projects are already being designed, built, operated and maintained with a full lifecycle perspective.
Industry surveys report that up to 80 percent of a utility's resources and budget can be spent on operating and maintaining existing utility infrastructure. Surveys also show that aging utility infrastructure is a top priority for most utilities.
Be Inspired Awards: Mapping, Monitoring, Rehabbing, and Replacing Infrastructure
One of the projects focused specifically on full lifecycle data management for highway construction. The project, submitted by the Roads Directorate, Denmark, is the $580 million, 39-kilometer Herning – Holstebro highway, which includes eight interchanges, four railway crossings, and five bridges. The important achievement of the project was to create a digital workflow with meaningful requirements for sharing data among disciplines and across the entire project lifecycle. The project was a finalist for this year's Be Inspired Award for Innovation in Roads.
Seven of the finalists' submissions involved renovation, rehabbing, and retrofit. An outstanding example of a rehab project is the Bond Street to Baker Street Tunnel Remediation Project. This is a London Underground project in the UK. It involved the replacement of the existing elastoplastic concrete lining of a 215-meter tunnel segment on the Jubilee Line with a spheroidal graphite iron lining - all while the line was running at full capacity. This achievement won this year's Be Inspired Award for Innovation in Rail and Transit at the Year in Infrastructure 2015 conference.
Two of the finalists' projects involved replacement. An example is the Decommissioning and Replacement of Del Rio Bridge on US 20, carried out by Harper-Leavitt Engineering for the Idaho Transportation Department with minimum disruption to traffic.
Five of the finalists submitted projects that involved monitoring and extracting more value from existing transportation and utility infrastructure, including rail, electric substations, and electric, water, and wastewater distribution networks.
An example is a project submitted by SA Water which won the Be Inspired Innovation in Asset Performance Management Award. The project involved integrating a hydraulic model and an operational analytics tool with network sensors to help optimize the network. These tools enable SA Water to optimize chlorine dosing for different water sources (runoff, desalinization, rivers), minimize electric power costs, and improve water quality by mapping water age across the entire network. SA Water has not only been able to reduce its power bill by A$3 million, but has also cut its network operating costs by nearly A$1 million. More fundamentally, the project has given the utility much greater insight into sources of revenue and the costs of various aspects of operating a water network.
Four of the finalist projects involved mapping and historic site protection. An example is a gas main project submitted by Utility Mapping Services Inc. This project involved creating a 3D map of underground utilities along a stretch of highway with complex utility infrastructure woven through dense commercial and residential areas with limited right-of-way and heavy traffic congestion. Most critically from a safety perspective, there were no utility strikes on the project. The 3D model is credited with reducing construction time from 10 to 7 weeks. Most importantly from a budget perspective, there were no change orders, and the total cost of the project came in 10-15% lower than it was estimated to be in the absence of a 3D model.
Another example is a project submitted by the Singapore Land Authority (SLA). Singapore intends to be the world's first "smart nation". Part of this initiative involves developing a virtual Singapore that is intended to be the source of authoritative information about Singapore for use by government agencies. The project involves capturing large amounts of data using multiple rapid mapping technologies including oblique imagery, airborne laser scanning, mobile laser scanning, and terrestrial scanning. The data has been compiled into a 3D city model in a single database repository which includes geometry, topology, semantics and appearance. The database relies on CityGML, a standard managed by the Open Geospatial Consortium (OGC), for the database schema and for data exchange. The total volume of data is more than 50 terabytes. The database is open and accessible to all government agencies. The most challenging part of the project has been the development of business processes and technologies for ensuring the data remains current. At the Year in Infrastructure conference in London the SLA won this year's Be Inspired Award for Innovation in Government.
At the Year in Infrastructure conference in London SA Water was one of the finalists for the annual Be Inspired Innovation in Asset Performance Management Award. Today Rowan Steele presented an overview of how SA Water, based in Adelaide, South Australia, has applied operational analytics to reduce the water utility's power bill by A$3 million.
South Australia has been suffering from an extended drought for nearly a decade and has just built a desalinization plant to provide more reliable water to their customers.
About 40% of South Australia's power generation is now renewable, most of it wind (33%) and solar (8%). Fluctuating sources of power generation mean that the price of electricity can range widely, from -A$1,000 to A$13,000 per megawatt hour (MWh).
The drought, the new desalinization plant and the fluctuating cost of power introduced complexity into managing what used to be a fairly simple water network.
To address these issues SA Water decided to invest significantly in IT to help manage the water network better. They acquired a hydraulic model that allowed them to simulate the network under different conditions. They also invested in an operational analytics tool.
Together these applications have helped them optimize their network in various ways, such as optimizing chlorine dosing (water from the desal plant contains very little organic material compared to river water), minimizing electric power costs and reducing water age in some parts of the network. The benefits have been significant. They have not only been able to reduce their power bill by A$3 million, but have also cut their network operating costs by nearly A$1 million. Water quality has improved as well; for example, they can map water age geographically for their entire service area. More fundamentally, it has given them much greater insight into sources of revenue and the costs of various aspects of operating a water network.
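As a rough illustration of the power-cost side of such an optimization, the sketch below schedules pumping into storage over a day so that a fixed demand is met at minimum electricity cost, taking advantage of the wildly fluctuating prices mentioned above. All prices, capacities and energy figures are invented; SA Water's actual hydraulic and analytics models are far richer.

```python
# Schedule pumping to meet a daily volume at minimum electricity cost.
# All numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

hours = 24
price = np.full(hours, 80.0)   # A$/MWh baseline (assumed)
price[17:21] = 400.0           # hypothetical evening price spike
price[2:5] = -50.0             # hypothetical negative-price window

energy_per_ml = 0.5            # MWh to pump 1 ML (assumed)
max_ml_per_hour = 10.0         # pump capacity (assumed)
daily_demand_ml = 120.0

# Decision variables: ML pumped each hour. Minimize cost subject to
# meeting the daily volume and respecting pump capacity.
c = price * energy_per_ml
res = linprog(c,
              A_eq=[np.ones(hours)], b_eq=[daily_demand_ml],
              bounds=[(0, max_ml_per_hour)] * hours)
print(f"daily pumping cost: A${res.fun:,.0f}")
print("schedule (ML/h):", np.round(res.x, 1))
```

Even this toy linear program shifts all pumping into the cheap (here, negative-price) hours, which is the intuition behind the A$3 million power-bill reduction.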
INSPIRE-Geospatial World Forum 2015, a joint conference organized by the European Commission and Geospatial Media and Communications, has made available its full conference program. Almost 500 presentations are scheduled on topics including building information modelling (BIM), open data, big data analytics, open standards, linked data, cloud computing, crowdsourcing, Earth observation, indoor positioning, land information systems for smart cities, urban resilience and sustainability, health, agriculture and others. Some 2000 delegates are expected to attend from more than 80 countries. Top sponsors include Trimble, Topcon, ESRI, Digital Globe, Oracle and Bentley.
The theme of the conference is CONVERGENCE: Policies + Practices + Processes via PPP with a focus on improving coordination among policy-makers, technology providers and users. Including geospatial data and technology in construction, agriculture, health and other industry workflows is an enabler for more successful public–private partnerships (PPP) by facilitating more informed decision making among the stakeholders.
Dr Anne Kemp, Director and Fellow, Atkins, Vice-Chair of buildingSMART UK, Chair of the ICE BIM Action Group, and Chair of BIM4Infrastructure UK, has published very thought-provoking insights into how the convergence of BIM and geospatial can contribute to better management of information, helping generate the understanding needed to make better decisions.
Her first assertion “So, let’s put paid to the hang-ups of what is and is not GIS and BIM, and discover what really deserves our focus” is a very good place for all of us to start if we are going to tear down the discipline boundaries that are inhibiting us from moving to a more holistic approach to problem solving in the era of smart cities.
Better outcomes, not BIM or geospatial
Her goal is not to support BIM or geospatial per se, but to use these technologies to improve outcomes. From Anne's perspective, the key outcomes we should be aiming at are:
CLARITY - Clarity of delivery
TECHNICAL JUDGEMENT - Converging information production with sound engineering judgment and design
ACCESS - Wider, faster access to comprehensible and integrated information
LATERAL THINKING - Enabling reflective, adaptive thinking to incorporate whole life and integrated systems approach within the wider geographic context
INNOVATION - Harnessing innovative technologies and harvesting intelligence from big data
DECISIONS - Fostering instinctive, but rigorous collaboration and better decision making
Data, not documents
The construction industry is based on documents such as drawings. Documents lock data up within a discipline and prevent the wider access that can be used to build up an integrated view of an asset. In contrast, digital data can be used many times, for different purposes, by different disciplines. This requires interoperability and the ability to map semantics across different disciplines.
Assets, not projects
The full lifecycle view of a building, road or airport requires thinking of assets not projects. Anne's perspective is that this is where the convergence of BIM with geospatial provides the biggest benefits. The UK government would agree. The short term objective of the UK Government BIM mandate is to reduce the cost of construction (design, tender, build) by 20%. The longer term objective is by 2025 to reduce the costs associated with designing, building, operating and maintaining buildings and infrastructure by a third. ‘In-use’ data from facilities management (FM) systems, building management systems, and sensors including smart phones provides information on how an asset is actually serving the needs of people, and the patterns of behaviour of people using the infrastructure. This information can be used to optimize building or infrastructure design. A geospatial perspective enables this data to be used not only with individual buildings or infrastructure, but for a whole neighbourhood, town or city.
Ensuring that data is not manipulated to distort decision making is critical to enabling the true data-driven organization of the future. Anne's perspective is that the industry is becoming increasingly dependent on data management professionals. This will require standards and a code of ethics to address challenges of privacy, distortion, and manipulation, so as to ensure that data is made available in a way that aids rather than confuses decision making. In the future chief data officers and other information professionals will have even greater responsibilities - they will be responsible for specifying, collecting, and analyzing the information for decision making that will be critical to the organization's success, even its existence.
Information is not understanding
Malcolm Gladwell in "Blink" points out that “We live in a world saturated with information. We have virtually unlimited amounts of data at our fingertips at all times, and we’re well versed in the arguments about the dangers of not knowing enough and not doing our homework. But what I have sensed is an enormous frustration with the unexpected costs of knowing too much, of being inundated with information. We have come to confuse information with understanding.”
At a recent BIM conference the term “infobesity” came up more than once. A decade ago people were concerned about not having enough data to make informed decisions. Now that more data is being collected by sensors such as smart meters and smart phones, the problem is how to make sense of the huge volumes of data all these smart devices are collecting.
Anne makes the point that when managed correctly, “instant” decisions based on a small amount of data are not just as good as, but can be better than, those made after analyzing all available data. The "less is more" challenge is to distill the data to just the right subset so that you can make better decisions faster with less data. Anne believes this will require more sophisticated visualization techniques to enable insights from patterns in large amounts of data, and better collaboration technology to enable a large number of individuals from different disciplines to understand each other (even when using different terms for the same piece of equipment or construction material) and to collaborate fruitfully.
This means that we will be asking our human or computer information engineers to deliver that essential subset of information to the right people at the right time, and in an intuitively understandable way. Anne suggests that our cartographic and GIS heritage of creating, analyzing and visualizing a view of the physical world as maps may provide a model for future data managers. But, as Anne points out, this will have to be transformed for a virtual environment.
Anne's final point is often overlooked. BIM, geospatial, augmented reality and other technologies are transforming how we view "reality". There are very real consequences for people working in a virtual world. Anne mentions the first reported case of internet addiction disorder (IAD) involving Google Glass, on October 14, 2014, and asks how many of us are already there with our smartphones and tablets?
At DistribuTECH 2015, Raiford Smith of CPS Energy and Jason P. Handley of Duke Energy presented their perspectives on the smart grid: what is motivating it in terms of business and technology drivers, a roadmap for implementing it at their utilities, and the benefits expected from it for customers and for utilities. They also outlined a smart grid architecture based on open standards, enabling seamless interoperability and distributed as opposed to centralized intelligence. Some of the advantages of a distributed architecture are scalability, reduced latency and the ability to implement security at the grid edge instead of only via the central control application.
Megatrends driving smart grid
Raiford Smith sees four megatrends as major motivating forces behind the smart grid. Moore's Law means there are intelligent devices for power networks with greater capabilities at lower cost. Metcalfe's Law means more interconnections and greater interoperability. Big data analytics means the ability to extract more meaningful information and insights from rapidly increasing volumes of data coming from thousands and even millions of intelligent devices. Distributed energy resources (DER) mean more complicated power management - balancing intermittent generation and new load profiles from an increasing number of new electronic devices.
From a business perspective a major benefit is greater customer choice. In the future the customer will be able to not only manage his/her consumption of power, but also its generation. With rooftop solar PV and batteries the customer may elect to not even be on the grid, but to create his/her own microgrid. But it also means the utility business model will have to evolve from what it has been for the past 100 years. New York is one of the jurisdictions that is already changing its regulatory framework to enable utilities to move to a new business model.
As an aside, at this year's DistribuTECH if there was one technology that seemed to be everywhere and on almost every utility's radar, it is microgrids. Duke Energy is even playing with the idea of offering microgrids as a service.
Technology roadmap for the smart grid
From Raiford Smith's perspective the technology roadmap for the smart grid involves the deployment of increasing numbers of intelligent electronic devices for sensing and for control. The challenge is federating the data from all of these devices, extracting information from it, and dispatching the information to the right control devices. From an architectural perspective this drives the need for a field message bus which enables interoperability between different devices from different vendors. It also requires a common semantic model, such as the Common Information Model (CIM), adding security at the edge of the grid in addition to the central control room, and analytics to extract information from the huge volume of data collected from the sensors.
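To give a flavour of what a field message bus looks like in practice, here is a minimal publish/subscribe sketch using the paho-mqtt client library (1.x API) and a local broker. The broker address, topic naming and JSON payload are my assumptions for illustration; they are not the OpenFMB or CIM wire formats.

```python
# A field device publishes a reading onto the bus; a controller
# subscribes. Broker, topic and payload are illustrative assumptions.
import json
import time

import paho.mqtt.client as mqtt
import paho.mqtt.publish as publish

BROKER = "localhost"                                # assumed test broker
TOPIC = "fieldbus/substation12/recloser3/reading"   # hypothetical topic

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    print(f"{msg.topic}: {reading['voltage_v']} V, {reading['current_a']} A")

subscriber = mqtt.Client()          # paho-mqtt 1.x style constructor
subscriber.on_message = on_message
subscriber.connect(BROKER)
subscriber.subscribe(TOPIC)
subscriber.loop_start()             # handle network traffic in background

# Simulate a device publishing one reading.
publish.single(TOPIC, json.dumps({"voltage_v": 7205, "current_a": 83}),
               hostname=BROKER)

time.sleep(1)                       # let the message round-trip
subscriber.loop_stop()
```

Any device that speaks the agreed topic structure and payload semantics can join the exchange, which is exactly the vendor-neutral interoperability a field message bus is meant to provide.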
To test different smart grid configurations, CPS Energy is assembling a test facility for a three year smart grid testing program. It will have 30,000 customers, 15 circuits, solar generation, smart inverters, battery storage and the ability to disconnect from the grid to form a microgrid.
Benefits of the smart grid
Raiford expects major benefits for customers and for the utility from implementing a smart grid. For customers, perhaps the biggest benefit is that the smart grid avoids divergence between utility services and customers' needs. Sometimes referred to as disaggregation, in this context this means a 3rd party coming between a utility and its customers. Historical examples are Microsoft Hohm and Google PowerMeter, which were perceived as threats because utilities found the idea of a Microsoft or Google insinuating itself between the utility and its customers unattractive. Opower is an example of a different approach that is much more attractive to utilities. Instead of doing an end-run around the utility, Opower focussed on the utility as its direct customer. Services using Opower's solutions are then offered by the utility to its customers.
Secondly, the smart grid provides a flexible foundation for providing new services to customers (which also creates new sources of revenue for utilities). Raiford suggested some examples including electric vehicles and charging, premium (high quality) power, premium reliability, asset control (for example, inverters and batteries) and advanced demand response (the utility would provide these as a service to customers, rather than customers buying these devices from a 3rd party). The result is greater customer satisfaction and improved brand recognition as CPS Energy is perceived as a leader in providing new and improved services to customers.
The benefits that Raiford sees for utilities are equally important. They include improved operational metrics (SAIDI, SAIFI, asset utilization), better financial metrics (O&M spend, revenue generation), environmental benefits, improved safety, better trained and skilled staff, and more reliable risk modeling (better predictions of revenue, costs, customer satisfaction, and asset condition).
Probably the greatest challenge Raiford sees is managing organizational change, because the smart grid will mean that just about everything will change including the utility business model and most aspects of how we design, build, maintain and operate the grid.
Drivers for utility industry change
Jason Handley, of the Emerging Technology Office at Duke Energy, reviewed some of the major drivers for industry change. Many applications currently used by power utilities are proprietary, with the result that the utility has many application silos that don't interoperate. The rapid adoption of DERs is requiring utilities to move toward faster response times, reduced costs, better safety, and improved reliability. Dynamic load management and low voltage power electronics will mean greater adoption of rooftop PVs and other DERs. Increasingly utilities will invest in standards-based, modular systems for hardware, multi-function devices, and a field message bus for software that will enable interoperability. From a business perspective broader interoperability facilitates more competition which lowers costs, encourages innovation and improves reliability.
Other important drivers that Jason sees that are impacting utilities include demand response, electric vehicles, in-premise automation, cybersecurity threats, aging infrastructure, big data complexity, and avoiding stranded assets. The smart grid is requiring utilities to change how they do things. Utilities realize they have to be more proactive in their operations, rather than waiting for something to happen and then reacting to it. Situational awareness has become a critical capability for utilities in enabling utilities to be more proactive. It is made possible by having a variety of sensors in the field that together can present a snapshot of the status of the grid. The key functionality required to enable this to happen is seamless interoperability.
Centralized or distributed intelligence
As utilities implement thousands and even millions of smart devices in the field, a centralized architecture runs into scalability and latency problems. Duke's solution is an architecture with distributed as opposed to centralized intelligence. Duke sees this as comprising layers, so that not all data needs to go to the central control application; some can be handled at lower levels. A self-healing network is an example where a problem can be handled locally without the central control application knowing anything about it. Distributed intelligence also enables fast decisions at the edge that don't have to wait for the central control application. An advantage of this architecture that cannot be overestimated is that it enables security at the edge of the grid, not just at the central control application. Based on this concept Duke has defined a Distributed Intelligence Platform (DIP) Reference Architecture designed to take advantage of the tremendous intelligence that is out in the field in addition to the intelligence in the control centre.
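As a toy illustration of the layered idea, the sketch below shows an edge controller that clears a local fault itself and passes only a summary event upstream, rather than streaming raw measurements to the central control application. Device names and thresholds are invented.

```python
# An edge controller makes fast local decisions and reports summaries.
from dataclasses import dataclass, field

FAULT_CURRENT_A = 2000.0  # assumed protection threshold

@dataclass
class EdgeController:
    feeder: str
    events: list = field(default_factory=list)

    def on_sample(self, amps: float) -> None:
        if amps > FAULT_CURRENT_A:
            self.open_sectionalizer()  # local, low-latency action
            self.events.append(
                f"{self.feeder}: fault cleared locally at {amps:.0f} A")

    def open_sectionalizer(self) -> None:
        pass  # would drive the actual switchgear

    def report_upstream(self) -> list:
        return self.events  # only summaries reach the control centre

edge = EdgeController("feeder-7")
for sample in (340.0, 390.0, 2600.0, 120.0):
    edge.on_sample(sample)
print(edge.report_upstream())
```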
Duke Energy, CPS Energy and 25 vendors, called the Coalition of the Willing (COW), have just embarked on an implementation of this architecture that supports a microgrid. The smart grid requires exchanging data between different devices from different manufacturers in the field, but traditional utility technologies are very often vendor silos built on proprietary hardware, telecommunications and software platforms. The goal of the COW is to demonstrate that data and control commands can be shared across multiple (typically proprietary) vendor platforms to achieve interoperability with lower costs and faster response times. A key part of the demonstration is an open standard field message bus implemented as an open source project. The Smart Grid Interoperability Panel (SGIP) has created an OpenFMB working group to support this effort.
For the first time in a hundred years, the electric power utility industry is undergoing a momentous change. Distributed renewable power generation, especially solar photovoltaics (PV), is introducing competition into an industry that has been managed as regulated monopolies. Consumers with solar PV panels on their roofs are fundamentally changing the traditional utility business model. A recent report from the Edison Electric Institute (EEI) refers to disruptive challenges that threaten to force electric power utilities to change or adapt the business model that has been in place since the first half of the 20th century.
Most utilities are in the midst of deploying smart grids, which basically amounts to applying the internet to the electric power grid to link intelligent electronic devices, sensors and grid control applications to enable data-driven decision making. One of the most important changes driven by the implementation of the smart grid is the much greater importance of location. Geospatial technology (location, geospatial data management and spatial analytics) is seen as a foundational technology for the smart grid.
The other major global change in energy is the shift in energy demand from the world's advanced economies to emerging economies. Energy demand from OECD countries has hit a plateau. Currently China is driving world energy demand. The International Energy Agency's (IEA) World Energy Outlook 2014 projects that as demand from China slows, world energy demand will be driven by India, the Middle East, Africa and Latin America.
Recently IDC Energy Insights released a report IDC FutureScape: Worldwide Utilities 2015 Predictions with predictions for the future of the utility business. Some of these are startling, suggesting that the utility industry is going to experience fundamental changes in how they do business over the next few years.
New business models
IDC predicts that utilities will be looking less at generation as a source of revenue. It predicts that by 2018, 45% of new data traffic in utilities' control systems will originate from distributed energy resources that are not owned by the utility.
To make up for this loss of generation revenue IDC predicts that utilities will be looking for new business opportunities such as services. Specifically, IDC predicts that utilities will derive at least 40% of their earnings from new business models by 2017.
Cloud - By 2018 cloud services will make up half of the IT portfolio for over 60% of utilities.
Integration - In 2015 utilities will invest over a quarter of their IT budgets on integrating new technologies with legacy enterprise systems.
Analytics - By 2017 45% of utilities' new investment in analytics will be used in operations and maintenance of plant and network infrastructure.
Mobility - 60% of utilities will focus on transitioning enterprise mobility to capitalize on the consumer mobility wave.
Smart systems - By 2018 cognitive systems will penetrate utilities' customer operation to improve service and reduce costs.
Some of the important drivers of these trends include the global redistribution of energy demand from the world's advanced to the emerging economies, the rapid emergence of cloud-based provisioning and services, increasing regulatory pressure responding to customer demand for more transparent and competitive energy markets, cross-industry competition for technical (especially IT) skills, smart analytics, and the beginning application of virtual and augmented reality in business.
Top 10 technology trends
In March 2014 Gartner, Inc. identified the top ten technology trends which it saw impacting the global energy and utility markets. There is considerable overlap between IDC's business predictions and the technology trends identified by Gartner, Inc.
Social Media
Social media are beginning to be used as a customer acquisition and retention medium, as a consumer engagement channel to drive customer participation in energy efficiency programs, as a source of information about outages, and in the emerging area of crowd-sourced coordination of distributed energy resources. Social media are also being used by utilities to communicate information about outages to customers.
Big Data
Smart grid will increase the quantity of data that utilities have to manage by a factor of about 10,000, according to a recent estimate. This trend is driven by intelligent devices, sensors, social networks, and new IT and OT applications such as advanced metering infrastructure (AMI), synchrophasors, smart appliances, microgrids, advanced distribution management, remote asset monitoring, and self-healing networks. The type of data that utilities will need to manage will change: for example, real-time data from sensors and intelligent devices including smart phones and unstructured data from social networks will play a much greater role for utilities in the future.
Mobile and Location-Aware Technology
Mobile and location-aware technology, which includes hardware (laptops and smartphones), communication products (GPS-based navigation, routing and tracking technologies), social networks (Twitter, Facebook and others) and services (WiFi, satellites, and packet switched networks), is transforming all industries. Utilities for the most part have been slow to adopt consumer mobile technology, but this is changing.
Cloud Computing
According to Gartner, areas such as smart metering, big data analytics, demand response coordination and GIS are driving utilities to adopt cloud-based solutions. Early adopters of cloud technologies include small utilities with limited in-house IT skills and budgets; organizations which provide application and data services to multiple utilities, such as cooperative associations and transmission system operators; and investor-owned utilities (IOUs) conducting short-term smart grid pilots.
Sensors
Sensors, which are being applied extensively throughout the entire supply, transmission and distribution domains of utilities, provide a stream of real-time information from which a real-time state of the grid can be derived.
IT and OT Convergence
Virtually all new technology projects in utilities will require a combination of IT and OT investment and planning, such as AMI or advanced distribution management systems (ADMSs). This will be a challenge for many utilities, especially smaller ones, which don't have in-house IT skills.
Advanced Metering Infrastructure
AMI provides a communication backbone aimed at improving distribution asset utilization and facilitating consumer inclusion in energy markets.
Internet of Things
Sensors and actuators embedded in physical objects are linked through wired and wireless networks, using the same Internet Protocol (IP) that connects the Internet. When intelligent objects can both sense the environment and communicate, they become tools for understanding utility grids and responding to changes in near real-time. Following McKinsey, there are two key benefits arising from the Internet of Things for utilities:
Enhanced situational awareness - achieving real-time awareness of the physical environment, in this case, the grid
Sensor-driven decision analytics - assisting human decision making through deep analysis and data visualization (a minimal sketch follows below)
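As a minimal illustration of sensor-driven decision analytics, the sketch below flags readings that drift outside a rolling normal band. The window size, threshold and frequency data are arbitrary illustrative choices, not utility practice.

```python
# Flag sensor readings that deviate from a trailing rolling window.
from collections import deque
from statistics import mean, stdev

def anomaly_flags(readings, window=20, sigmas=3.0):
    """Yield (value, is_anomaly) pairs using a trailing window."""
    history = deque(maxlen=window)
    for value in readings:
        if len(history) == window and stdev(history) > 0:
            z = (value - mean(history)) / stdev(history)
            yield value, abs(z) > sigmas
        else:
            yield value, False
        history.append(value)

# Example: a steady grid-frequency signal with one excursion.
stream = [60.0 + 0.01 * (i % 5) for i in range(40)]
stream[30] = 59.2
for value, flag in anomaly_flags(stream):
    if flag:
        print(f"anomalous reading: {value} Hz")
```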
Asset Performance Management
Traditional asset management approaches are too limiting for today’s performance-based, data-driven utility environment. Asset performance management solutions need to deliver real-time equipment performance, reliability, maintenance and decision support for effective resource management. Operations and maintenance teams are then empowered with real-time decision support information: the right information to the right people at the right time and in the right context. The result is improved operational performance and better asset availability and utilization.
Business Intelligence and Advanced Analytics
Analytics will become essential as the volume of data generated by intelligent devices and sensors, mobile devices (the Internet of Things) and social media increases and huge pools of structured and unstructured data need to be analyzed to extract actionable information. Analytics will become embedded everywhere, often invisibly.
For the first time in a hundred years, the electric power utility industry is undergoing a momentous change. Distributed renewable power generation, especially solar photovoltaics (PV), is introducing competition into an industry that has been managed as regulated monopolies. Consumers with solar PV panels on their roofs (and in the not-too-distant future with Tesla batteries in the basement) and companies like SolarCity (co-founded by Tesla co-founder Elon Musk) are fundamentally changing the traditional utility business model. A recent report from the Edison Electric Institute (EEI) refers to disruptive challenges that threaten to force electric power utilities to change or adapt the business model that has been in place since the first half of the 20th century.
As a result, every aspect of the electric power industry is changing. One of these changes involves the role that geospatial data and technology play in the electricity industry. In the past, geospatial has been a tactical tool. It was (and still is) used in a variety of applications: outage management, asset management, mobile workforce management, energy density modelling, vegetation management, demand modelling, transmission line siting, substation siting and design, energy performance modelling of buildings, disaster management, and mapping renewable resources, to name just a few. However, with the changes that the industry is undergoing now, geospatial is poised to become a foundation technology for the smart grid.
The Energy Issue of Geospatial World Magazine explores the impact that this momentous change is having on the application of geospatial technology in the electric power utility sector. Below I'm providing an overview of the material relating to electric power you'll find in this issue.
GIS has been widely used by utilities for years for automated mapping/facilities management, back office records management, asset management, transmission line siting, and more recently for design and construction, energy conservation, vegetation management, mobile workforce management (MWFM), and outage management (OMS). Now, utilities are integrating GIS with advanced metering infrastructure (AMI) and supervisory control and data acquisition (SCADA) systems. Intelligent design has crossed over from the office to the field in utilities, also enabled by the capabilities of GIS, says Bentley's Cindi Smith. Geospatial-related analytics (spatial analytics) is seen as one of the key aspects of success for electric utility operations in the smart grid era. Looking for patterns and correlations between land, weather, terrain, asset, and other types of geodata will be increasingly important for utilities. Power-related analytics with geospatial components include network fault tracing, load flow analysis, Volt/VAR analysis, real-time disaster situational awareness, condition-based maintenance, and vegetation management. The smart grid is all about situational awareness and effective anticipation of and response to events that might disrupt the performance of the power grid. Since spatial data underlies everything an electric utility does, GIS is the only foundational view that can potentially link every operational activity of an electric utility, including design and construction, asset management, workforce management, and outage management as well as SCADA, distribution management systems (DMSs), renewables, and strategic planning.
Peter Batty reports on the major growth in geospatially-enabled Web and mobile applications with a special focus on the open source geospatial community and the significant impact of these technologies in the utility sector. "In general, there are a lot of geospatial open source software components available now that have the capabilities and robustness to be used in serious enterprise applications." John McDonald, Chairman of the Smart Grid Interoperability Panel, has long been a firm believer that geospatial information is part of the foundational platform for the smart grid. SGIP has signed a memorandum of understanding with the Open Geospatial Consortium with the goal of incorporating more geospatial standards into SGIP standards. Cindi Smith of Bentley goes even further and argues that “geospatial technology is already a foundational component of electric power utilities’ IT/OT systems. Smart grid simply brings more focus to the role it can play by virtue of the visibility of smart grid projects and processes in a utility and their need to exploit the vast amounts of data produced by the smart grid." Loek Bakker & Jan van Gelder of Alliander, a Dutch utility company, describe how essential it has been for Alliander to integrate GIS, ERP and SCADA systems to get a correct picture of its assets. As electric utilities evolve into increasingly data-driven organisations, Jeffrey Pires and G. Ben Binger describe how GIS is fast emerging as the backbone for data management platforms.
Cities are beginning to develop 3D models of underground infrastructure, motivated by new underground remote-sensing technologies and by ROIs of up to US$21.00 saved for every US$1.00 spent on improving the quality level of subsurface utility information. Steve Dibenedetto, Senior Geoscientist and Technology Manager at Underground Imaging Technologies (UIT), part of Caterpillar, describes new remote-sensing technologies, such as ground penetrating radar (GPR) and electromagnetic induction (EMI), for detecting and geolocating underground utility infrastructure in 3D.
India's ongoing Restructured Accelerated Power Development and Reforms Programme (R-APDRP) is one of the largest IT initiatives by electric utilities anywhere in the world - in one integrated project, all state-owned distribution utilities in India are building IT infrastructure, IT applications and automation systems. The programme set out to create baseline data in the form of consumer indexing, GIS mapping and asset mapping. Reji Pillai & C. Amritha assess how GIS can be applied in this context.
Integrating geospatial and BIM is a key enabler for energy performance modeling which is a fundamental instrument for reducing the energy consumption and improving the energy performance of new and existing buildings. According to a report from Navigant Research, global zero energy buildings revenue is expected to grow from $629.3 million in 2014 to $1.4 trillion by 2035.
Wolfgang Eyrich of Entegra shares how Entegra's primtech product, which is designed to help substation designers deliver designs based on integrated product modelling, provides a geographical context for substation design.
Matt Zimmerman of Schneider Electric highlights one of Schneider Electric's key technologies, "graphic work design", which integrates geospatial and engineering design (CAD or BIM). Schneider Electric's geospatial division focuses on developing integrated, location-aware enterprise solutions, such as integrating outage management (OMS), customer information systems (CIS), GIS, and an external weather reporting and forecasting service to help plan crew deployment during a storm. Matt foresees that location-aware predictive analytics for electric networks is going to be one of the major development areas for utilities in the future. Brad Williams of Oracle points out that spatial analytics is becoming a key technology for electric utilities because everything a utility does - customers, assets, and operations - involves location.
One of the biggest challenges utilities are experiencing is increasing volumes of structured and unstructured data (big data) that are overwhelming traditional enterprise systems. The structured data comes from smart meters and intelligent electronic devices, and the unstructured data from social networks including Twitter, Google, Facebook and other social applications. The consumerization of geospatial technology (we are all GPS-enabled sensors) will enable crowd-sourcing of new sources of information about electric power networks, most of which involve location (big spatial data).
At the SPAR International conference Avideh Zakhor of the University of California, Berkeley gave an enthralling presentation about her research on capturing optical and thermal point clouds of the insides of buildings using a portable backpack containing a collection of sensors. Her team has also developed analytical software tools for generating floor plans, distinguishing rooms, identifying staircases, texture mapping surfaces, identifying windows, finding heat sources such as computers and human occupants, identifying lights, and even estimating plug load power consumption - all automatically. The software her team has developed can generate input in the form of an IDF file for the Department of Energy's EnergyPlus energy performance analysis program. Avideh said that soon it will also be able to generate a gbXML file, which can be consumed by almost all building energy performance analysis software.
Man portable system
The "man portable system" her team has developed is a 32 pound backback with a collection of sensors. There are sensors for accurately tracking the six degrees of spatial freedom which determine the location and orientation of the backpack in addition to sensors for recording optical and thermal point clouds. The operator walks through the building going into all rooms, halls, stairwells and other human accessible spaces. With the backback this is a much faster process than capturing indoor spaces with static laser scanners. For example, Avideh's team was able to scan Union Station in Washington DC in a matter of hours. A comparable static scan takes days.
Automated feature extraction
The resulting georeferenced data is then offloaded from the backpack and analyzed to generate different 3D data products including optical and thermal 3D point clouds, a colourized 3D point cloud, surface texture maps, and triangulated mesh surfaces. Two different triangulated surfaces are generated for different applications - a very detailed surface and a much simpler model of spaces and walls. Using these data products the software automates feature extraction. It can distinguish windows, staircases, and individual rooms and generate a floorplan. Surface textures can be added to create a photorealistic 3D rendering. By analyzing both the optical and thermal point clouds the software tools can identify heat sources, including computers and human occupants, and can even identify lights and estimate plug loads.
Building energy performance modeling
Using the results of the automated feature extraction, the software tools can generate the input required for a building energy performance audit. Currently the tools are able to generate input in the form of an IDF file for the Department of Energy's EnergyPlus whole building energy simulation program. Avideh said that soon it will be possible to generate a gbXML file, which can be consumed by most building energy performance analysis applications including IESVE and Green Building Studio. This clearly has the potential to convert what is now a slow, manual, expensive process into a fast, repeatable, affordable process.
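To suggest what that last step might look like, here is a hedged sketch that writes a bare-bones EnergyPlus IDF Zone object for each extracted room. The room data is hypothetical, and a real export like the one described above would also emit surfaces, constructions, windows and loads.

```python
# Write a minimal EnergyPlus IDF file from extracted room geometry.
# Room names and origins are hypothetical stand-ins for the output of
# the automated feature extraction.
rooms = [
    {"name": "Room_101", "origin": (0.0, 0.0, 0.0)},
    {"name": "Room_102", "origin": (6.5, 0.0, 0.0)},
]

def zone_idf(room: dict) -> str:
    x, y, z = room["origin"]
    return (
        "Zone,\n"
        f"  {room['name']},  !- Name\n"
        "  0,               !- Direction of Relative North {deg}\n"
        f"  {x}, {y}, {z};   !- X, Y, Z Origin {{m}}\n"
    )

with open("building.idf", "w") as f:
    f.write("Version, 9.4;\n\n")
    for room in rooms:
        f.write(zone_idf(room) + "\n")
```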