At this year's RICS BIM conference the elephant in the room was April 4th 2016, the deadline for all public construction projects to be BIM Level 2 compliant. This is the culmination of a five-year process, starting in 2011, focussed on developing a set of documents defining BIM Level 2 and providing best practices for achieving compliance.
This year's conference suggests that many in the industry are now wrestling with implementation issues and reporting a shortage of BIM skills. They are finding that BIM is really about data, and are realizing that data management and analytics are essential to handling and interpreting the huge volume of data being generated. A shortage of data scientists to help them do this has become an acute problem.
The government's focus from the beginning has been on full life cycle BIM. Back in 2010/2011, the Government's BIM roadmap was perceived to be very aggressive, mandating BIM Level 2 compliance for all national government projects after just five years. In the UK, public projects comprise about 50% of the construction industry, so this amounted to radically transforming the UK construction industry in half a decade. Put another way, the construction industry had to move from thinking analog to thinking digital in just five years.
The utility industry has been going through a similar transition in moving to what is called a smart grid and distributed generation. Some think that this process will take 20 years, but new technology and a younger generation steeped in digital are making it happen much sooner. Energy is going digital more rapidly than expected. I think it likely that something similar will happen in the construction industry, because digital enables all sorts of efficiencies that we often cannot even imagine. Uber and Airbnb are classic examples; both are the largest or second largest companies in their respective industries. Gartner has forecast that a similar type of company will emerge for managing energy flows in the electric power industry by 2020.
For 2012, the Government's strategy focussed on discovery through pilot projects, notably the Ministry of Justice Cookham Wood project, a £20 million project where it is estimated that BIM saved hundreds of thousands of pounds in CAPEX costs.
At the 2013 conference David Philp, Head of BIM Implementation, Cabinet Office, UK Government, announced PAS 1192-2, a key document supporting the Construction BIM Strategy to achieve Level 2 compliance. It specified requirements for achieving BIM Level 2, set out the framework for collaborative working on BIM-enabled projects, and provided specific guidance on the information management requirements associated with projects delivered using BIM. David Philp made it clear that although the initial focus is on the design/build part of the lifecycle, with the goal of saving 20% of CAPEX and reducing the 30% waste in most construction projects, the "largest prize for BIM lies in the operational stages of the project life-cycle".
At the 2014 conference Peter Hansford, Government Chief Construction Advisor, linked the overwhelming interest in BIM, evidenced by the sold-out RICS BIM conference, to the industry's growing appetite for BIM. He saw a shift in emphasis from asking "should we do BIM?" to "how do we do BIM?" This was happening in spite of the fact that, as Rich Saxon, the UK Government's BIM Ambassador for Growth, pointed out in a panel discussion on the future of BIM, some aspects of BIM Level 2 were still incomplete.
Deborah Rowland, Head of Facilities Management, Government Property Unit, Cabinet Office in the UK Government, outlined the Government's motivation and plans for a program to improve the post-construction handover and operation of newly constructed buildings. She described the Government Soft Landings (GSL) program, designed to put in place a legal, contractual, and technical framework (based on the BSRIA Soft Landings Framework) incorporating building information modeling (BIM) to fix the problem by ensuring continuity throughout the building lifecycle from inception, through design, construction, commissioning, training and handover, to operations and maintenance for 1-3 years after handover.
At the 2015 conference seven key BIM Level 2 documents (PAS 1192-2, PAS 1192-3, BS 1192-4, the CIC BIM Protocol, Classification (Uniclass), the Digital Plan of Works / Levels of Detail (LoD), and Government Soft Landings), which collectively represented the current repository of BIM best practices, were announced. Every speaker at the conference referenced these documents.
Seven major government projects using BIM, worth about £10 billion, have maintained records of the savings they have seen from using BIM. The measured benefits attributed to BIM and related factors range between 15% and 20%. By 2014 the government estimated that it had achieved savings of £1.4 billion through BIM and related improvements in the 2013/2014 fiscal year. But with full lifecycle BIM, the government saw an upside potential of 40%+, with the biggest benefits being realized in the operations phase of the building lifecycle.
This year it was announced that Government statistics show that in 2015 Her Majesty's Government saved £3 billion on these demonstration projects as a result of gains in construction productivity related to BIM.
But BIM Level 3 is where the gold is. Level 3 will enable the interconnected digital design of different elements in a built environment and will extend BIM into the operation of assets over their lifetime. It will support the accelerated delivery of smart cities, services and grids. As the AGI has pointed out in a recent Foresight Report, integrated BIM and geospatial technology will be fundamental for BIM Level 3.
An industry survey in the UK in 2013 reported that 54% of respondents were using BIM on projects. Most of those not yet using BIM said that they would be within one to two years, and 70% of the respondents using BIM said that it had given them a competitive advantage. This year's conference suggests that most in the industry are now doing BIM, although for some it is not really clear what BIM Level 2 is or how one demonstrates compliance.
This year's RICS BIM conference was quite different from previous conferences. The focus was heavily on implementation, not on progress in completing specifications or developing and working with 3D models as in previous RICS BIM conferences.
The event was led off by a panel discussion that set the tone for the rest of the conference: the focus was on data and people rather than 3D BIM models. The industry people on the panel were patently up to their elbows in real-world BIM projects, grappling with data, the analytics needed to make sense of it, and the shortage of the required skills (data scientists and people with BIM skills).
Alex Jones, Interserve, said that within the construction industry there is now a focus on data and how data is used downstream. He said that clients (owners) are now aware of BIM and that both private and public procurements are asking for BIM. But he qualified this: they are not so much asking for BIM (as in 3D models) as asking for data that can be used downstream. David Throssel, Skanska, added that "deliverables" now means data.
In the context of data, Simon Rawlinson, Arcadis UK, gave an example from utilities, where smart devices and automated capture have produced a database of asset conditions that also includes geospatial data for geolocating those assets. This information allows utilities to move to condition-based maintenance, where prioritizing maintenance and replacement depends on asset condition, not the calendar. Real-time monitoring is being adopted by virtually all industries; for example, devices are being implanted in structures such as bridges to monitor distortions. He added that leadership has not yet recognized that sensors and analytics enable a major change that will revolutionize asset maintenance.
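To make the idea concrete, here is a minimal sketch of condition-based prioritization: rank assets by a measured condition score rather than by age. The asset fields, weights and values are hypothetical illustrations, not anything presented by the panel.

```python
import pandas as pd

# Hypothetical asset-condition records, e.g. assembled from smart-device
# readings; column names, values and weights are illustrative only.
assets = pd.DataFrame({
    "asset_id":    ["TX-101", "TX-102", "TX-103", "TX-104"],
    "age_years":   [38, 12, 25, 7],
    "dga_score":   [0.82, 0.35, 0.91, 0.15],   # dissolved-gas index, 0-1
    "load_factor": [0.65, 0.95, 0.80, 0.40],
    "lat":         [51.50, 51.52, 51.48, 51.54],
    "lon":         [-0.12, -0.10, -0.14, -0.08],
})

# Condition-based ranking: a weighted condition score decides what gets
# maintained first -- the calendar (age_years) never enters the score.
assets["condition_score"] = 0.7 * assets["dga_score"] + 0.3 * assets["load_factor"]
worklist = assets.sort_values("condition_score", ascending=False)

print(worklist[["asset_id", "condition_score", "lat", "lon"]])
```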
The new digital economy is also disintermediating certain skills; for example, digitalized construction requires fewer layers. Simon cited a KPMG study of disintermediation by technology. As an example, he mentioned data such as that collected by utilities from smart devices, which allows owners/clients/utilities to do condition-based maintenance and can lower field staff requirements significantly.
In the construction industry as a whole, one of the major issues that emerged is a shortage of BIM-related skills. Simon emphasized that now that we have lots of data, what we urgently need are data scientists to help make sense of it. He identified this as a major problem because in his experience data scientists are as rare as "hens' teeth".
David Hancock, UK Government Construction, said that the construction industry still thinks analog. Nowadays we can all produce data, but analyzing it is the challenge; we need people who can analyze data to produce actionable items. He also identified the BIM skills shortage as a major problem, and repeated that data scientists are "as rare as hens' teeth". He suggested that one solution is upskilling: training existing construction, building maintenance and utility staff in BIM and data science. Another may be raiding the U.S. baseball industry, where teams such as the Oakland Athletics started using analytics several years ago. "Moneyball" analytics has been recommended as a strategy for the utility sector in the U.S.
In the question period a speaker argued that the 3D BIM model has been "massively overplayed". BIM is really about data and collaboration, not about 3D. To maximize the benefits of BIM, clients (owners) have to be clear about what the model, and much more importantly the data associated with the model, are going to be used for. For example, if it is for facilities management (FM), then clients (owners) have to be specific about what data is required for maintaining and operating the building as part of defining deliverables.
What if you are a small company with no more experience with BIM than discovering what the acronym stands for? At RICS BIM 2016 there was a sequence of presentations by a group of mostly quantity surveyors (people who do quantity takeoff), all BIM novices, who came together to try to understand and share openly what BIM Level 2 means practically. (As one speaker pointed out, there is no legal definition of BIM Level 2 compliance.) They called their project KT4BIM, which you can find on LinkedIn. The objective is to answer two questions:
how do we do BIM Level 2?
how do we demonstrate compliance?
The target audience is BIM novices in both the residential and commercial sectors. The objective is to share all the material used and developed as part of this project with the industry. More information about this very important project will be provided in my next blog post.
Keep a close watch on the Hawaiian power utilities (HECO), because Hawaii has committed to 100% renewables by 2045 and 65% by 2030. That was the recommendation of Sharon Allan, CEO of the Smart Grid Interoperability Panel (SGIP), at a DistribuTECH 2016 MegaSession on distributed energy resources (DER), in response to a question from the audience asking what utilities should do to get ready for GRID 3.0.
Hawaii already has 487 MW of solar PV capacity, 90% of which is residential rooftop panels. At DistribuTECH 2016, Talin Sakugawa from HECO and Matthew Shawver from in2lytics gave an insightful presentation on distributed energy resources (DER), big data and analytics at the Hawaiian power utility. Hawaii has seen an exponential rise in rooftop solar PV, and as a result the utility has been experiencing a reduction in net load and revenue since 2010. The most dramatic drop in load occurs during daytime hours, as is readily apparent when comparing the utility's load on clear and cloudy days, reflecting the impact of residential solar PV.
The Rocky Mountain Institute (RMI) has published an analysis of the economics of leaving the grid. The central thesis is that solar PV plus electricity storage enables consumers to leave the grid completely. Solar PV has already reached grid parity in about 10% of the U.S., and declining PV prices suggest that this trend will continue. However, without the ability to store electric power, consumers with rooftop PV still require the grid for nights and cloudy days. Combined solar PV and battery offerings from Tesla and others promise to make solar + storage accessible to increasing numbers of consumers. These consumers could form their own microgrid, either by themselves or with their neighbours, and disconnect from the grid. Alternatively, they could become a source of dispatchable power and become an energy provider. Increasingly, they could find that either of these options is economically advantageous. This has serious implications for local utilities, because if this trend develops it will seriously erode the traditional utility revenue base. It could also lead to a completely decentralized grid composed of many microgrids, which from an operations perspective increases the complexity of managing the grid compared with today's centralized model.
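The underlying economics can be sketched in a few lines: compare the annual cost of staying on the grid with the annualized cost of a solar + storage system. Every number below is a hypothetical illustration for a high-rate market, not a figure from the RMI analysis.

```python
# Back-of-envelope grid-defection test; all inputs are hypothetical.
annual_consumption_kwh = 9000
grid_rate = 0.30            # $/kWh retail (a high-cost market, e.g. an island)
system_cost = 25000.0       # installed PV + battery, $
system_life_years = 20
annual_o_and_m = 300.0      # $/year

grid_cost = annual_consumption_kwh * grid_rate
defection_cost = system_cost / system_life_years + annual_o_and_m

print(f"staying on grid: ${grid_cost:,.0f}/yr, leaving: ${defection_cost:,.0f}/yr")
# Defection pays off where retail rates are high and hardware is cheap --
# exactly the trend the RMI thesis depends on.
```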
The other challenge for a utility with a high penetration of intermittent, distributed generation (DER) is load balancing, ensuring that generation meets demand. The German power industry, the U.S. Department of Energy, Hawaiian power utilities, California power utilities, ERCOT, and many others are working to address this challenge. HECO's presentation at DistribuTECH gave an insightful overview of what they are doing to meet this technical challenge.
HECO collects a large amount of data from diverse sources, comprising internal operational data plus data from external sources. It includes weather forecasts, data from customer-sited PV, consumer/public resource data, phasor (PMU) data, renewables power quality, feeder data, irradiance meters, generator and substation power quality, SCADA, and vendor data such as forecasts and data from Independent Power Producers (IPPs). The data has widely different time resolutions: PMUs sample 30 times per second, SCADA reports every 2 seconds, gross load is calculated every 2 seconds, PV inverter data is reported every 5 minutes and aggregated monthly, weather forecasts are reported every 15 minutes and aggregated daily, transformers collect data every 15 minutes and aggregate it monthly, and smart meters report every 15 minutes and aggregate quarterly. Much of the data includes location, and it comes in different formats and with different levels of quality.
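A common first step with such multi-rate data is to resample every feed onto a shared time grid before analysis. A minimal sketch in Python/pandas, with synthetic stand-ins for three of the feeds described above (the cadences come from the text; the values are invented):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Three hypothetical feeds at the cadences described above:
# SCADA every 2 s, PV inverters every 5 min, weather every 15 min.
scada   = pd.Series(rng.normal(50, 2, 1800),
                    index=pd.date_range("2016-02-01", periods=1800, freq="2s"))
pv      = pd.Series(rng.uniform(0, 4, 12),
                    index=pd.date_range("2016-02-01", periods=12, freq="5min"))
weather = pd.Series(rng.uniform(200, 900, 4),
                    index=pd.date_range("2016-02-01", periods=4, freq="15min"))

# Resample everything onto a common 5-minute grid before analysis:
# averages for fast feeds, forward-fill for slow ones.
aligned = pd.DataFrame({
    "scada_mw":       scada.resample("5min").mean(),
    "pv_mw":          pv.resample("5min").mean(),
    "irradiance_wm2": weather.resample("5min").ffill(),
})
print(aligned.head())
```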
Fundamental to distributed energy are weather maps. Maps of irradiance and wind velocity can be used directly to estimate or predict solar and wind generation in different parts of the islands. Temperature maps are also important because the efficiency of solar panels depends on temperature. The more accurate the weather forecasts, the more predictable the generation. Knowing that it will be cloudy over the south of Oahu but sunny over the rest of the island, or cloudy over the whole island but with moderate to strong winds, allows operations to estimate how much backup generation is required and where.
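The mapping from a weather-map cell to expected PV output can be sketched with textbook approximations: output scales with irradiance relative to the 1000 W/m2 standard test condition, derated for cell temperature. The coefficients below are generic illustrative values, not HECO's models.

```python
def pv_output_mw(irradiance_wm2, ambient_temp_c, rated_mw,
                 temp_coeff=-0.004, noct_c=45.0):
    """Rough PV fleet output for one weather-map cell.

    Output scales linearly with irradiance relative to 1000 W/m^2 (STC),
    derated for cell temperature above 25 C. Coefficients are typical
    textbook values, not a specific utility's model.
    """
    # Cell temperature estimated from ambient + irradiance (NOCT method).
    cell_temp = ambient_temp_c + (noct_c - 20.0) * irradiance_wm2 / 800.0
    derate = 1.0 + temp_coeff * (cell_temp - 25.0)
    return rated_mw * (irradiance_wm2 / 1000.0) * max(derate, 0.0)

# A sunny cell vs. a cloudy one, 300 MW of installed PV in each:
print(pv_output_mw(850, 29, 300))   # ~ 224 MW
print(pv_output_mw(200, 26, 300))   # ~  58 MW
```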
Data from new sources relating to distributed renewable generation is integrated with traditional sources in a distributed energy management system, which gives operations and planning visibility into how distributed generation is impacting the grid. It not only provides a view into the status of the grid (situational awareness) but also allows simulations to assess the impact of different future renewable generation scenarios on the grid. This helps to determine where backup generation may be required because of high demand and low renewable generation, or where curtailment may be needed when there is excess generation and insufficient load. Location is fundamental to the analytics, because wind and irradiance vary widely over the islands.
Traditional input sources include SCADA and data from transmission, generators, protection, and so on. The new input data sources include GIS-based infrastructure models and data from the distributed generation sources. Also new is weather forecasting, which enables predicting generation from renewable sources; this includes historical and actual weather reports as well as satellite and other weather data.
HECO uses an IT platform from in2lytics (a spinoff from Referentia) which integrates all of these time series data with EMS, SCADA, CIS, and GIS and makes them available to operational users, planners, modelers and the public. in2lytics is a high-performance time series database specially designed to provide instant data accessibility for planning and operational decision making. It enables loading, querying, analysis (including with MATLAB), and sharing of the "big data" required to monitor and manage today's complex electric grid, and it has native high-performance interfaces for MATLAB in addition to programming languages. It translates data from different sources into its internal time series database, and all data is archived in a spatially enabled time series database for future analysis. Matthew said that in2lytics is able to analyze large volumes of historical time series data very rapidly; for example, a longitudinal analysis on two years' worth of archived data can be run in an hour or two.
Accounting for renewables requires a lot of data. Areas that use this data include generation planning, load forecasting, distribution planning, and contract administration. One of the first applications is a customer-facing web site called Renewable Watch - Oahu, which reports total renewables generation, solar and wind, on Oahu in real time.
An example of an application that will be used for contract administration beginning next month estimates how much power is lost from Independent Power Producers (IPPs) in the case of curtailment. It uses weather data such as wind speed, together with irradiance meters, to estimate wind and solar generation potential.
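For the wind side, such an estimate amounts to running observed wind speeds through a turbine power curve and subtracting what was actually delivered. A toy sketch with a generic cubic power curve and invented numbers, not any IPP's actual curve or data:

```python
import numpy as np

def wind_power_curve_mw(wind_ms, rated_mw, cut_in=3.0, rated_ms=12.0, cut_out=25.0):
    """Simplified turbine power curve: cubic ramp between cut-in and rated
    wind speed, flat at rated, zero outside the operating band.
    Parameters are generic illustration values."""
    w = np.asarray(wind_ms, dtype=float)
    frac = np.clip((w - cut_in) / (rated_ms - cut_in), 0.0, 1.0) ** 3
    return np.where((w >= cut_in) & (w <= cut_out), rated_mw * frac, 0.0)

# Hourly wind speeds during a hypothetical curtailment event.
wind_ms = np.array([10.0, 11.0, 12.0, 12.5])

potential_mw = wind_power_curve_mw(wind_ms, rated_mw=30.0)
cap_mw = 20.0                                   # curtailment cap ordered by the utility
actual_mw = np.minimum(potential_mw, cap_mw)    # what the farm was allowed to deliver
lost_mwh = np.sum(potential_mw - actual_mw)     # 1-hour steps
print(f"estimated curtailed energy: {lost_mwh:.1f} MWh")
```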
Another analytical application was used to study how solar PV variability affects transformer tap changes. This application allows HECO to relate frequency of tap changes to solar PV variability and to determine which transformers are most affected by solar variability.
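One plausible shape for that analysis, sketched with synthetic data (the variability index, transformer names and sensitivities below are all invented):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Hypothetical daily records: a PV-variability index (e.g. std-dev of
# 5-minute irradiance ramps) and tap operations per transformer per day.
days = pd.date_range("2015-06-01", periods=90, freq="D")
pv_variability = pd.Series(rng.uniform(0, 1, len(days)), index=days)

taps = pd.DataFrame(index=days)
for xfmr in ["LTC-A", "LTC-B", "LTC-C"]:
    sensitivity = rng.uniform(2, 15)   # how strongly this unit reacts
    taps[xfmr] = rng.poisson(3 + sensitivity * pv_variability)

# Correlate each transformer's daily tap count with the variability index;
# the most PV-sensitive units float to the top of the ranking.
ranking = taps.corrwith(pv_variability).sort_values(ascending=False)
print(ranking)
```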
HECO has a number of projects with the Department of Energy and other partners.
The Department of Energy's "Integrating System to Edge-of-Network Architecture and Management" (SEAMS) project: a federally funded research initiative on high-penetration grids which aims to streamline grid planning and operations for utilities in regions with high concentrations of distributed generation (DG) resources.
The Department of Energy's Distributed Resource Energy Analysis and Management System (DREAMS): a system to provide grid operators with useful information about the distributed grid.
Partners: Siemens, Alstom, DNV, AWS Truepower, Referentia, Sunshot
Advisory Team: SMUD, Southern California Edison, California ISO, TEP, APS, Kaua'i Island Utility Cooperative, Western APA
Making sense of synchrophasor data for utilities: the Synchrophasor Visual Integration and Event Evaluation for Utilities (SynchroVIEEU) with High Penetrations of Renewables project aims to accelerate the integration of synchrophasor information into production-grade data visualization and analysis platforms/models, to leverage PMU capability at many substations to provide real-time visibility and real-time data, and to make synchrophasor data accessible for efficient and reliable operation of a modern grid in light of high penetrations of renewable resources.
Partners: SEL, DNV GL, Referentia
Monitoring, planning, and modeling a high-PV-penetration microgrid: decision support for microgrid design and operation at Marine Corps Base Hawaii. The Office of Naval Research (ONR) has launched an ambitious program, the Energy Systems Technology and Evaluation Program (ESTEP), to demonstrate and evaluate energy technologies using Navy and Marine Corps facilities as test beds. Program management is handled by the Space and Naval Warfare Systems Command (SPAWAR) Systems Center Pacific (SSC Pacific). Established in 2013, ESTEP brings together the Department of the Navy, academia, and private industry to investigate and test emerging energy technologies at Navy and Marine Corps installations. At present, ESTEP conducts over 20 in-house government energy projects, ranging from energy management to alternative energy and storage technologies.
The ultimate objective for HECO is to become a company offering diversified services providing value to engaged customers, of course with satisfied regulators and sustainable costs and margins.
It was clear from this presentation that weather maps (forecast, actual and historical) are essential to HECO, or to any utility with a high penetration of solar or wind, because they enable the utility to project generation. They also allow utilities to analyze the historical record in conjunction with other data to discover trends that may help improve forecasting. Collecting and analyzing temporal geospatial data is fundamental to distributed energy management, in addition to the expanded application of geospatial technology for utility operations and planning in the smart grid era.
I am at DistribuTECH this week in Orlando. This is North America's largest electric power utility event. I don't know the number of attendees this year, but last year it was close to 10,000.
The first talk I heard was riveting, and it hit on a key theme of this year's conference: big data and analytics. It was given by Lee Krevat of Sempra US Gas and Power and Tim Fairchild of SAS and was entitled "What can a regulated electric utility learn from Moneyball?" Moneyball is a book by Michael Lewis about how the Oakland A's applied big data and analytics to become one of baseball's top teams without having the deep pockets of the richest teams.
Moneyball for transformers
As an example of the relevance of Moneyball, one utility took every bit of data that could potentially be relevant to the lifecycle of a transformer, some 80 variables in all, and ran correlations with transformer failure data for 1,500 transformers. It found that just 10 of the variables predicted 91% of transformer failures; the utility's traditional way of forecasting failure predicted only 15% of failures. If you ask an experienced power engineer what the most important factor in determining the lifetime of a transformer is, the historical load, and especially the overload, on the transformer would be at the top of the list. But the correlation analysis showed that load was only the eighth most important factor. The most important factor was the number of meters attached to the transformer, which probably wouldn't have been on anyone's list.
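The method itself is simple enough to sketch: correlate every candidate variable with the failure flag and rank. The data below is synthetic and deliberately rigged so that attached meters drive failures; it illustrates the technique, not the utility's actual finding.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1500  # same order as the utility's 1,500-transformer fleet

# Synthetic stand-ins for a few of the ~80 candidate variables.
data = pd.DataFrame({
    "meters_attached": rng.integers(1, 40, n),
    "historical_load": rng.uniform(0.2, 1.2, n),
    "age_years":       rng.integers(1, 50, n),
    "oil_temp_c":      rng.normal(60, 8, n),
})
# Failure flag loosely driven by attached meters in this toy setup.
p = 1 / (1 + np.exp(-(0.15 * data["meters_attached"] - 3)))
data["failed"] = rng.random(n) < p

# The Moneyball step: correlate every candidate variable with the
# failure flag and rank -- whatever rises to the top becomes a failure
# predictor, regardless of engineering intuition.
ranking = (data.drop(columns="failed")
               .corrwith(data["failed"].astype(float))
               .abs()
               .sort_values(ascending=False))
print(ranking)
```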
Lee and Tim presented other analogies where analytics play a key role in both baseball and electric power.
The good face
Baseball scouts often profile players by the look of their face and body, a "baseball look", rather than by their actual statistics. Similarly, utilities often profile circuits to prioritize capacity investment decisions based on "gut" feeling. Now they are able to leverage data to make better investments.
Age before beauty
Scouts show a preference for younger players, even though older players with better statistics are often available without long-term contracts and at significantly lower salaries. By analogy, large utility transformers have traditionally been replaced based on age. But now statistics on many transformers are being monitored, and transformers are replaced only if their statistics are in decline and predict failure.
Lee and Tim suggested other practical applications of this approach which include using machine learning to predict failures of wind turbines and to determine why solar farms often generate less electricity than expected.
The electricity industry is undergoing a transformation. With utilities embracing geospatial technology and turning into data-driven enterprises in the smart grid era, the sector is looking at an innovative future.
The National Academy of Engineering ranks the electric power grid first among the 20 engineering achievements that had the greatest impact on quality of life in the 20th century. Modern society has reached a point where virtually every crucial economic and social function depends on the secure, reliable operation of electric power infrastructure. But because it has become so crucial to modern life, the grid faces major challenges.
The major drivers for the fundamental change underway in the electric power industry are increasing demand, universal access, decarbonizing electric power, reducing revenue losses, and grid reliability and resilience. Some of the technologies that are contributing to this transformation are intelligent devices integrated with a communications network, distributed renewable power generation especially wind and solar PV, net zero energy buildings, microgrids, and the new remote sensing technologies of subsurface utility engineering.
For an industry not known historically for rapid change, many utilities are in the midst of transforming themselves into data driven enterprises. Recently IDC published its future predictions for the development of the electric power industry over the period 2015-2018. For many in the industry these are quite startling and clearly reflect an industry that is rapidly evolving.
The technology roadmap for the smart grid involves the deployment of increasing numbers of intelligent electronic devices for sensing and for control. The challenge is federating the data from all of these devices, extracting information from it, and dispatching the information to the right control devices.
With the changes that the electric power industry is undergoing, analysts see geospatial technology poised to become a foundation technology for the smart grid. Utility GIS is expected to touch every aspect of a utility's business, affecting customers, operations and management, because geospatial is the logical technology for integrating data from intelligent electronic devices such as smart meters with the information silos associated with proprietary applications.
You can read more about smart grid and relevance of geospatial data and technology, real-time big spatial data, standards for interoperability, the importance of data quality, open source geospatial technology, spatial analytics and other aspects of the role of geospatial technology in the smart grid in Geospatial World.
INSPIRE-Geospatial World Forum 2015, a joint conference organized by the European Commission and Geospatial Media and Communications, has made available its full conference program. Almost 500 presentations are scheduled on topics including building information modelling (BIM), open data, big data analytics, open standards, linked data, cloud computing, crowdsourcing, Earth observation, indoor positioning, land information systems for smart cities, urban resilience and sustainability, health, agriculture and others. Some 2000 delegates are expected to attend from more than 80 countries. Top sponsors include Trimble, Topcon, ESRI, Digital Globe, Oracle and Bentley.
The theme of the conference is CONVERGENCE: Policies + Practices + Processes via PPP with a focus on improving coordination among policy-makers, technology providers and users. Including geospatial data and technology in construction, agriculture, health and other industry workflows is an enabler for more successful public–private partnerships (PPP) by facilitating more informed decision making among the stakeholders.
A very exciting project has been proposed at LocationTech (it's in the Project Proposal Phase as defined in the Eclipse Development Process). Simply put, GeoWave intends to do for "big data" databases (initially Apache Accumulo) what PostGIS does for SQL databases (PostgreSQL). GeoWave is open source software (licensed under Apache 2.0) that adds support for geographic objects, multi-dimensional indexing and geospatial operators to Apache Accumulo.
To deal with data volumes that are too large for traditional SQL databases, Google began developing "BigTable" in 2004: a compressed, high-performance, proprietary data storage system built on the Google File System and used by a number of Google applications, including Google Maps. Apache Accumulo is a distributed database based on Google's BigTable design and built on top of Apache Hadoop and other Apache projects. For putting geospatial data into a key/value store like Accumulo, the key concept is the "geospatial hash", which converts a 2D, 3D or 4D coordinate (longitude/latitude; longitude/latitude/elevation; or longitude/latitude/elevation/time) into a single integer index, such as a quadtree (Z-order) or other space-filling-curve index, that can be used to order and rapidly retrieve spatial data. GeoWave means that you can manage massive amounts of geoinformation in key/value databases such as Accumulo and take advantage of frameworks such as MapReduce, which Accumulo uses for distributed processing.
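GeoWave's production indexing uses more sophisticated tiered space-filling-curve schemes, but the core trick, turning two coordinates into one sortable integer, can be sketched with a simple Z-order (Morton) encoding:

```python
def interleave_bits(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of two integers into one Z-order (Morton) index."""
    z = 0
    for i in range(bits):
        z |= ((x >> i) & 1) << (2 * i)
        z |= ((y >> i) & 1) << (2 * i + 1)
    return z

def zorder_index(lon: float, lat: float, bits: int = 16) -> int:
    """Map lon/lat to a single integer key suitable for ordering rows in a
    key/value store: nearby points usually end up with nearby keys."""
    x = int((lon + 180.0) / 360.0 * ((1 << bits) - 1))
    y = int((lat + 90.0) / 180.0 * ((1 << bits) - 1))
    return interleave_bits(x, y, bits)

# Two nearby points land close together in key space:
print(zorder_index(-157.8583, 21.3069))
print(zorder_index(-157.8600, 21.3100))
```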
Connecting Accumulo to GeoServer, GeoTools, and PDAL
In addition, GeoWave includes a GeoServer plugin that enables geospatial data in Accumulo to be shared and visualized via GeoServer's OGC-standard web services. It also provides plugins connecting the popular geospatial toolkit GeoTools and the point cloud library PDAL to an Accumulo-based data store; the PDAL plugin makes it possible to interact with point cloud data in Accumulo through the PDAL library.
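From a client's point of view, a GeoWave-backed layer published through GeoServer looks like any other OGC service. A sketch using OWSLib against a hypothetical endpoint and layer name (both are assumptions for illustration):

```python
from owslib.wfs import WebFeatureService

# Hypothetical GeoServer endpoint whose data store is backed by
# GeoWave/Accumulo; to a WFS client it is indistinguishable from any
# other GeoServer instance.
wfs = WebFeatureService(url="http://example.org/geoserver/wfs", version="1.1.0")

response = wfs.getfeature(
    typename=["geowave:asset_points"],   # hypothetical layer name
    bbox=(-158.3, 21.2, -157.6, 21.8),   # bounding box of interest
    outputFormat="application/json",
)
with open("asset_points.geojson", "wb") as f:
    f.write(response.read())
```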
The GeoWave project plans to extend the same geospatial capabilities to other distributed key/value stores in addition to Accumulo; the next data store will be HBase. It will also support other geospatial frameworks in addition to GeoTools/GeoServer, with Mapnik the next framework targeted for GeoWave support. The GeoWave team says it is very interested in GeoGig, and support for this geospatial data versioning library is currently on their backlog. GeoGig takes the concepts used in distributed version control systems such as Git and applies them to versioned spatial data.
GeoWave was developed at the National Geospatial-Intelligence Agency (NGA) in collaboration with RadiantBlue Technologies and Booz Allen Hamilton. The NGA released GeoWave under an open source license in June, 2014. The primary goal of GeoWave is to bridge the gap between well-known geospatial projects such as GeoTools and distributed databases.
I blogged previously about GeoMesa, the first LocationTech project aimed at providing a foundation for storing, querying, and transforming spatio-temporal data in Accumulo. It implements interfaces that enable GeoServer and other GeoTools projects to use Accumulo as a data store.
Dr Anne Kemp, Director and Fellow, Atkins, Vice-Chair of BuildingSmart, UK, Chair of ICE BIM Action Group, and Chair of BIM4Infrastructure UK, has published very thought-provoking insights into how the convergence of BIM and geospatial can contribute to the better management of information to help generate the understanding to make better decisions.
Her first assertion “So, let’s put paid to the hang-ups of what is and is not GIS and BIM, and discover what really deserves our focus” is a very good place for all of us to start if we are going to tear down the discipline boundaries that are inhibiting us from moving to a more holistic approach to problem solving in the era of smart cities.
Better outcomes, not BIM or geospatial
Her goal is not to support BIM or geospatial per se, but to use these technologies to improve outcomes. From Anne's perspective, the key outcomes we should be aiming at are the following:
Data, not documents
The construction industry is based on documents such as drawings. Documents lock data up within a discipline and prevent the wider access that can be used to build an integrated view of an asset. In contrast, digital data can be used many times, for different purposes, by different disciplines. This requires interoperability and the ability to map semantics across different disciplines.
Assets, not projects
The full lifecycle view of a building, road or airport requires thinking of assets not projects. Anne's perspective is that this is where the convergence of BIM with geospatial provides the biggest benefits. The UK government would agree. The short term objective of the UK Government BIM mandate is to reduce the cost of construction (design, tender, build) by 20%. The longer term objective is by 2025 to reduce the costs associated with designing, building, operating and maintaining buildings and infrastructure by a third. ‘In-use’ data from facilities management (FM) systems, building management systems, and sensors including smart phones provides information on how an asset is actually serving the needs of people, and the patterns of behaviour of people using the infrastructure. This information can be used to optimize building or infrastructure design. A geospatial perspective enables this data to be used not only with individual buildings or infrastructure, but for a whole neighbourhood, town or city.
Ensuring that data is not manipulated to distort decision making is critical to enabling the true data-driven organization of the future. Anne's perspective is that the industry is becoming increasingly dependent on data management professionals. This will require standards and a code of ethics to address challenges of privacy, distortion, and manipulation, so as to ensure that data is made available in a way that aids rather than confuses decision making. In the future, chief data officers and other information professionals will have even greater responsibilities: they will be responsible for specifying, collecting, and analyzing the information for decision making that will be critical to the organization's success, even its existence.
Information is not understanding
Malcolm Gladwell in "Blink" points out that “We live in a world saturated with information. We have virtually unlimited amounts of data at our fingertips at all times, and we’re well versed in the arguments about the dangers of not knowing enough and not doing our homework. But what I have sensed is an enormous frustration with the unexpected costs of knowing too much, of being inundated with information. We have come to confuse information with understanding.”
At a recent BIM conference the term "infobesity" came up more than once. A decade ago people were concerned about not having enough data to make informed decisions. Now that sensors such as smart meters and smart phones are collecting more data than ever, the problem is how to make sense of the huge volumes of data that all these smart devices generate.
Anne makes the point that, when managed correctly, "instant" decisions based on a small amount of data are not just as good as, but can be better than, those made after analyzing all available data. The "less is more" challenge is to distill the data to just the right subset so that you can make better decisions faster with less data. Anne believes this will require more sophisticated visualization techniques to enable insights from patterns in large amounts of data, and better collaboration technology to enable a large number of individuals from different disciplines to understand each other (even when using different terms for the same piece of equipment or construction material) and to collaborate fruitfully.
This means that we will be asking our human or computer information engineers to deliver that essential subset of information to the right people at the right time, and in an intuitively understandable way. Anne suggests that our cartographic and GIS heritage of creating, analyzing and visualizing a view of the physical world as maps may provide a model for future data managers. But, as Anne points out, this will have to be transformed for a virtual environment.
Anne's final point is often overlooked. BIM, geospatial, augmented reality and other technologies are transforming how we view "reality". There are very real consequences for people working in a virtual world. Anne mentions the first reported case of internet addiction disorder (IAD) involving Google Glass, on October 14, 2014, and asks how many of us are already there with our smartphones and tablets?
At DistribuTECH 2015, Raiford Smith of CPS Energy and Jason P. Handley of Duke Energy presented their perspectives on the smart grid: what is motivating it in terms of business and technology drivers, a roadmap for implementing it at their utilities, and the benefits expected from it for customers and for utilities. They also outlined a smart grid architecture based on open standards, enabling seamless interoperability and distributed as opposed to centralized intelligence. Some of the advantages of a distributed architecture are scalability, reduced latency, and security implemented at the grid edge instead of only via the central control application.
Megatrends driving smart grid
Raiford Smith sees four major megatrends that are major motivating forces behind the smart grid. Moore's Law means there are intelligent devices for power networks with greater capabilities and at less cost. Metcalfe's Law means more interconnections and greater interoperability. Big data analytics means the ability to extract more meaningful information and insights from rapidly increasing volumes of data coming from thousands and even millions of intelligent devices. Distributed energy generation (DER) means more complicated power management - balancing intermittent generation and new load profiles from an increasing number of new electronic devices.
From a business perspective a major benefit is greater customer choice. In the future the customer will be able to not only manage his/her consumption of power, but also its generation. With rooftop solar PV and batteries the customer may elect to not even be on the grid, but to create his/her own microgrid. But it also means the utility business model will have to evolve from what it has been for the past 100 years. New York is one of the jurisdictions that is already changing its regulatory framework to enable utilities to move to a new business model.
As an aside, at this year's DistribuTECH if there was one technology that seemed to be everywhere and on almost every utility's radar, it is microgrids. Duke Energy is even playing with the idea of offering microgrids as a service.
Technology roadmap for the smart grid
From Raiford Smith's perspective the technology roadmap for the smart grid involves the deployment of increasing numbers of intelligent electronic devices for sensing and for control. The challenge is federating the data from all of these devices, extracting information from it, and dispatching the information to the right control devices. From an architectural perspective this drives the need for a field message bus which enables interoperability between different devices from different vendors. It also requires a common semantic model, such as the Common Information Model (CIM), adding security at the edge of the grid in addition to the central control room, and analytics to extract information from the huge volume of data collected from the sensors.
To test different smart grid configurations, CPS Energy is assembling a test facility for a three year smart grid testing program. It will have 30,000 customers, 15 circuits, solar generation, smart inverters, battery storage and the ability to disconnect from the grid to form a microgrid.
Benefits of the smart grid
Raiford expects major benefits for customers and for the utility from implementing a smart grid. For customers, perhaps the biggest benefit is that the smart grid avoids divergence of utility services and customers' needs. Sometimes referred to as disaggregation, in this context this means a third party coming between a utility and its customers. Historical examples are Microsoft Hohm and Google PowerMeter, which were perceived as threats because utilities found the idea of a Microsoft or Google insinuating itself between the utility and its customers unattractive. Opower is an example of a different approach that is much more attractive to utilities: instead of doing an end-run around the utility, Opower focussed on the utility as its direct customer, and services using Opower's solutions are then offered by the utility to its customers.
Secondly, the smart grid provides a flexible foundation for providing new services to customers (which also creates new sources of revenue for utilities). Raiford suggested some examples, including electric vehicles and charging, premium (high-quality) power, premium reliability, asset control (for example, inverters and batteries), and advanced demand response (the utility would provide these as a service to customers, rather than customers buying these devices from a third party). The result is greater customer satisfaction and improved brand recognition, as CPS Energy is perceived as a leader in providing new and improved services to customers.
The benefits that Raiford sees for utilities are equally important. They include improved operational metrics (SAIDI, SAIFI, asset utilization), better financial metrics (O&M spend, revenue generation), environmental benefits, improved safety, better trained and skilled staff, and more reliable risk modeling (better predictions of revenue, costs, customer satisfaction, and asset condition).
Probably the greatest challenge Raiford sees is managing organizational change, because the smart grid will mean that just about everything will change including the utility business model and most aspects of how we design, build, maintain and operate the grid.
Jason Handley, of the Emerging Technology Office at Duke Energy, reviewed some of the major drivers for industry change. Many applications currently used by power utilities are proprietary, with the result that the utility has many application silos that don't interoperate. The rapid adoption of DERs is requiring utilities to move toward faster response times, reduced costs, better safety, and improved reliability. Dynamic load management and low voltage power electronics will mean greater adoption of rooftop PVs and other DERs. Increasingly utilities will invest in standards-based, modular systems for hardware, multi-function devices, and a field message bus for software that will enable interoperability. From a business perspective broader interoperability facilitates more competition which lowers costs, encourages innovation and improves reliability.
Other important drivers that Jason sees that are impacting utilities include demand response, electric vehicles, in-premise automation, cybersecurity threats, aging infrastructure, big data complexity, and avoiding stranded assets. The smart grid is requiring utilities to change how they do things. Utilities realize they have to be more proactive in their operations, rather than waiting for something to happen and then reacting to it. Situational awareness has become a critical capability for utilities in enabling utilities to be more proactive. It is made possible by having a variety of sensors in the field that together can present a snapshot of the status of the grid. The key functionality required to enable this to happen is seamless interoperability.
As utilities deploy thousands and even millions of smart devices in the field, a centralized architecture runs into scalability and latency problems. Duke's solution is an architecture with distributed as opposed to centralized intelligence, comprised of layers so that not all data needs to go to the central control application; some can be handled at lower levels. A self-healing network is an example where a problem can be handled locally without the central control application knowing anything about it. Distributed intelligence also enables fast edge decisions that can be made without waiting for the central control application, and, an advantage that should not be underestimated, it enables security at the edge of the grid, not just via the central control application. Based on this concept Duke has defined a Distributed Intelligence Platform (DIP) Reference Architecture designed to take advantage of the tremendous intelligence out in the field in addition to the intelligence in the control centre.
Duke Energy, CPS Energy and 25 vendors, called the Coalition of the Willing (COW) have just embarked on an implementation of this architecture that supports a microgrid. The smart grid requires exchanging data between different devices from different manufacturers in the field. Traditional utility technologies are very often vendor silos utilizing proprietary hardware, telecommunications and software platforms. The goal of the “Coalition of the Willing" (COW) is to demonstrate that data and control commands can be shared across multiple vendor platforms (typically proprietary) to achieve interoperability with lower costs and faster response times. A key part of the demonstration is an open standard field message bus implemented as an open source project. The Smart Grid Interoperability Panel (SGIP) has created an OpenFMB working group to support this effort.
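The essence of a field message bus is publish/subscribe messaging between devices from different vendors. Here is a toy illustration of the pattern using MQTT, one of several transports such a bus could ride on; the broker, topic and payload schema below are invented for illustration and are not the OpenFMB profiles.

```python
import json
import time
import paho.mqtt.client as mqtt

# Toy field-message-bus exchange: one vendor's device publishes readings,
# another vendor's controller subscribes and reacts locally.
# Requires an MQTT broker (e.g. mosquitto) running on localhost.
BROKER = "localhost"
TOPIC = "fieldbus/microgrid1/recloser7/reading"

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    # Edge decision made locally, without the central control room:
    if reading["voltage_pu"] < 0.95:
        print("low voltage at", reading["device"], "- raising local alarm")

subscriber = mqtt.Client()
subscriber.on_message = on_message
subscriber.connect(BROKER)
subscriber.subscribe(TOPIC)
subscriber.loop_start()

publisher = mqtt.Client()
publisher.connect(BROKER)
publisher.publish(TOPIC, json.dumps(
    {"device": "recloser7", "voltage_pu": 0.93, "ts": time.time()}))

time.sleep(1)       # let the message arrive
subscriber.loop_stop()
```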
For the first time in a hundred years, the electric power utility industry is undergoing a momentous change. Distributed renewable power generation, especially solar photovoltaics (PV), is introducing competition into an industry that has been managed as regulated monopolies. Consumers with solar PV panels on their roofs are fundamentally changing the traditional utility business model. A recent report from the Edison Electric Institute (EEI) refers to disruptive challenges that threaten to force electric power utilities to change or adapt the business model that has been in place since the first half of the 20th century.
Most utilities are in the midst of deploying smart grids, which basically amounts to applying the internet to the electric power grid: linking intelligent electronic devices, sensors and grid control applications to enable data-driven decision making. One of the most important changes driven by the implementation of the smart grid is the much greater importance of location. Geospatial technology (location, geospatial data management and spatial analytics) is seen as a foundational technology for the smart grid.
The other major global change in energy is the shift in energy demand from the world's advanced economies to emerging economies. Energy demand from OECD countries has hit a plateau, and currently China is driving world energy demand. The International Energy Agency's (IEA) World Energy Outlook 2014 projects that as demand from China slows, world energy demand will be driven by India, the Middle East, Africa and Latin America.
Recently IDC Energy Insights released a report IDC FutureScape: Worldwide Utilities 2015 Predictions with predictions for the future of the utility business. Some of these are startling, suggesting that the utility industry is going to experience fundamental changes in how they do business over the next few years.
New business models
IDC predicts that utilities will be looking less at generation as a source of revenue: by 2018, 45% of new data traffic in utilities' control systems will originate from distributed energy resources that are not owned by the utility.
To make up for this loss of generation revenue IDC predicts that utilities will be looking for new business opportunities such as services. Specifically, IDC predicts that utilities will derive at least 40% of their earnings from new business models by 2017.
Some of the important drivers for these trends include the global redistribution of energy demand from the world's advanced to the emerging economies, the rapid emergence of cloud-based provisioning and services, increasing regulatory pressure responding to customer demand to improve energy market transparency and competitiveness, cross-industry competition for technical, especially IT skills, smart analytics, and virtual and augmented reality beginning to be applied in business.
Top 10 technology trends
In March 2014 Gartner, Inc. identified the top ten technology trends which it saw impacting the global energy and utility markets. There is considerable overlap between IDC's business predictions and the technology trends identified by Gartner, Inc.
Social Media
Social media are beginning to be used as a customer acquisition and retention medium, as a consumer engagement channel to drive customer participation in energy efficiency programs, as a source of information about outages (and a channel for communicating outage information to customers), and in the emerging area of crowdsourcing the coordination of distributed energy resources.
Big Data
Smart grid will increase the quantity of data that utilities have to manage by a factor of about 10,000, according to a recent estimate. This trend is driven by intelligent devices, sensors, social networks, and new IT and OT applications such as advanced metering infrastructure (AMI), synchrophasors, smart appliances, microgrids, advanced distribution management, remote asset monitoring, and self-healing networks. The type of data that utilities need to manage will also change: for example, real-time data from sensors and intelligent devices (including smart phones) and unstructured data from social networks will play a much greater role for utilities in the future.
Mobile and Location-Aware Technology
Mobile and location-aware technology, which includes hardware (laptops and smartphones), communication products (GPS-based navigation, routing and tracking technologies), social networks (Twitter, Facebook and others) and services (WiFi, satellites, and packet-switched networks), is transforming all industries. Utilities have for the most part been slow to adopt consumer mobile technology, but this is changing.
Cloud Computing
According to Gartner, areas such as smart metering, big data analytics, demand response coordination and GIS are driving utilities to adopt cloud-based solutions. Early adopters of cloud technologies include small utilities with limited in-house IT skills and budgets; organizations which provide application and data services to multiple utilities, such as cooperative associations and transmission system operators; and investor-owned utilities (IOUs) conducting short-term smart grid pilots.
Sensors
Sensors, which are being deployed extensively throughout utilities' supply, transmission and distribution domains, provide a stream of real-time information from which a real-time state of the grid can be derived.
IT and OT Convergence
Virtually all new technology projects in utilities will require a combination of IT and OT investment and planning, for example AMI or advanced distribution management systems (ADMSs). This will be a challenge for many utilities, especially smaller ones that don't have in-house IT skills.
Advanced Metering Infrastructure
AMI provides a communication backbone aimed at improving distribution asset utilization and facilitating consumer inclusion in energy markets.
Internet of Things
Sensors and actuators embedded in physical objects are linked through wired and wireless networks, using the same Internet Protocol (IP) that connects the Internet. When intelligent objects can both sense the environment and communicate, they become tools for understanding utility grids and responding to changes in near real time. Following McKinsey, there are two key benefits arising from the Internet of Things for utilities, described in the next two trends.
Asset performance management
Traditional asset management approaches are too limiting for today's performance-based, data-driven utility environment. Asset performance management solutions need to deliver real-time equipment performance, reliability, maintenance and decision support for effective resource management, so that operations and maintenance teams are empowered with real-time decision support information: the right information to the right people at the right time, in the right context. The result is improved operational performance and better asset availability and utilization.
Business Intelligence and Advanced Analytics
Analytics will become essential as the volume of data generated by intelligent devices and sensors, mobile devices (the Internet of Things) and social media increases and huge pools of structured and unstructured data need to be analyzed to extract actionable information. Analytics will become embedded everywhere, often invisibly.