At the SPAR International Conference in Colorado Springs this morning, Magnus Rönnäng, Technical Expert in Virtual Manufacturing at Volvo Car Group, gave a fascinating presentation on how Volvo has transformed the modeling of their automobile assembly plant using laser scanning.
Volvo is different from many other car manufacturers in having a single assembly line that is driven by customer orders (just-in-time manufacturing). Volvo makes 10 models and at any given time several of these models will be moving down the line.
When they introduce a new model, they can't afford to shut down the assembly line. What they do is simulate in 3D how the new model will progress down the line. Until several years ago they simulated the assembly line using 3D CAD, but just the assembly line not the rest of the plant. That led to problems because they couldn't determine from the simulation whether there was enough room outside of the assembly line to manipulate a new part or assembly. To resolve these issues they needed a model of the entire plant.
In 1999 they tried to model the entire plant using 3D CAD. By the time they completed the model, the plant had changed so much that the model was out of date. They tried to do this several times again with the same result. Then they decided to try laser scanning which turned out to be the solution. They laser scanned the entire plant. It took 2500 scans and generated 80 billion points. But they didn't convert or reverse engineer the point cloud - the point cloud was the digital model of the plant. They abandoned trying to use 3D CAD for modeling the plant, and continued to use it only for the vehicles themselves and other movable and moving equipment.
The key to keeping the model up to date in a very dynamic plant environment is incremental scanning: when new equipment is installed or an area is renovated, they scan just that area and replace only the corresponding part of the point cloud model of the plant.
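The idea behind incremental updating can be sketched in a few lines. This is a hypothetical illustration, not Volvo's actual toolchain: the point cloud is a list of (x, y, z) points, and a rescanned region (an axis-aligned box, for simplicity) has its old points swapped out for the new scan.

```python
# Hypothetical sketch of incremental point-cloud updating; the region,
# points, and data structure are illustrative, not from a real scanner.
def update_region(cloud, region_min, region_max, new_points):
    """Replace the points inside an axis-aligned box with a fresh scan."""
    def inside(p):
        return all(lo <= c <= hi for c, lo, hi in zip(p, region_min, region_max))
    kept = [p for p in cloud if not inside(p)]   # points outside the rescanned area
    return kept + list(new_points)

cloud = [(0.0, 0.0, 0.0), (5.0, 5.0, 1.0), (9.0, 2.0, 0.5)]
rescanned = [(5.1, 5.2, 1.1), (5.3, 4.9, 1.0)]   # scan of the new equipment
updated = update_region(cloud, (4, 4, 0), (6, 6, 2), rescanned)
```

Production systems index the cloud spatially (octrees, tiles) so that only the affected tiles are touched, but the replace-a-region principle is the same.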
Volvo uses commercial software that allows them to simulate a vehicle represented by a 3D CAD model moving through the point cloud of the assembly line and the plant looking for any collisions, basically a dynamic clash detection algorithm. They also use 3D visualization software to demonstrate to non-technical staff in management and on the Board of Directors how a new model moves through the assembly line.
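The core of such a dynamic clash detection check can be sketched as follows. This is a toy illustration under simplifying assumptions (the vehicle is reduced to an axis-aligned bounding box stepped along the line), not the commercial software Volvo uses:

```python
# Illustrative clash detection: sweep a vehicle bounding box along the
# assembly line and report plant points that fall inside it at any step.
def clashes(box_min, box_max, cloud):
    """Return plant points that fall inside the vehicle's bounding box."""
    return [p for p in cloud
            if all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))]

def sweep(cloud, box_size, path):
    """Slide the box along sampled positions of the line; collect collisions."""
    hits = []
    for pos in path:
        lo = tuple(pos)
        hi = tuple(c + s for c, s in zip(pos, box_size))
        hits.extend(clashes(lo, hi, cloud))
    return hits

plant = [(2.0, 0.5, 1.0), (10.0, 0.5, 1.0)]          # two fixed plant points
path = [(x, 0.0, 0.0) for x in range(0, 8, 2)]       # vehicle positions along x
collisions = sweep(plant, (4.0, 2.0, 2.0), path)     # 4 m x 2 m x 2 m vehicle
```

A real implementation would use the vehicle's actual CAD geometry and a spatial index over the billions of plant points, but the sweep-and-test structure is the same.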
Over the next two decades the total investment in infrastructure is estimated to be somewhere in the range of $20 to $50 trillion. A lot of this money is going to have to come from the private sector because governments are increasingly spending their money on services and less on capital projects. Attracting private investment will require governments to provide decent financial returns and reduced risk. A major challenge faced by many developed economies, including the U.S., Japan, the E.U., and Korea, is that productivity in the construction industry has been static or declining for the past few decades. As a recent McKinsey report has pointed out, there are a number of areas where productivity can be improved in the construction industry, but it will require investment in technology.
Infrastructure projects are big and getting bigger. 60% of these projects fail: they either exceed their budget, their schedule, or both. Some of the reasons for this are information silos (CAD, GIS, BIM, etc.), huge volumes of data (tens of thousands of design drawings, terabytes of point clouds and imagery, billions of tweets, etc.), and the difficulty of coordinating large virtual teams. In addition, stakeholders who are not architects, engineers, or GIS analysts increasingly need to be involved in the design process. Whether the project is a new transmission line, substation, bridge, or subway, municipal governments, the public, environmental review agencies, regulators, and others impacted by the project are involved in the design process because they need to influence how it is going to affect them and their constituencies.
Rich Humphrey described what he calls design in context, which is enabled by the convergence of, and the breaking down of barriers between, geospatial and engineering design, and relies on big data, 3D visualization, and web technologies.
Fundamentally it means using BIM, or BIM for infrastructure, to design a new structure in situ, using a city model to represent the rest of the city. The data generated in large infrastructure projects, together with the city model, can become huge, so this requires big data technology. For example, Rich mentioned a Chicago model that comprises 800,000 buildings. The concept also makes possible analytics (for example, energy and water performance analysis, or line-of-sight analysis for traffic signs) that require information about prevailing weather patterns and neighbouring structures (for example, right to light in the UK). And it also aims at automating design optimization based on defined design goals. It is web-based to allow access for large extended virtual teams. It uses 3D visualization technology and simplified ways of interacting with the design to enable all the stakeholders, including non-technical folks, to actively participate in the design process.
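To make one of those analytics concrete, here is a minimal sketch of a line-of-sight check like the traffic-sign example. It is a hypothetical simplification (obstacles reduced to heights at fractional positions along the sight line), not any vendor's actual algorithm:

```python
# Toy line-of-sight test: the target is visible if no obstacle pokes above
# the straight sight line from the eye to the target.
def line_of_sight(eye_h, target_h, obstacles):
    """obstacles: list of (fraction_along_ray, height) pairs.

    The sight line's height at fraction t is linearly interpolated between
    the eye height and the target height."""
    for t, h in obstacles:
        sight_h = eye_h + t * (target_h - eye_h)
        if h >= sight_h:          # obstacle blocks the line of sight
            return False
    return True

# Driver's eye at 1.2 m, sign at 4.0 m, one 2.0 m barrier halfway along.
visible = line_of_sight(1.2, 4.0, [(0.5, 2.0)])   # -> True
```

A real city-model analysis would cast rays against full 3D geometry, but the interpolate-and-compare test at each obstruction is the essential step.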
The goals are improved productivity, optimized designs, and greater predictability for large infrastructure projects, all of which are required if we are going to attract the tens of trillions of private investment dollars that it is estimated we need to invest in our infrastructure.
At the Oracle Spatial and Graph User Meeting, the folks from Oracle outlined some of the things to expect in the next version, Oracle 12c. I was very impressed. All of the below is my interpretation of what will be in Oracle 12c.
Parametric curve support
The really big news for the architecture, engineering and construction (AEC) community who use CAD and BIM applications is that Oracle 12c is expected to support NURBS (non-uniform rational B-splines), a type of parametric curve widely used in the design space. According to Siva Ravada, the support will be very general and will be able to handle different ways of representing NURBS. Up to now the only parametric curve supported by Oracle was the circular arc.
To me it looks like Oracle is taking a major step toward being able to store natively and manipulate (in some cases by stroking) the parametrized curves used by CAD and BIM applications. There is a lot of CAD and BIM data out there and being able to store it and manipulate it in Oracle Spatial and Graph is going to be a game changer.
Oracle is providing support for orthorectification embedded directly in the database. This will have important implications for the industry. Up to now, you had to pull the image out of wherever it was stored, transfer it over the network to an application, often from an image processing company, to orthorectify it, and then transfer it back to store the image or stream it to the end user. With 12c you will be able to orthorectify images in the database. This is in line with Oracle's objective of bringing processing to the data, rather than the data to the processing.
Oracle 12c will also have support for raster algebra, again in the database. For example, you will be able to average, in the database, 12 months of national temperature data stored in monthly raster files.
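The kind of raster algebra described can be sketched in a few lines. This is a hypothetical, in-memory illustration of the operation (not Oracle's GeoRaster API): average twelve monthly grids cell by cell to get an annual mean raster.

```python
# Illustrative raster algebra: per-cell mean of a stack of equal-shaped grids.
# Data is made up; a real workflow would read georeferenced rasters.
def average_rasters(rasters):
    """Each raster is a 2D list of the same shape; returns the per-cell mean."""
    n = len(rasters)
    rows, cols = len(rasters[0]), len(rasters[0][0])
    return [[sum(r[i][j] for r in rasters) / n for j in range(cols)]
            for i in range(rows)]

# Twelve tiny 1x2 "monthly temperature" grids, month m filled with value m.
months = [[[float(m), float(m)]] for m in range(1, 13)]
annual_mean = average_rasters(months)   # -> [[6.5, 6.5]]
```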
GDAL and PDAL support
Oracle supports the open source GDAL raster libraries, the industry standard developed by Frank Warmerdam (now with Google) and used by just about everyone, including ESRI, for accessing raster images. PDAL is a similar type of open source library for point clouds, developed by Howard Butler and Michael Gerlek.
Oracle supports three types of 3D data: 3D vectors, point clouds, and terrains (surfaces). Oracle 12c will provide full support for 3D geodetic data: longitude and latitude plus elevation in meters or feet.
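To illustrate what lon/lat plus elevation means computationally, here is the standard WGS84 geodetic-to-ECEF (Earth-centered, Earth-fixed) conversion. This is a textbook formula shown for context, not Oracle's internal implementation:

```python
import math

# WGS84 ellipsoid constants.
A = 6378137.0            # semi-major axis (m)
F = 1 / 298.257223563    # flattening
E2 = F * (2 - F)         # first eccentricity squared

def geodetic_to_ecef(lon_deg, lat_deg, h):
    """Convert geodetic lon/lat (degrees) and ellipsoidal height (m) to ECEF (m)."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

# On the equator at the Greenwich meridian, x should equal the semi-major axis.
x, y, z = geodetic_to_ecef(0.0, 0.0, 0.0)
```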
Oracle has developed a new engine for handling massive point clouds. Dan Geringer showed impressive results for a 2.8-billion-point cloud (a 286 gigabyte file).
Processing vector geospatial data is reported to be 50 to 100 times faster in 12c than in 11g. Operations have been parallelized, which can dramatically reduce the time these operations take.
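The pattern behind that kind of parallelization can be sketched generically. This is an illustrative example of fanning a per-geometry operation out to a worker pool (here a toy polygon-area computation with Python's standard library), not a description of Oracle's internals; in practice CPU-bound work would go to process pools or native threads rather than a GIL-bound thread pool.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy per-geometry operation: area of a simple polygon via the shoelace formula.
def shoelace_area(polygon):
    """polygon: list of (x, y) vertices in order."""
    s = 0.0
    for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2

polygons = [[(0, 0), (1, 0), (1, 1), (0, 1)],     # unit square
            [(0, 0), (2, 0), (2, 2), (0, 2)]]     # 2x2 square

# Fan the independent geometries out across workers.
with ThreadPoolExecutor() as pool:
    areas = list(pool.map(shoelace_area, polygons))   # -> [1.0, 4.0]
```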
Since acquiring Sun, Oracle has become a hardware and software company and is offering integrated hardware and software solutions that offer extremely good performance characteristics. For example, processing LiDAR data benefits from hardware acceleration on engineered machines such as Exadata boxes.
Oracle is planning to bundle vector data from Nokia (Navteq), TomTom and others.
Oracle Spatial and Graph
The name of the product is changing to Oracle Spatial and Graph, partly because it has already had graph capabilities (network and RDF semantic) for several years, but probably more importantly because demand for graph capabilities is accelerating, especially RDF semantic graphs for linked data, text mining, and social media analytics, according to Xavier Lopez. Even Oracle's NoSQL database (aka BerkeleyDB) is apparently getting graph capabilities.
Kevin Gilson, of the Project Visualization group at Parsons Brinckerhoff (PB), gave an overview of how PB manages 3D+ datasets in support of design and construction for large infrastructure projects such as the San Francisco-Oakland Bay Bridge, the I-95 New Haven Harbor Crossing / Q-Bridge reconstruction, and the Alaskan Way viaduct project. He showed how the project visualization team leverages geospatial data, LiDAR data, design data, and construction planning data together in large integrated datasets that concurrently support visualization, stakeholder communication, design, construction planning, and site logistics.
PB has more than 30 staff dedicated to project visualization. That includes folks who do 3D modeling, design visualization, web design, and programming. 3D technologies include building information modeling (BIM), 3D design, virtual design and construction (VDC), and laser scanning (LiDAR).
3D modeling for design and construction
Some of the reasons that designers and construction firms are adopting 3D technology are the serious limitations of 2D sheet sets, still legally required for as-builts but very unreliable, the design efficiencies of 3D parametric models, and the limitations of siloed project data. PB has adopted the concept of project information models, or PIMs, that integrate information about all aspects of the design and build process and that potentially can flow data through for operations and maintenance.
3D modeling and model-based design are used by PB for design validation, clash detection, parametric modeling, civil integration management, 4D modeling (time + 3D), 5D modeling (cost + time + 3D), and design visualization. Model-based design involves integrating CAD, GIS, survey, analytics, BIM and associated databases, and 3D visualization, typically in a virtual gaming environment.
Construction firms in particular see tremendous advantages in 4D models that integrate design and geospatial data and allow the construction team to virtually build the project so that they can check staging, construction sequencing, and site logistics virtually. 5D and higher dimensions allow them to add additional metrics including cost, resources, materials, and equipment.
The benefits that PB realizes from this integrated approach based on 3D+ datasets include finding and fixing conflicts between different design groups (HVAC, plumbing, electrical, structural, and so on) during the design phase rather than during construction, when clashes are much more expensive to fix. Another benefit is being able to quantify things using analytics, so that decisions are data-driven rather than subjective.
A major benefit is that less re-work is required, reducing the risk of cost and schedule overruns. A big advantage is improved communication and coordination between all project stakeholders, and especially with non-technical decision makers. Being able to see photorealistic visualizations of what the project is going to look like makes a huge difference in communicating with politicians and the public. PB uses gaming technology, making it possible to experience a project before a shovel has gone into the ground. For example, on highway projects, PB's visualization makes it possible to drive the highway, and even the detours required during construction, in a virtual environment so that the public can experience the changes and be prepared for them before they actually happen.
Another important benefit that Kevin sees from a project approach that includes visualization is that "sexy deliverables" get people's attention, within the project team as well as with external stakeholders. Several other speakers at SPAR 2013 also emphasized the importance of the "sexiness" factor.
Laser scanning in construction
Kevin described a number of applications of laser scanning (LiDAR) as part of the 3D construction process. A major application of LiDAR is construction monitoring, capturing construction progress as well as being able to automate the process of checking for divergence from design when contractors, for a variety of reasons, don't build what is designed.
Another important laser scanning application is accurate and reliable as-builts. At SPAR 2013, speaker after speaker in different sectors of the construction industry reiterated that current 2D as-builts, still legally required, are unreliable and rarely looked at by anyone post-construction. Marco Vidali Castillo of ICA, which is responsible for the Autopista Urbana Sur project in Mexico City, gave an example of how trying to use 2D as-builts can lead to problems necessitating major re-engineering. LiDAR scans of a completed project can provide the reliability that 2D as-built sheet sets lack. On the Autopista Urbana Sur project, Marco says that ICA intends to deliver LiDAR scans as well as the legally required 2D as-built sheet sets.
3D modeling and sustainability
There are also benefits from the perspective of sustainability to PB's project approach. Collaboration typically has relied on the exchange of paper documents, which on a large project can involve many thousands of paper documents. Collaborating in a virtual environment can dramatically reduce paper flow while improving the quality of communication. It can also improve traffic flow and resource management during construction, thereby reducing emissions. A life-cycle approach to data management during design and construction, with data being reused for operations and maintenance, can reduce rework, resulting in significant efficiencies.
Selecting software tools
The PB project visualization team takes a best of breed approach in selecting software for 3D modeling and visualization. An important criterion in selecting tools is interoperability between the different technologies. Currently the software stack that PB relies on includes
At SPAR 2013 Dennis Rodriguez, Project Manager at Denver International Airport (DIA), and Daniel Stonecipher of Immersiv, gave a fascinating presentation on how they combined a BIM-based design and construction process with GIS to create a facilities and operations management (FM/OM) solution for managing DIA's infrastructure.
DIA is a small city, and managing its infrastructure involves many of the same problems that utilities and municipal governments have to deal with. As-builts are unreliable or nonexistent, which meant that DIA really didn't know reliably where its assets were. DIA has nine million CAD files, which meant a major data management problem. Their maintenance program was dysfunctional because they didn't have the information required to rationalize it with a data-driven process. They had a lot of redundant data because of silos of information in different departments using different applications. The most important impact of all of this was subjective decision making during the budget planning process, characterized by the so-called HiPPO (highest paid person's opinion) problem.
Their ultimate goal was data driven processes that supported predictive maintenance and objective metrics for decision making especially in prioritizing projects during the budgeting process.
The way they addressed this was by a combination of data normalization and business process rationalization. They switched to an integrated building information modeling (BIM) design and construction process that produced a normalized data stream. During design and construction, the data required for facilities management and operations (FM/OM) was collected. This involved collecting and validating existing data and developing continuous data and process management. The result was FM/OM-ready BIMs (both horizontal and vertical) and current relational data structures.
A critical aspect of their solution was that all data including BIM models was geolocated, which made it possible to load all the data into a GIS linked to Maximo. Geolocating all their facilities made it easy for them to organize maintenance work on different types of infrastructure concurrently, so that they could excavate once for multiple utilities rather than different crews reexcavating at the same location multiple times.
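The excavate-once idea rests on a simple spatial grouping step. Here is a hypothetical sketch of it (the grid size, field names, and data are all illustrative, not from DIA's Maximo/GIS integration): once every asset is geolocated, pending work orders can be bucketed by excavation site so one dig serves several utilities.

```python
from collections import defaultdict

# Illustrative grouping of geolocated work orders into shared excavation
# sites, using a coarse grid as the "same location" criterion.
def group_by_site(work_orders, cell=10.0):
    """Bucket work orders into `cell`-meter grid squares keyed by grid index."""
    sites = defaultdict(list)
    for order in work_orders:
        key = (int(order["x"] // cell), int(order["y"] // cell))
        sites[key].append(order["utility"])
    return dict(sites)

orders = [{"utility": "water",    "x": 12.0, "y": 34.0},
          {"utility": "electric", "x": 14.5, "y": 36.0},
          {"utility": "gas",      "x": 80.0, "y": 10.0}]
sites = group_by_site(orders)   # water and electric share one excavation
```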
Dennis and Daniel said that a number of airports are interested in DIA's solution. I would expect that universities, municipal governments, and anyone who needs to manage infrastructure in a campus or municipal environment would find DIA's solution worthwhile investigating.
The Utilities Standards Forum (USF) is an organization representing about 50 Ontario electricity distribution companies. Its goal is to define standards collaboratively for its members to reduce duplication of effort.
The USF has developed design standards for electricity distribution that meet the requirements of Ontario Regulation 22/04 under the Electricity Act of 1998. Currently these are in the form of paper or PDF documents containing 2D isometric views of the components used in electricity distribution such as insulators, clamps, surge arrestors, fuse-cutouts, terminations, cross arms, cable guards, and other types of equipment.
There is pressure to move to 3D design standards, and there are several motivations for this. At a time when the utility industry faces the challenge of an aging workforce, increasing the efficiency of engineers and designers is an important motivator. Another, equally important one is the increasing number of younger designers and engineers who have been brought up in a 3D digital world and want to work in a modern 3D environment.
At the EDIST 2013 Conference in Toronto an overview of the new 3D design standards was presented by Ron Lapier and Lori Gallauger of the USF. The new 3D standard is a major step forward in two ways. First of all, the components required for electricity distribution design are now 3D, so that electricity distribution design can be performed in a 3D environment. But equally important, USF intends to make them available digitally in the form of a library of DWG files containing 3D objects that can be brought into a 3D CAD environment and integrated with other components to create a completely 3D representation of an electricity distribution design. The design can be viewed and manipulated in 3D. As well, 2D drawings for construction can be generated from the 3D model. The components have real-world coordinates so that clearances and dimensions can be directly measured from the 3D model.
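Measuring a clearance directly from components with real-world coordinates amounts to a distance query. Here is a hypothetical sketch (the components and coordinates are made up, not from the USF library): take the smallest distance between two pieces of equipment represented as 3D point sets.

```python
import math

# Illustrative clearance measurement between two 3D components, each
# represented here as a handful of sampled points in meters.
def clearance(points_a, points_b):
    """Smallest point-to-point distance between two components."""
    return min(math.dist(a, b) for a in points_a for b in points_b)

conductor = [(0.0, 0.0, 10.0), (1.0, 0.0, 10.0)]   # conductor points
cross_arm = [(0.0, 0.0, 9.0), (1.0, 0.0, 9.0)]     # cross arm points
gap = clearance(conductor, cross_arm)               # -> 1.0 meter
```

A CAD system would measure against the exact solid geometry rather than sampled points, but the query it answers is the same.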
Project Basejump is a free technology preview that enables AutoCAD Map 3D/Civil 3D users to access Bing web mapping services, including aerial imagery, road, traffic, and other information. Project Basejump is an FDO provider that enables you to bring Bing Maps layers into an AutoCAD environment as a base map. Project Basejump was first made available in May 2012.
I blogged previously about Burlington Hydro, a relatively small utility in southern Ontario that is integrating into an intelligent network many aspects of what is typically included in smart grid, including intelligent network devices, self-healing networks, smart meters, distributed generation, electric vehicles (EV), factory ride-through systems (which enable factories to continue functioning through outages), battery-based electric storage, a bidirectional communications network linking the intelligent devices to the control center, and dramatically increased volumes of real-time data.
Most utilities, large and small, are experiencing or very soon will experience a similar change in the customer services, responsiveness, and level of reliability their electric power networks are expected to provide.
An important difference between large utilities and small utilities, many of which are municipally owned or rural co-ops, is that large utilities have the IT staff and experience to deal with intelligent electric power networks, whereas small utilities often don't.
What struck me as so unique about what Terraspatial is offering, and so valuable to small utilities, is that it is a hosted solution. In other words, all the utility needs to install at their site is a browser; everything else is running in the cloud. The most important benefit of a hosted solution like this is that it reduces the level of IT capacity that the utility needs to maintain in house.
One of the utility associations that Terraspatial has been working with is the Central Services Association (CSA), an association of 116 smaller utilities in central Mississippi. Jointly, Terraspatial and CSA developed the requirements for a hosted solution for electric power utilities. Based on these requirements, Terraspatial has designed and built, and CSA has been testing, a solution called PlantWorx.
The design goals of the solution that CSA and Terraspatial have developed are very relevant to small utilities.
It is hosted, which means that the utility does not need to own or manage servers or software.
It is secure because it relies on the security of Rackspace, a major cloud hosting provider that can provide protection from internal tampering, role-based access for users, protection from external threats, the latest encryption, redundancy and backups, ISO-certified data centers, and mirrored servers for persistent backup: in other words, a much higher level of security than the average utility network is capable of.
Because it is running in the cloud, it is accessible from anywhere and from any device, in the office and in the field.
It is an integrated work order management solution that supports everything from staking through to accounting and reporting. It interfaces with customer information systems, accounting and billing systems, and materials management systems.
It provides a dashboard allowing the utility to monitor key performance indicators of the utility's business.
There is a new technology preview at Autodesk Labs, Augmented Reality for Showcase. Augmented reality makes it possible to overlay information (graphics, text, video, sound) onto a live video feed of the real world in real time. Based on Autodesk Showcase, an application that produces photorealistic, high-quality, interactive, real-time 3D renderings of design models, the Augmented Reality plug-in makes it possible to visualize models in the real world as viewed through your web or video camera.
The process to overlay a design model on the real world as seen through your video camera is pretty simple. You prepare your model in Showcase and then associate a supplied marker (a JPG file) with your virtual object. You print the marker, place it in the real world, and point your web camera at it. Showcase then replaces the marker in your video feed with the model you've created in Showcase. The technology preview will be available until October 31, 2012. There is a video on YouTube that gives you an introduction to the augmented reality preview.