At the SPAR 3D Expo and Conference in The Woodlands, Texas, just outside Houston, Avideh Zakhor from the University of California, Berkeley was onsite with some remarkable products from her new startup Indoor Reality. I blogged previously about the technology when it was still under development. The "man-portable system" she has developed is a 32-pound backpack carrying a collection of sensors. The most important feature of the technology is the set of sensors and algorithms for accurately tracking the six degrees of spatial freedom that determine the location and orientation of the backpack in the building. In addition there are sensors for recording optical and thermal point clouds. She has also developed a handheld device that supplements the capabilities of the backpack.
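To make "six degrees of freedom" concrete: three numbers fix where the backpack is and three fix which way it is pointing. Here is a minimal sketch of such a pose record; this is illustrative only, not Indoor Reality's code, and the names and angle conventions are my own assumptions.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Pose6DoF:
    """One tracked backpack pose: 3 translational + 3 rotational degrees of freedom."""
    x: float      # position in metres, in the building frame
    y: float
    z: float
    roll: float   # orientation in radians
    pitch: float
    yaw: float

    def rotation(self) -> np.ndarray:
        """Compose yaw-pitch-roll (Z-Y-X) Euler angles into a 3x3 rotation matrix."""
        cr, sr = np.cos(self.roll), np.sin(self.roll)
        cp, sp = np.cos(self.pitch), np.sin(self.pitch)
        cy, sy = np.cos(self.yaw), np.sin(self.yaw)
        Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
        Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
        Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
        return Rz @ Ry @ Rx
```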
The technology accurately tracks the location and orientation of the backpack as the operator walks through the building, going into all rooms and halls, up and down stairs, and into other human-accessible spaces. Software tools then generate different 3D data products including optical and thermal 3D point clouds, a colourized 3D point cloud, surface texture maps, and triangulated mesh surfaces. The package includes analytical software tools for automatically generating floor plans, distinguishing rooms, identifying staircases and windows, finding heat sources such as computers and human occupants, and identifying lights. Surface textures can be added to create a photo-realistic 3D rendering. The sensors and software can capture and generate all required input for the Department of Energy's EnergyPlus energy performance analysis program. The system includes GPS, so its outputs can be georeferenced in real-world coordinates. Indoor Reality is an ESRI Emerging Business Partner.
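Producing a colourized point cloud implies projecting each laser point into a camera image and sampling the colour there. Below is a toy pinhole-camera sketch of that step; the function, its parameters, and the camera conventions are all assumptions for illustration, not Indoor Reality's pipeline.

```python
import numpy as np


def colorize_points(points_world, image, R, t, fx, fy, cx, cy):
    """Sample an RGB colour per 3D point by projecting into one camera image.

    points_world : (N, 3) laser points in the building frame
    image        : (H, W, 3) uint8 camera frame
    R, t         : world-to-camera rotation (3x3) and translation (3,)
    fx, fy, cx, cy : pinhole intrinsics in pixels
    Returns (N, 3) colours; points that fall outside the image stay black.
    """
    h, w = image.shape[:2]
    colors = np.zeros((len(points_world), 3), dtype=np.uint8)

    p_cam = points_world @ R.T + t                  # building frame -> camera frame
    z = p_cam[:, 2]
    in_front = z > 0                                # only points the camera can see
    safe_z = np.where(in_front, z, 1.0)             # avoid dividing by zero
    u = np.round(fx * p_cam[:, 0] / safe_z + cx).astype(int)
    v = np.round(fy * p_cam[:, 1] / safe_z + cy).astype(int)

    valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors[valid] = image[v[valid], u[valid]]
    return colors
```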
I had a chance to sit down with Avideh to talk about her new company and the remarkable products she was demonstrating at SPAR 2016.
Avideh, what are the key features of the products you are demonstrating that distinguish them from some of the other devices vendors are showing here at SPAR?
We have developed a unique way to track where the operator is in a building and the orientation of the backpack. We have a lot of sensors, and combining their readings, what is called sensor fusion, is the key to all of our intellectual property (IP); most of our IP is in that area. It does not suffer from drift. It tracks accurately where you are in the building even if you go upstairs. You can even go up in an elevator because it has a barometer. In addition we detect Wi-Fi, so it can tell you what the Wi-Fi strength is in different parts of the building. We also have an infrared sensor, and we have developed an app that will give you real-time occupancy of the building; for example, we can tell you there are four people in this room and two people in that room. We have patents on that.
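The elevator example hints at how the vertical channel of such a system can work: a barometer is noisy but does not drift, while inertial dead reckoning is smooth but drifts. A toy complementary filter illustrates the general idea; this is my own sketch, not the company's proprietary fusion algorithm.

```python
def fuse_altitude(prev_alt, imu_vz, baro_alt, dt, alpha=0.98):
    """One step of a toy complementary filter for the vertical channel.

    prev_alt : previous fused altitude estimate [m]
    imu_vz   : vertical velocity integrated from the IMU (smooth, but drifts) [m/s]
    baro_alt : altitude from barometric pressure (noisy, but drift-free) [m]
    alpha    : weight on the inertial prediction versus the barometer fix
    """
    predicted = prev_alt + imu_vz * dt                   # dead-reckon from the IMU
    return alpha * predicted + (1.0 - alpha) * baro_alt  # pull back toward the barometer
```

Riding an elevator, the inertial term captures the smooth motion between floors while the barometer keeps the absolute height estimate from drifting away.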
Based on the data collected by the operator's walk through the building, we allow users to virtually navigate the space and spatially collaborate with each other through our web-based visualization tool. During virtual navigation you can look around in 3D as if you were there. That's our second data product. We don't just record dumb point clouds; we generate intelligent meshes. Not only can I pan around and look up and down at each location, but the shape of the cursor shows you the orientation of the surface: for a horizontal surface it's a circle, for vertical surfaces it's a square. The cursor gets bigger as it gets nearer and smaller as it goes farther away. We know the normals of all the surfaces because of the 3D mesh behind what you see. We recognize doors and windows. Of course, we can measure distances, and we allow users to annotate objects and locations. This mesh is so intelligent that we can automatically generate a Revit model with a floor plan, rooms, ceilings, doors and windows. That is our third data product.
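The cursor behaviour Avideh describes follows directly from the mesh normals. A viewer might compute it along these lines; the function name, threshold, and sizing constants here are hypothetical.

```python
import numpy as np

UP = np.array([0.0, 0.0, 1.0])


def cursor_style(normal, hit_distance, base_px=40.0, ref_dist=2.0):
    """Choose cursor shape and on-screen size from the surface under the pointer.

    normal       : unit normal of the mesh triangle hit by the view ray
    hit_distance : metres from the viewpoint to the hit point
    """
    tilt = abs(float(np.dot(normal, UP)))        # 1.0 = horizontal surface, 0.0 = vertical
    shape = "circle" if tilt > 0.7 else "square"
    # A cursor of fixed world size appears smaller the farther away the surface is.
    size_px = base_px * ref_dist / max(hit_distance, 0.1)
    return shape, size_px
```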
How do you foresee people will use the tool?
We went to the University of Illinois just a month ago and we did an energy model for a 14-story building. We can do the 3D geometry, and we can detect lights, windows, and AC plugs automatically. This gets fed into an energy simulation model, a complete model for an EnergyPlus performance audit. Typically when people do an energy audit they are only interested in the lowest-hanging fruit, for example replacing incandescent light bulbs with LEDs, because collecting the data for a full EnergyPlus model has been time consuming. With the Indoor Reality backpack you can collect whatever you need for a complete EnergyPlus audit very simply. For example, we can superimpose the infrared record on a ceiling to show a hot area which is very likely a hot-water pipe behind the drywall.
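EnergyPlus reads plain-text IDF input, so automatically detected fixtures could, in principle, be written straight into `Lights` objects. Here is a hypothetical sketch of that export step; the per-fixture wattage, schedule name, and heat-fraction values are placeholder assumptions an auditor would refine, and this is not Indoor Reality's exporter.

```python
def lights_idf(zone, n_fixtures, watts_each=40.0, schedule="Office Lighting"):
    """Write one EnergyPlus `Lights` object for a zone's detected fixtures."""
    level = n_fixtures * watts_each   # total installed lighting power [W]
    return "\n".join([
        "Lights,",
        f"  {zone} Lights,         !- Name",
        f"  {zone},                !- Zone or ZoneList Name",
        f"  {schedule},            !- Schedule Name",
        "  LightingLevel,         !- Design Level Calculation Method",
        f"  {level:.0f},           !- Lighting Level {{W}}",
        "  ,                      !- Watts per Zone Floor Area",
        "  ,                      !- Watts per Person",
        "  0.0,                   !- Return Air Fraction",
        "  0.42,                  !- Fraction Radiant (placeholder)",
        "  0.18,                  !- Fraction Visible (placeholder)",
        "  1.0;                   !- Fraction Replaceable",
    ])


# Sixteen fixtures detected in one room of the walk-through:
print(lights_idf("Room101", n_fixtures=16))
```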
We have just come out with a visual building documentation tool that lets stakeholders collaborate with each other throughout the lifecycle of a building: during design, construction, and operations and maintenance. If I want to remodel a building with an architect, a designer, and a contractor, they can communicate with each other by leaving notes in the virtual building. For example, I can put a tag on a door for John the contractor: "Please fix this door." John can come along in the virtual building and reply to that note: "Avideh, I can do this Tuesday." This makes it possible to collaborate virtually on a building project in 3D, in much the same way that we can collaborate on a Word document.
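Under the hood, a tag like that is essentially a comment thread anchored to a point on the mesh. A minimal data-structure sketch of the idea, with all names invented for illustration:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Tuple


@dataclass
class Comment:
    author: str
    text: str
    when: datetime = field(default_factory=datetime.now)


@dataclass
class Tag:
    """A note pinned to a 3D location in the virtual building model."""
    anchor: Tuple[float, float, float]          # point on the mesh, building frame
    assignee: str
    thread: List[Comment] = field(default_factory=list)

    def reply(self, author: str, text: str) -> None:
        self.thread.append(Comment(author, text))


# The door-repair exchange from the interview, as data:
door = Tag(anchor=(12.4, 3.1, 1.0), assignee="John")
door.reply("Avideh", "Please fix this door.")
door.reply("John", "Avideh, I can do this Tuesday.")
```

Anchoring the note to a 3D point rather than a screenshot means it stays attached to the door no matter where a collaborator views it from.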
There is another dimension to this. Let's say we want to remodel a space in a building. The first thing the architect will want to see is the existing floor plan, and the other professionals will want to see other aspects of the space. Normally at this point the building owner will ask everyone involved in the planning to come over for a walk-through of the building, which requires a lot of valuable time from very busy people. With the virtual model we can generate, most of this is no longer necessary. The owner can ask a relatively junior person equipped with the backpack to walk the building and collect the data; the resulting model is put online and everyone can walk through it virtually. The same applies to bidding on a big renovation project: a junior operator does the walk-through, the model goes up on the cloud, and the construction companies interested in the project can walk through it online to prepare their bids.
For a new building, during the actual construction you can conduct a walk-through every day and visually update what's going on. Then you have a permanent record of who did what and when. If there is a lawsuit and insurance companies get involved, you can step through the record and show the legal folks where a component was supposed to go and where it was actually installed. It is also useful for monitoring progress, so you know who did what, and when, to determine who gets paid and how much. When you want to sell the finished building, the real estate folks can do a virtual walk-through with prospective buyers. Property managers and facility managers can use the virtual model for maintaining the building. For example, we talked about the energy performance audit; facility managers can use it to identify lights that need to be replaced with more efficient bulbs, then put a virtual tag on those lights for the maintenance folks. With a virtual model like this, owners, contractors, designers, and facility managers can all collaborate throughout the lifecycle of the building.
There seem to be a lot of areas where the technology you have developed could be used?
We have identified quite a number of other application areas for the technology, including historical building and monument recording, scan-to-BIM, variance detection and virtual inspection, real estate valuation, building plans for first responders, change management and dispute resolution, and detecting thermal anomalies such as water leaks and insulation deficiencies, to name just a few.