Analyzing building data for predictive energy management

June 1, 2014 OpenSystems Media

In an increasingly energy-conscious society, optimizing energy use is a must. Buildings are rapidly getting smarter through large numbers of sensors and actuators that collect data throughout them. Our homes' energy use can be monitored and controlled with platforms like Google's Nest, but that's a drop in the bucket compared to large buildings, especially dispersed groups of buildings. To make the vast amount of building data valuable to building owners and others with stakes in building automation and energy management, the data needs to be useful for long-term optimization, and for discovering and understanding patterns for predictive planning and new building design.

Arizona State University's K. Selçuk Candan, Professor of Computer Science and Engineering at the School of Computing, Informatics, and Decision Systems Engineering, is conducting research into using observed data (from building sensors themselves) and data from building simulations to do predictive analysis for optimizing the day-to-day energy operations of large buildings, retrofitting old buildings, and contributing to long-term optimization, such as predicting failures that negatively affect energy efficiency. Funding comes from energy optimization company Johnson Controls, Inc. (JCI), which is interested in energy management, specifically for buildings, and wants to be able to leverage large numbers of simulations to do predictive analysis.

"JCI is managing operations for thousands of buildings internationally," Candan says. "They want to be able to see if observations in one building can be used for another to make cross-building decisions."

Simulation analysis is compared with actual observed data to see if the algorithms the researchers develop give meaningful results.

Data analysis challenges

Existing simulation tools can simulate one building under a given parameter setting and return aggregate results, such as overall energy consumption over a period of time. The outcome of a single simulation, however, is of little use for decision making. On their own, aggregate results don't reveal which parts of the simulation were critical, and they depend on someone with domain expertise to recommend changes based on them. Moreover, tools to help make sense out of multiple simulations under different parameter settings are not currently available, Candan says.

"We want to develop algorithms and tools for domain experts to make more informed decisions," Candan says.

The sheer amount of data and the expense of simulating it are the challenges holding back more sophisticated building data simulation and analysis, Candan says. Buildings can potentially have hundreds of thousands of parameters, which makes simulations very large, very expensive, and very time consuming. For example, someone may want to simulate hourly changes over a month in a building with 20 floors, multiple "zones" per floor, and multiple readings for each zone, such as temperature, HVAC activity, and occupant utilization patterns. That would require multiple simulations under different parameter settings, and each simulation takes time and processing power.
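A rough back-of-the-envelope calculation shows why. The following sketch is purely illustrative (it is not JCI's or ASU's tooling); the zone counts, reading types, and parameter counts are assumptions extrapolated from the example above.

```python
# Illustrative sketch: how quickly the data volume and parameter space of a
# building simulation grow. All specific numbers here are assumptions.

floors = 20
zones_per_floor = 4            # assumed: multiple "zones" per floor
readings_per_zone = 3          # e.g., temperature, HVAC state, occupancy
hours = 24 * 30                # hourly changes over one month

# Data points produced by a single simulation run
points_per_run = floors * zones_per_floor * readings_per_zone * hours
print(points_per_run)          # 172,800 readings for just one run

# Exploring even two candidate values for a handful of tunable parameters
# multiplies the number of runs needed combinatorially.
tunable_parameters = 10
candidate_values = 2
runs_needed = candidate_values ** tunable_parameters
print(runs_needed)             # 1,024 runs for a modest parameter sweep
```

With hundreds of thousands of parameters rather than ten, exhaustive sweeps are clearly out of reach, which is what motivates the reuse strategy described next.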

Increasing simulation efficiency through reuse

To decrease the cost, simulations can potentially be reused when a similar building has already been simulated.

"Once you have a large number of simulations you can see how they're different from each other – what parameters changed – for better optimization," Candan says. "Our end goal is you have a building you're modeling, you go to a database, find simulation runs on similar buildings, and compose a new simulation from that data, or run a partially new simulation instead of starting from scratch."
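The core of that reuse idea can be sketched as a similarity lookup over stored runs. The code below is a minimal, hypothetical illustration of the concept Candan describes, not the project's actual system; the parameter vectors, energy figures, and Euclidean distance measure are all assumptions.

```python
# Hypothetical sketch of simulation reuse: store past runs keyed by their
# parameter settings, then find the most similar prior run before paying
# for a new simulation from scratch.
import math

# "Database" of prior runs: parameter vector -> simulated annual energy (kWh)
# Vectors are (floors, zones per floor, occupancy factor); values illustrative.
prior_runs = {
    (20, 4, 0.8): 1_250_000,
    (10, 6, 0.5):   640_000,
    (30, 4, 0.9): 2_100_000,
}

def nearest_prior_run(params, runs):
    """Return the stored parameter vector closest to `params` (Euclidean)."""
    return min(runs, key=lambda p: math.dist(p, params))

# A new 19-floor building closely resembles the first stored run, so that
# run's results can seed a partial simulation instead of a full one.
best = nearest_prior_run((19, 4, 0.8), prior_runs)
print(best, prior_runs[best])
```

In practice a real system would need a far richer notion of similarity than raw parameter distance, but the lookup-before-simulate structure is the point.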

Candan says there is still room for improvement in detecting when key events occur in a simulation and focusing on those interesting parts, in order to cut down on the data needed to compare simulations.
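One simple way to think about that focusing step: keep only the windows of a simulated time series where the signal changes sharply, so two simulations can be compared on far less data. This is a minimal sketch of the idea under assumed data and an assumed change threshold, not the researchers' method.

```python
# Illustrative sketch: flag "key events" as the indices where the
# hour-to-hour change in a simulated reading exceeds a threshold.

def key_events(series, threshold):
    """Indices where the hour-to-hour change exceeds `threshold`."""
    return [i for i in range(1, len(series))
            if abs(series[i] - series[i - 1]) > threshold]

# Hourly energy readings (kWh): mostly flat, with one spike
# (e.g., an HVAC unit behaving abnormally) -- values are made up.
readings = [50, 51, 50, 52, 90, 91, 52, 51]
print(key_events(readings, threshold=10))   # → [4, 6]
```

Comparing only the flagged regions (here, the onset and end of the spike) rather than every reading is one way to shrink the data involved in cross-simulation comparison.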

Epidemics, water, and traffic analysis

Similar techniques for predicting the impact of epidemics are also in development, Candan says. Epidemic data aren't as precise or plentiful, but a similar technique of running large numbers of simulations can be used when preparing for the emergence of a disease. He's also looking into a partnership with the ASU School of Sustainability to expand the technique for water-energy management, and it could be used for other applications such as analyzing traffic patterns for city planning decisions.

For more information on Candan's research, visit

Monique DeVoe (Managing Editor)