As the burden of building data gets heavier, strategies for managing it take to the sky, so to speak. Whether it’s sorting your services smartly or learning to use SCADA or OPC, modern building automation aims to help you succeed by keeping your head in the clouds and your feet on the ground.

As the convergence communities reinvent the BAS industry, our mission becomes critical. Convergence is changing everything for everyone, leading us down the same data paths while using our data as it has never been used before. As our past technology silos break down, our businesses are exposed, and what were mandatory requirements for one industry have now become mandatory for all in the data chain.
Although BAS have not yet become fully mission critical, these systems inch daily toward that end as new money drives buildings to become an active part of the national electrical grid through DR (demand response) and GridWise thinking. We can consider BAS, which traditionally resided on the ground, as now taking flight as their data becomes part of the information clouds of the enterprise and beyond. Silos of building automation technologies fall daily, while data rises into the communication clouds. As this BAS data is used in ways we never imagined, we must ask what mission critical means, for our mission has clearly changed.

BAS Is Flying High

Toby Considine, a systems specialist with facility services at the University of North Carolina at Chapel Hill, recently wrote on the subject in his column titled, “Clouds and Rain”: “I think this is right. For buildings, only the core processes (those elements on the traditional low-voltage protocols such as BACnet and LON) are on the ground.

“Enterprise energy monitoring and building control, then, are in the low-lying cumulus clouds. A well-architected system does not put the EMCS center in the center of any control loops. TCP/IP is, by design, a non-deterministic protocol, meaning it does not belong inside a control loop. Anything off the ground is in the clouds. Anything in the clouds should interact using internet protocols.
“In the UNC enterprise building management system (EBMS) project, we restrict all low-level controls to the building. All communications outside the building use internet protocols. Each building has its enterprise building local gateway (EBLG) speaking traditional standards and proprietary protocols on the building side, and Web services on the outside. We keep the EBMS close to the buildings as a business decision, but there is nothing in the architecture that would prevent us from moving this service up into the higher stratus cloud layer, or even up into the high-flying cirrus layer.
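The gateway pattern Considine describes, building protocols on the inside and Web services on the outside, can be sketched minimally in Python. Everything here (the point names, the payload shape, the building identifier) is a hypothetical illustration, not actual UNC EBLG code:

```python
import json

# Hypothetical building-side point readings, as a gateway might
# collect them over BACnet or a proprietary protocol.
building_points = {
    "AHU-1.SupplyTemp": 55.4,    # deg F
    "AHU-1.FanStatus": 1,        # 1 = running
    "Zone-101.SpaceTemp": 72.1,  # deg F
}

def to_web_service_payload(building_id, points):
    """Translate raw point data into a protocol-neutral JSON payload
    suitable for publishing above the gateway as a Web service."""
    return json.dumps({
        "building": building_id,
        "points": [{"name": n, "value": v} for n, v in sorted(points.items())],
    })

payload = to_web_service_payload("EBLG-042", building_points)
```

The point is the one-way translation: nothing above the gateway ever sees a low-voltage protocol, only a standards-based document.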

“The middle tier of stratus clouds is outside facilities operations and hosted in the wider enterprise. We plan for the Registrar’s Office, in the stratus cloud, to submit room schedules and head counts for every classroom down to the buildings. For now, this communication will have to be with the cumulus layer, but we would like to push it down to the ground at the building gateway.
“We have long used building analytics products like Packrat at UNC, bolted onto the silo. It would be far better for these services to live in the cirrus clouds, under the direct control of someone with the in-house expertise to process the data into information. The processing necessary to turn operating data into predictive maintenance work orders is intense, but only needed sporadically. The whole purpose of cloud computing is to reduce costs by sharing expertise and resources so they are fully utilized. Building analytics should move up into the highest clouds, with the highest expertise.

“The remotest services all belong in the cirrus clouds: DR, energy markets, third-party maintenance, all are cirrus-tier cloud services.
“Keep some clouds close to you, ones in which fast response and control are the most important. Let some clouds drift up into the atmosphere, where forces outside your control may determine their performance and availability, but where superior resources or specialized knowledge can be purchased. And put services where enterprise identity and line-of-business interaction are the most important in the stratus layer.

“Just remember, changing business conditions can move any cloud up or down. The protocol for communication to any cloud layer should be the same: internet-ready, secured, and standards-based, ready for e-commerce. Nothing but Web services belongs anywhere in the clouds.”
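Considine’s three-tier taxonomy lends itself to a simple lookup. The sketch below assigns the services he mentions to their tiers; the data structure and function name are illustrative assumptions, not part of the UNC design:

```python
# Tiers from the column: cumulus (close to the building, fast response),
# stratus (enterprise line-of-business), cirrus (remote, purchased expertise).
CLOUD_TIERS = {
    "cumulus": {"enterprise energy monitoring", "building control"},
    "stratus": {"room scheduling", "enterprise identity"},
    "cirrus":  {"demand response", "energy markets",
                "third-party maintenance", "building analytics"},
}

def tier_of(service):
    """Return the cloud tier a service belongs to, or None for
    ground-level functions (the low-voltage control loops)."""
    for tier, services in CLOUD_TIERS.items():
        if service in services:
            return tier
    return None
```

As the article notes, changing business conditions can move any entry between tiers; only the ground-level control loops stay fixed.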
I hope Toby’s words have not completely clouded your thinking, but instead have provided a clearer view of where function and form should lie in a well-designed IT architecture, and of how mission-critical functions may interact.


There is a strong trend these days to use industrial-strength cloud connectors that still have the ability to reach the ground.
From another article, this one written by Ronald D. Padilla of Control Technologies, titled, “Using SCADA Systems in an HVAC Environment,” I have extracted the following:

“If you take the time to analyze the features and power of these SCADA (supervisory control and data acquisition) systems, you will realize that these systems are actually a perfect fit for these types of applications in the HVAC environment. These were never considered in the past, since all DDC systems for the HVAC industry were strictly proprietary with custom code and protocols developed by the manufacturers from the controller level to the enterprise level as one system (Author’s note: This is the information silo of which we speak.). There weren’t any opportunities for owners, nor integrators for that matter, to even consider looking into any type of driver development since there would certainly be nothing but resistance from the manufacturers, and perhaps ensuing legal action.

“The industrial market has had open systems for over 20 years, and all of the manufacturers develop products and services to complement these systems. In many cases, it’s a specialty item that is developed using standard protocols so that customers will welcome the opportunity to add this to their network. It was common practice for integration companies to bid on projects for services against each other for the same customer over and over again, unlike the commercial controls industry where the contractor who won the first phase of a project typically “locked in” the customer for a 10- to 15-year period.

“This practice goes against our core values of what we believe in as individuals, owners of businesses, and Americans. Our nation is one that awards us choices in everything we do, so it is a natural progression to provide for this in our facility management systems. Today, contrary to what some companies would like you to believe, that means open systems utilizing protocols such as BACnet®, LonTalk®, and Modbus on the device level, with SCADA enterprise level systems at the top of the network architecture.”
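Of the device-level protocols Padilla names, Modbus is the simplest to illustrate at the wire level. This sketch builds a Modbus RTU read-holding-registers request (function code 0x03) by hand, including the CRC-16/Modbus checksum; the device address and register range are arbitrary examples:

```python
import struct

def crc16_modbus(frame: bytes) -> int:
    """CRC-16/Modbus: reflected polynomial 0xA001, initial value 0xFFFF."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def read_holding_registers(unit_id: int, start_addr: int, count: int) -> bytes:
    """Build a Modbus RTU request for function 0x03 (Read Holding Registers)."""
    pdu = struct.pack(">BBHH", unit_id, 0x03, start_addr, count)
    # Modbus appends the CRC low byte first, hence the little-endian pack.
    return pdu + struct.pack("<H", crc16_modbus(pdu))

# e.g., ask device 1 for two registers starting at address 0
frame = read_holding_registers(1, 0, 2)
```

That an eight-byte frame fully specifies the request is exactly why open device protocols invited the competitive integration market Padilla describes.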

A New Communication Protocol

AutomatedBuildings.com recently published an interview with Tom Burke, president of the OPC Foundation, and Sean Leonard, vice president of products for MatrikonOPC. The interview, titled “What is OPC and How Will it Play a Key in the Future of Building Automation?”, describes this new communication standard. From this excerpt, you will see that OPC is an industrial-strength vehicle capable of information exchange in a mission critical environment.

Sinclair: Tom, OPC is penetrating almost every industry. Those of us in the building automation industry are particularly interested in OPC. Can you give a brief overview of this technology?

Burke: OPC is a communication standard for the transfer of process data. It enables various hardware devices and software applications to pass data between one another. OPC eliminates the need for custom drivers, which means that any two OPC applications can speak with each other … right out of the box.

Leonard: OPC allows many existing visualization, automation, and control applications to seamlessly integrate with devices that speak standard HVAC protocols like BACnet, Johnson Controls, and LonWorks. This enables data to be collected from distributed sources and translated into valuable information that can be presented directly to the decision makers in real time.

Sinclair: We’ve all heard of the new evolving standard for OPC called OPC Unified Architecture (OPC UA). How does OPC UA fit in?

Burke: OPC first came out in 1996 and was very successful at penetrating almost every level of industrial operations. So now, companies want enterprise-level connectivity. And that’s the focus of OPC UA. It will join the ranks of OPC DA for real-time data access, OPC HDA to access historical data, OPC alarms and events, and so on. So we unified it all by combining all the specifications into a single OPC standard called OPC Unified Architecture, or OPC UA for short.

OPC UA provides enterprise-level connectivity, facilitating secure, reliable interoperability. It is the latest in the list of OPC specification standards and is based on Web services to make it independent of the operating system. It also has a rich data model to accommodate the complex data types that enterprise applications require. These applications include ERP, CMMS, production accounting, laboratory information management systems (LIMS), and so on. It also includes support for real-time as well as historical data. OPC UA is a highly scalable architecture; it’s about facilitating deployment from the shop floor to the top floor.
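Burke’s “rich data model” can be pictured with a toy hierarchical address space: nodes with browse names, values that may be structured, and children that form a tree. The classes below are a simplified illustration of the idea, not the actual OPC UA information model or any vendor SDK’s API:

```python
class Node:
    """A toy node in an OPC UA-style address space: a browse name,
    an optional value, and child nodes forming a hierarchy."""
    def __init__(self, browse_name, value=None):
        self.browse_name = browse_name
        self.value = value
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def browse(self, path):
        """Resolve a '/'-separated browse path to a node's value."""
        node = self
        for name in path.split("/"):
            node = next(c for c in node.children if c.browse_name == name)
        return node.value

# A structured (complex) value, of the kind enterprise applications need.
root = Node("Objects")
chiller = root.add(Node("Chiller-1"))
chiller.add(Node("SupplyTemp", 44.2))
chiller.add(Node("Status", {"running": True, "alarms": []}))
```

The ability to browse for a structured status object, rather than poll a flat register, is what separates an enterprise-grade model from a device protocol.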

BAS Must Keep Its Cool

BAS become mission critical when the job is to keep data centers cool. Fred Kemp, vice president of marketing and sales for Invetex Corporation, states in a recent article:
“IT/facilities management professionals are faced with the challenge of securing a comprehensive solution that protects their mission critical IT assets. Today’s data centers are complex and handle an immense workload, and any failure results in significant costs throughout the entire organization.

“IT/facilities managers should monitor critical issues like temperature, power, water leaks and flooding, humidity, smoke, airflow, and room entry. According to Sigourney, these conditions, typically referred to as the “Big 7,” can easily bring any data center to a complete stop in minutes if problems arise. These IT environmental concerns are hot topics in any computer trade publication today and will remain so as computer rooms and data centers continue to change in design, form, and function. Power consumption is rising significantly as devices get smaller and more powerful, allowing users to cram more and more hardware into a single rack, and that hardware requires significantly more energy to keep cool. If a problem does arise, it now impacts IT uptime and integrity in just minutes instead of hours or days, as was the case before the year 2000.

“The “Big 7” of environmental monitoring are:
  • Temperature (high/low)
  • Main and UPS power (interruptions)
  • Flooding/water (water leaks, air conditioning condensation)
  • Humidity (high and low)
  • Smoke/fire
  • Room entry/motion
  • Airflow (A/C or fan status)
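A monitoring system for the “Big 7” reduces, at its simplest, to threshold checks on analog points and status checks on binary points. The sketch below is illustrative only; the limits and point names are placeholders, not recommendations from the article:

```python
# Hypothetical alarm limits for the analog "Big 7" conditions: (low, high).
LIMITS = {
    "temperature_f": (60.0, 80.0),
    "humidity_pct": (40.0, 60.0),
}
# Binary conditions: any truthy reading is an alarm.
BINARY_ALARMS = {"water_leak", "smoke", "power_fail", "door_open"}

def check(readings):
    """Return a sorted list of the alarm conditions present in a
    dict of sensor readings."""
    alarms = []
    for name, (low, high) in LIMITS.items():
        value = readings.get(name)
        if value is not None and not (low <= value <= high):
            alarms.append(name)
    alarms += [a for a in BINARY_ALARMS if readings.get(a)]
    return sorted(alarms)
```

This is the proactive posture the article argues for: evaluate every reading as it arrives, rather than react after the room is already down.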
“Recent studies in the area of IT environment concerns and practices have led Forrester Research, one of the world’s leading independent technology and market research companies, to state that it expects IT environment monitoring to become an $11 billion industry by the year 2010. Forrester states that a reactive approach is not cost effective and incurs too much downtime; automation is the answer. The need for this technology is obvious, and the benefits become highly apparent to IT/facilities managers the first time a problem is experienced.”

The traditional BAS industry has become part of the convergence cloud, and if we choose not to be the change agents who mold our future, there are plenty who will. Remember, our mission is now very critical on many levels. ES