Guest post by James R. Benya, PE, FIES, FIALD
I recently wrote (LD&A, May 2018, “Energy Advisor”) about how lighting energy allowances under current codes are now approaching a practical minimum, thanks to significant improvements in source efficacy, optical efficiency, and controls that automatically extinguish lighting when it is not needed. Yet the efficiency-über-alles drumbeat continues: if we use the most advanced technology and analytics, we will save so much more. It’s time to question this assumption: how can we save “so much more” than almost nothing?
There was a time when lighting was not controlled well, if at all, but that was 1973. With current technology and new construction codes, the primary causes of poor lighting control today are bad design, unenforced codes, and worst of all, improper commissioning. By requiring certified controls acceptance testing, California’s Title 24 significantly improves the likelihood that controls will be built per code and work as designed. Now what?
Meanwhile, industry, looking ahead to develop new products that can offer irresistible benefits, still defaults to the efficiency drumbeat. But under current codes this means trying to save only a few milliwatt-hours of energy per square foot, because the big savings are already being produced by occupancy/vacancy sensors, manual switching and dimming, and daylight harvesting. It’s reasonable to doubt that additional savings can be achieved cost effectively, especially when lighting power density will soon be less than 0.35 W/sf for most non-retail space types. If you can’t save energy, and controls can already dim and change color with ease, what other new tricks can “smart” controls perform that building owners understand, care about, and will pay for?
Let’s first eliminate demand response (DR) from future ideas. It was just a stopgap solution for the problems facing grid reliability when statewide electricity demand approached generating capacity. This may have been valid when lighting was 1.5 W/sf and controls were mostly switches, but no longer. To the contrary, increasing the lighting load may now be desirable for grid stability due to the widespread use of solar PV panels. Energy savings was never the real reason for DR, but now, how does increasing lighting power use pay for itself? Moreover, maintaining the ability to dynamically change building load may have benefit, but why choose lighting? Why not a relatively larger energy user?
Also, let’s challenge the idea that “analytics” are important in lighting energy management in new buildings. What will Watson tell you, exactly? That you can save a kilowatt hour per day by turning off the lights entirely in 20 offices? We run the risk of spending millions to micromanage milliwatts.
The best idea lives in the evolution of lighting controls into a building-wide management network. Why waste all those wires or bandwidth on so little useful data flow? Integrated controls could allow for the plug-and-play connection of all types of devices, from thermostats and air quality sensors to energy measurement. Maybe this “Building Internet of Things (BIoT)” could even allow services like asset tracking that don’t need high speed communications or full Internet connectivity. There could be powered “smart” outlets, using power derived from smart drivers or power packs, that could permit plug-and-play sensors or transceivers connecting the network to a host of wired and wireless devices – without the inherent issues of logging into a general purpose WiFi or Bluetooth network, network security, and similar challenges facing the IoT as envisioned today.
I am hardly the only person to envision this. Last year, several manufacturers formed the IoT Ready Alliance, a non-profit organization created to develop standards and protocols for this “BIoT”. Unfortunately, only a few firms in the controls industry are members, while the rest of the industry is becoming increasingly proprietary. This reminds me of the early days of personal computers or networking, when almost nothing was interchangeable. I argue that for a whole industry to make products enticing to the largest market, there must be a basic level of compatibility. Like USB for computers, good examples will include standard connectors, DC voltages, communications network wiring and other strategies that promote interchangeability. There are at least two primary reasons why this is good: 1) standardization means standard design of key components, installation, programming and testing protocols, and 2) consumers will benefit from plug-and-play alternatives when companies fail or stop producing or supporting products.
The usual prognosticators of IoT and lighting rely heavily on touting the energy savings that big data and analytics will bring. With lighting energy use asymptotically approaching practical minimums, this is rather silly (at least for lighting). Ten years ago, I tried to convince the California Energy Commission to embrace network lighting controls in Title 24, because I knew they could reduce energy use 35-50%, and I proved it with several successful projects. But two things made this especially relevant then: the relatively high lighting power density (more than twice what can be used today) and the control problems of fluorescent and compact fluorescent sources, especially with respect to lamp life impacts, decreasing efficacy at low levels, and poor low-end performance. None of these problems exist with LEDs. Today, with practical lighting power densities at 0.35 W/sf, a 10% power savings is a “whopping” 0.035 W/sf, only 350 watts over 10,000 sf of a typical building. Over a year, the savings will amount to about 1,050 kWh – a mere $125 (1.25 cents per square foot). Assuming an incremental first cost of about $1.00/sf for network and analytics, the payback period will be 80 years. Even if operating time were somehow reduced by 50%, the payback period would “only” be 40 years. In most buildings this proposition will not survive value engineering or, in most cases, the laugh test.
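The arithmetic above can be checked with a quick back-of-the-envelope script. The operating hours (~3,000 h/yr) and electricity rate (~$0.12/kWh) are not stated in the text; they are back-solved from the article’s own figures of 1,050 kWh and $125, so treat them as assumptions:

```python
def payback_years(lpd_w_per_sf, savings_fraction, area_sf,
                  hours_per_year, rate_per_kwh, incremental_cost_per_sf):
    """Simple payback for a lighting controls upgrade."""
    saved_watts = lpd_w_per_sf * savings_fraction * area_sf   # 0.35 * 0.10 * 10,000 = 350 W
    saved_kwh = saved_watts * hours_per_year / 1000           # 350 W * 3,000 h = 1,050 kWh
    annual_savings = saved_kwh * rate_per_kwh                 # ~ $125/yr
    first_cost = incremental_cost_per_sf * area_sf            # $1.00/sf * 10,000 sf = $10,000
    return first_cost / annual_savings

# Inputs from the paragraph; hours and rate are assumed, back-solved values.
years = payback_years(0.35, 0.10, 10_000, 3_000, 0.119, 1.00)
print(round(years))  # roughly 80 years
```

Halving operating hours halves the annual savings, which is why even the optimistic case only drops the payback to about 40 years.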
On the other hand, I believe that within 10 years, lighting in all new commercial construction will be DC. Whether it is PoE or something else, it will cost next to nothing to provide auxiliary power and the data network at every luminaire, without the need for wireless systems or separate data networks. Imagine every luminaire being a port – or at least a place where a low voltage cable or wireless link can be plugged in and work almost without programming. In addition to the wiring cost advantage, the BIoT is distributed almost for free, and suddenly it will be cost effective to micromanage even milliwatts.
My fear is that, like most trends in technology these days, this idea could fall into the bottomless pit of WiFi, LiFi and other means of making sure that a “smart” phone can connect to it. With that comes all the problems of the Internet. I suppose the BIoT network could ultimately connect to the world, but if we are smart, it will use gateways so that we don’t overdesign the BIoT chasing the unnecessarily fast data speeds and related protocols, security issues, addressing schemes and other baggage of conventional IP communication networks. For once in this overhyped world, it might make sense to be simply sensible – clever, rather than too “smart” for our own good.
With thanks to Clifton Lemon for great comments and editing.