
WSNBuzz

Fresh news on smart grid and green technologies

US Energy Department Invests $6 Million to Support Commercial Building Efficiency

The Energy Department today announced up to $6 million in funding to deploy and demonstrate four emerging energy-saving technologies in commercial buildings across the country. These projects will help businesses cut energy costs through improved efficiency, while also reducing carbon pollution. Last year, commercial buildings consumed about 20 percent of all energy used in the United States at an estimated cost of nearly $180 billion, and were responsible for 18 percent of total U.S. carbon emissions.

The projects announced today will generate data, case studies, and information intended to help commercial building owners adopt new energy-efficient technologies, including advanced ventilation, building energy use optimization software, more efficient commercial refrigeration fan motors, and advanced lighting controls. The selected projects include:

  • enVerid Systems (Houston, Texas) – enVerid will retrofit building ventilation systems with modules that remove indoor air pollutants such as carbon dioxide. This enables the indoor air to be recycled while greatly reducing the amount of outside air ventilated into the building and reducing the loads on the heating, ventilation, and air conditioning (HVAC) system. Facilities could experience significant energy savings with this retrofit technology. Ten separate commercial building demonstrations will be conducted over 3 years.
  • BuildingIQ, Inc. (Foster City, California) – BuildingIQ will optimize HVAC energy use across commercial buildings using Predictive Energy Optimization (PEO), a cloud-based software application that runs on top of existing building automation systems. PEO uses data from weather forecasts, utility tariffs, demand response event signals, and occupant schedules to automatically adjust energy-consuming building systems. These adjustments are based on building-specific models that PEO refines over time from building use data, combined with predictive algorithms and advanced control strategies (a simplified sketch of this kind of scheduling logic follows the list below). Sixteen separate building demonstrations will be conducted.
  • QM Power, Inc. (Lee’s Summit, Missouri) – QM Power has developed high-efficiency 7-16 watt fan motors of the type commonly used in commercial refrigeration systems. QM Power intends to install and demonstrate approximately 12,000 high-efficiency fans in more than 50 grocery stores throughout the U.S., focusing on open display case retrofits that could result in significant efficiency improvements. If fully adopted, the motor application has the potential to achieve more than 0.6 quads in energy savings and reduce energy costs by $1 billion.
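As a rough illustration of the kind of predictive scheduling described for BuildingIQ above, and not its actual PEO algorithm, the sketch below picks hourly cooling setpoints from a hypothetical forecast, tariff and occupancy schedule; all names, thresholds and numbers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class HourAhead:
    hour: int                 # hour of day (0-23)
    forecast_temp_c: float    # forecast outdoor temperature, deg C
    tariff_per_kwh: float     # utility price for this hour
    occupied: bool            # from the occupant schedule

def plan_setpoints(horizon, comfort_c=22.0, setback_c=26.0, precool_c=20.5):
    """Toy heuristic: pre-cool during a cheap, unoccupied hour right before an
    occupied, hotter and pricier hour; relax the setpoint when the building is
    empty; hold a comfort setpoint when it is occupied."""
    plan = []
    for i, h in enumerate(horizon):
        nxt = horizon[i + 1] if i + 1 < len(horizon) else None
        if h.occupied:
            plan.append((h.hour, comfort_c))
        elif (nxt and nxt.occupied and nxt.forecast_temp_c > 28.0
              and nxt.tariff_per_kwh > h.tariff_per_kwh):
            plan.append((h.hour, precool_c))
        else:
            plan.append((h.hour, setback_c))
    return plan

horizon = [
    HourAhead(13, 27.0, 0.10, False),
    HourAhead(14, 30.0, 0.25, True),
    HourAhead(15, 31.0, 0.25, True),
]
print(plan_setpoints(horizon))  # [(13, 20.5), (14, 22.0), (15, 22.0)]
```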

More here.

Opower Enters Rare Partnership With FirstFuel to Expand Into Commercial Building Efficiency

Opower doesn’t often enter into strategic partnerships with companies that overlap with its business. The behavioral efficiency company has been very successful landing utility partners so far, and other than starting a relationship with Honeywell for a smart thermostat deployment, it hasn’t needed to rely on others to expand.

That’s why today’s announcement that Opower has teamed up with the Lexington, Mass.-based commercial building analytics company FirstFuel is a fairly big deal.

Over the last six months, Opower and FirstFuel have been working to tie their user interfaces together to give clients a seamless analytics offering across the residential, small commercial and large commercial segments. Opower has solid footing in the residential sector, where it pioneered the use of behavioral messaging to cut home energy use. The company has also been building out its analytics for small businesses. But it hasn’t yet moved into large commercial buildings, a segment in which only a small number of customers currently participate in utility efficiency programs.

FirstFuel was initially founded as a virtual auditing company that uses meter data, mapping tools, weather data and occupancy data to create models of how commercial buildings are performing without ever needing to step inside. The company said it has identified more than a half-billion dollars in low- or no-touch efficiency opportunities at six times the speed of ordinary audits.

Investors and utilities have responded favorably to FirstFuel’s approach. The startup has raised nearly $21 million since launching in 2011 and expanded to eighteen utility partners and five government agencies. FirstFuel has also expanded into efficiency services by partnering with large program administrators to use its analytics to help execute projects for utilities. (Boston-based competitor Retroficiency has expanded in a similar way, inking new deals with utilities and efficiency providers.)

Opower also took notice of FirstFuel’s capabilities, which filled a gap in the company’s offerings. After working for some of the same utilities over the last two years, including E.ON, National Grid and PG&E, the executive teams at Opower and FirstFuel realized how much their analytics engines and approaches to the market overlapped.

More here.

Enhancing Data Centre Control over Energy Use with Real-Time Energy Context Awareness

Anthony Schoofs, CTO
Wattics Ltd
September 15, 2014

Cloud applications are becoming an increasingly prevalent model for delivery of services to individuals and organisations around the world. The widespread use of this model, in turn, translates into an increased need for Data Centres, establishing them as one of today’s major consumers of electricity and as one of tomorrow’s threats to grid supply stability.

The energy consumption of Data Centres has become a significant issue for both the industry and policy makers. Energy efficiency best practices are already widely adopted in the design and choice of technology for controlling the cooling of computing infrastructures, for monitoring energy use and discovering inefficiencies, and for supporting integration of renewable energy generation locally. The impact of these, however, is not sufficient to counterbalance the continuously increasing need for more Data Centres and greater electricity consumption.

As part of the modernisation of the grid, new opportunities have emerged for Data Centres to become a major player in their local environment – particularly in a smart city context – in order to benefit from access to both additional supply capacity to cover their own power requirements and an electricity marketplace in which to resell surplus instantaneous energy. The GEYSER project is working towards realising such coordination by defining a framework which enables Data Centres to optimise their energy use and at the same time offer demand response services towards the smart city and the grid. A key element of the approach is an in-depth understanding of the Data Centre’s current energy profile, in terms of both consumption and production: this is necessary to apply the right level of control to maximise the energy efficiency of the system.

The limitations of conventional monitoring for supporting load management and demand response mechanisms

In a typical data centre, IT equipment consumes about 50% of total energy use and cooling accounts for about 35%, with the remainder spread across miscellaneous loads. The conventional measure of a Data Centre’s energy efficiency is based on a set of metrics, such as the Power Usage Effectiveness (PUE), which compares the total power drawn by the facility against the power used by the IT equipment. Calculating such energy performance metrics typically requires a combination of time-correlated power measurements made at multiple points across the electrical installation, e.g. the UPS and cooling unit circuits.
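As a simple illustration of how the metric is derived from such time-correlated measurements (the readings and breakdown below are hypothetical, not GEYSER data), PUE can be computed as total facility energy divided by IT equipment energy:

```python
def pue(samples):
    """Power Usage Effectiveness from time-correlated hourly readings.
    Each sample is (it_kw, cooling_kw, other_kw); PUE is total facility energy
    divided by IT equipment energy, so a value near 1.0 means little overhead."""
    it_kwh = sum(it for it, _, _ in samples)
    total_kwh = sum(sum(sample) for sample in samples)
    return total_kwh / it_kwh

# One day of hourly readings: 500 kW of IT load, 350 kW of cooling and about
# 150 kW of miscellaneous load, matching the rough 50% / 35% breakdown above.
readings = [(500.0, 350.0, 150.0)] * 24
print(round(pue(readings), 2))  # 2.0
```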


Figure 1: Measuring and correlating the total amount of energy used by components of both the Data Centre facility and the computing infrastructure enables the calculation of the Power Usage Effectiveness metric to determine the energy efficiency of the data center.

Read the rest of this entry »

Google Hooks Startups With $100K Worth of Free Cloud

Google is offering early-stage startups that meet certain criteria $100,000 worth of services available on the Google Cloud Platform, which includes everything from Infrastructure- and Platform-as-a-Service to Database-as-a-Service and APIs for a handful of application services.

The company says the move is aimed at attracting more developers to its cloud. Startups with relatively complex applications that do make the move, however, are likely to stay with Google when their credit runs out if they see business momentum, since shifting applications from one type of infrastructure to another is a complicated and lengthy process.

Urs Hölzle, senior vice president of technical infrastructure at Google, announced the program Friday at the Google for Entrepreneurs Global Partner Summit.

Google is competing tooth-and-nail with Amazon Web Services and Microsoft Azure for public cloud market share. All three have been continuously reducing cloud prices and growing feature sets available on their cloud platforms, but AWS IaaS services continue to be in the lead in terms of popularity.

There is also a lot of competition for developer dollars in the PaaS space, which has more big players, such as Salesforce’s Force.com and Red Hat’s OpenShift. The PaaS market is becoming more crowded with the push behind Cloud Foundry, an open source PaaS created by EMC’s Pivotal and adopted by a number of big IT players, including HP and IBM.

More here.

5 Ways Using Analytics can Help Improve the Smart Grid

5. They can help manage the load balance.
As grids age, properly balancing the power distribution can sometimes be a bit tricky. However, by using smart grid analytics, power can be distributed as needed in an effective manner. Jeff McCracken, Senior Product Line Manager at Itron Analytics, believes analytics can address many current issues with power grids.
“While not a panacea for all challenges, transformer load management analytics can utilize smart-meter data and actual weather models to continuously monitor and analyze distribution transformer loading levels and report on asset health, helping utilities make informed decisions to balance loads,” he writes.

4. They can help forecast severe weather events and other emergencies that could disrupt the grid.
One of the biggest issues with aging grids is that anything can take them down at any given time. With analytics, these events can be better predicted and prepared for. Severe storms are increasingly frequent, making grid outages more common. McCracken believes analytics are highly useful for strengthening preventative measures.
“Analytics enable operators to monitor and report the exact times of service interruption at each system endpoint and use these results to measure improvement in restoration time from automated distribution processes,” he writes. “This allows utilities to identify and restore outages more rapidly, without having to rely on customer inquiries.” This means faster response times and faster repairs.
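A minimal sketch of the endpoint-level reporting McCracken describes, pairing hypothetical power-off and power-on timestamps per meter to report interruption durations; this is illustrative only, not a real AMI feed or any vendor’s implementation:

```python
from datetime import datetime

def restoration_report(events):
    """Pair each endpoint's power-off and power-on timestamps and report the
    interruption duration, plus the longest outage seen across all meters."""
    report = {}
    for meter, (off_at, on_at) in events.items():
        report[meter] = on_at - off_at
    return report, max(report.values())

# Hypothetical meter events from a single storm-related outage.
events = {
    "meter-17": (datetime(2014, 9, 10, 14, 2), datetime(2014, 9, 10, 15, 45)),
    "meter-52": (datetime(2014, 9, 10, 14, 3), datetime(2014, 9, 10, 16, 10)),
}
durations, worst = restoration_report(events)
print(durations["meter-17"], worst)  # 1:43:00 2:07:00
```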

More here.

Using Analytics to Optimize the Grid

It’s generally well known across the industry that smart metering and advanced metering infrastructure (AMI) provide large amounts of critical data to utility companies. However, many companies are not equipped to fully utilize the benefits and opportunities this data provides. To balance loads, improve outage response times and better serve customers, utilities must also employ analytics that realize the full value of smart meter and AMI data and give them tangible insight into their operations. In short, simply gathering data isn’t enough. Utilities must also turn data into actionable intelligence.

Analytics are the foundation of the next-generation utility, improving the information available both internally to the utility and externally to customers. Analytics draw on the two-way AMI data already used for programs such as remote connect/disconnect and customer access to load profile data. With analytics, utilities can balance loads, enhance outage restoration, improve resource efficiency and streamline financial analysis.

Load balancing and outage restoration

Factors such as changing weather patterns, aging infrastructure, and increased adoption of electric vehicles and other new technologies are creating new challenges for utilities. While not a panacea for all challenges, transformer load management analytics can utilize smart-meter data and actual weather models to continuously monitor and analyze distribution transformer loading levels and report on asset health, helping utilities make informed decisions to balance loads.

These analytics also include scenario analysis capabilities that allow operators to predict the impact that new loads — such as electric vehicles — will have on transformers. In addition to transformer load management, voltage monitoring assesses voltage at every delivery point in the distribution network, allowing analysts to evaluate and understand the impacts of customer loads, variable energy sources or distributed energy. Trends from voltage monitoring can be synthesized to drive system improvements holistically, using the measured data rather than relying on individual customer complaints or system models.
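As a rough sketch of what transformer load monitoring from interval meter data can look like (illustrative only, not Itron’s analytics; the meter-to-transformer mapping, ratings and readings below are hypothetical):

```python
from collections import defaultdict

def transformer_loading(meter_reads, meter_to_transformer, rating_kva):
    """Aggregate interval smart-meter demand (kW) per distribution transformer
    and flag intervals where loading exceeds the nameplate rating.
    meter_reads: {meter_id: [kW per interval]}; assumes unity power factor."""
    loads = defaultdict(lambda: defaultdict(float))
    for meter, series in meter_reads.items():
        xfmr = meter_to_transformer[meter]
        for interval, kw in enumerate(series):
            loads[xfmr][interval] += kw

    alerts = []
    for xfmr, per_interval in loads.items():
        for interval, kw in per_interval.items():
            if kw > rating_kva[xfmr]:
                alerts.append((xfmr, interval, round(kw / rating_kva[xfmr], 2)))
    return alerts

# Three meters behind one 50 kVA transformer; interval 1 is an evening peak.
reads = {"m1": [8.0, 21.0], "m2": [7.0, 19.0], "m3": [6.0, 18.0]}
mapping = {"m1": "T42", "m2": "T42", "m3": "T42"}
print(transformer_loading(reads, mapping, {"T42": 50.0}))  # [('T42', 1, 1.16)]
```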

More here.

The Last Data Center On Earth? The Case for ‘As-a-Service’ Everywhere

Pundits say we’re headed toward a nearly data center-free future in which all computing infrastructure is consolidated in a few über-locations around the globe. The cost of maintaining an on-premises resource is rapidly becoming more than what it would cost to outsource — driving more as-a-service (XaaS) consumption, and fewer data centers.

Certainly the XaaS model is permeating the IT industry and even extending beyond it, to domains such as parking lot lighting as a service, of all things. The business case is tremendously appealing: spending drawn from OpEx rather than CapEx, no need for staff to upgrade and maintain resources, and greater agility born of the ability to provision resources more rapidly and inexpensively on a pay-per-use basis.

We all know why, of course. Using XaaS approaches, organizations can swiftly explore, test and develop new ideas to address rapidly-evolving market conditions without having to embark on substantial infrastructure investments or over-provisioning physical resources. New agile business models are therefore evolving that allow IT managers to be highly reactive to department needs and focus on enabling business operations.

Is Off-Premise the Future of IT?

Undoubtedly, the public cloud will continue to develop rapidly over the coming years, as the expansion of software virtualization and software-defined storage, together with the rise of common standards such as OpenStack, addresses former barriers to widespread adoption, including security, second sourcing, application compatibility and enterprise-grade performance levels.

We can also assume that the public cloud and as-a-Service models will continue to offer unmatched efficiencies for both users and providers, with hyperscale computing as a cornerstone of future infrastructure architecture, and therefore contribute to the dramatic and rapid drop in the number of on-premise IT infrastructures. Forrester Research has recently cited the as-a-Service solutions sector as having the fastest growth of any IT spend category. Beyond that, data center consolidation among service providers and technologies will likely be taking place to gain even further economies of scale.

But even with the majority of workloads moving to the cloud already this year, as the Cisco Global Cloud Index reported, the biggest shift taking place in the enterprise market is toward a hybrid cloud model. A Gartner survey in October 2013 indicated that half of large enterprises will have hybrid cloud deployments by 2017, with HP research suggesting that 75 percent of IT executives plan to pursue a hybrid cloud delivery model by 2016. Even the pundits acknowledge that for certain business needs, enterprise infrastructure size and business requirements will continue to dictate on-premise technologies where the infrastructure remains under an enterprise’s direct and full control.

More here.

Top 10 Companies Innovating Using the Smart Grid

10. Tendril
Smart grid innovation doesn’t always have to be about technology. Tendril’s Energy Services Management platform helps utilities and energy service providers better understand the uses and needs of their customers. The platform even uses gamification in the form of leaderboards, so one user’s usage can be compared to another’s. Tendril wants the platform to help “deliver the right message to the right customer, every time.”

9. Silver Spring Networks
Silver Spring’s Gen4 Communications Module is one of the leading smart grid platforms on the market. It offers cellular connectivity, the ability to form what Silver Spring calls Micromesh, high data rates, and the ability to automatically customize the amount of data usage within the network. Many of these features are unique to Silver Spring, making it a leader in smart grid networking.

8. Opower
Opower’s innovative thermostat management system looks to change the way customers engage with their thermostats using smart grid connectivity. Based on a customer’s usage, the utility will let them know they’re eligible for a smart thermostat. Once they receive the thermostat, the customer can customize their usage to suit their needs. The customer’s utility will track their usage and make setting suggestions, including savings figures for adjusting the thermostat. The thermostat’s app will also let the customer know what others are doing to keep costs down and will make suggestions on how they can proceed.

More here.

Data centers are the new polluters

U.S. data centers are using more electricity than they need. It takes 34 power plants, each capable of generating 500 megawatts of electricity, to power all of the data centers in operation today. By 2020, the nation will need another 17 similarly sized power plants to meet projected data center energy demands as economic activity becomes increasingly digital.

Any increase in the use of fossil fuels to generate electricity will result in an increase in carbon emissions. But added pollution isn’t an inevitability, according to a new report on data center energy efficiency from the Natural Resources Defense Council (NRDC), an environmental action organization.

Nationwide, data centers in total used 91 billion kilowatt-hours of electrical energy in 2013, and they will be using 139 billion kilowatt-hours by 2020 — a 53 percent increase.

This chart shows the estimated power usage (in billions of kilowatt-hours), and the cost of power used, by U.S. data centers in 2013 and 2020, and the number of power plants needed to support the demand. The last column shows carbon dioxide (CO2) emissions in millions of metric tons. (Source: NRDC)

The report argues that an improvement in energy efficiency practices by data centers could cut energy waste by at least 40 percent. The problems hindering efficiency include “comatose” servers, also known as ghost servers, which use power but don’t run any workloads; overprovisioned IT resources; lack of virtualization; and procurement models that don’t address energy efficiency. The typical computer server operates at no more than 12 percent to 18 percent of capacity, and as many as 30 percent of servers are comatose, the report states.
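To put the comatose-server figure in rough perspective, here is a back-of-envelope estimate under illustrative assumptions (server share of data center energy, idle power draw), not the NRDC’s methodology, of how much of the 2013 total those idle machines alone could account for:

```python
def comatose_waste(total_kwh, server_share, comatose_fraction, idle_power_fraction):
    """Back-of-envelope estimate of energy consumed by comatose servers alone.
    Assumes servers account for `server_share` of data-center energy and that a
    comatose server still draws `idle_power_fraction` of an active server's power.
    All fractions are illustrative assumptions, not the NRDC's figures."""
    server_kwh = total_kwh * server_share
    # Weighted average draw: comatose servers at idle, the rest at full power.
    avg = comatose_fraction * idle_power_fraction + (1 - comatose_fraction)
    return server_kwh * (comatose_fraction * idle_power_fraction) / avg

# 91 billion kWh in 2013, ~50% of it in servers, 30% comatose, idle draw ~60% of peak.
print(round(comatose_waste(91e9, 0.50, 0.30, 0.60) / 1e9, 1), "billion kWh")  # 9.3
```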

More here.

ZigBee Certified Products Surpass 1,000

The ZigBee Alliance, a global ecosystem of organizations creating wireless solutions for use in energy management, commercial and consumer applications, today announced that there are now more than 1,000 ZigBee Certified products available to the market, demonstrating the rising demand for the Internet of Things.


ZigBee standards have been implemented in products developed by more than 400 global manufacturers. The rising number of Alliance members and ZigBee Certified products is a reflection of market demand. In fact, analysts from Research and Markets have forecast that the global ZigBee-enabled devices market will grow at a compound annual growth rate of more than 67 percent from 2014 to 2018.

The ZigBee Certified program ensures that quality, interoperable ZigBee products are available for product manufacturers and their customers. The program focuses on two areas: the ZigBee Compliant Platform program and the ZigBee Certified Product program. The ZigBee Compliant Platform program evaluates all platforms before they can be engineered into products, including testing for compliance with a ZigBee specification. The ZigBee Certified Product program tests any product using an Alliance-developed standard, ensuring that the product is fully compliant with the standard. To learn more about the ZigBee Certified program, visit: http://www.zigbee.org/Certification.aspx

“The rise to 1,000 certified products is the result of ZigBee’s prominence in the booming Internet of Things industry,” said Tobin J.M. Richardson, President and CEO of the ZigBee Alliance. “ZigBee is the only standards-based wireless technology designed to address the unique needs of device-to-device communication in just about any market, and the more than 400 member companies consistently certifying new products with ZigBee are a testament to that fact.”

More here.
