Fresh news on smart grid and green technologies
You probably didn’t notice it, but one of the most significant acquisitions in the history of the utility sector took place this week. Hawaiian Electric Industries agreed to sell its utility business to NextEra Energy for $4.3 billion.
The move is significant not because it creates an even larger utility, but because one of the largest renewable energy asset owners in the world is acquiring a utility that is being overrun by renewable energy. Hawaiian Electric has been fighting an onslaught of rooftop solar energy that has transferred generating assets from its own control to its customers. Regulators have even quashed an effort by the utility to slow the growth of solar.
If it chooses to, NextEra could build the next generation utility, one that embraces solar and wind energy while addressing the challenges that come from these energy sources.
Can NextEra Energy create the utility of the future?
Hawaii’s first challenge is that it gets about 75% of its electricity from imported oil. This is incredibly expensive and is the reason Hawaiians paid an average of $0.34/kWh for electricity in September, versus $0.10/kWh nationwide (across all sectors).
Source: U.S. Energy Information Administration.
This high-cost electricity has made solar energy extremely attractive for consumers, who can save money by putting solar panels on their roofs under a lease of around $0.15/kWh with $0 down. But distributed solar systems rely on net metering to send extra daytime electricity back to the grid, and the high penetration of distributed solar has been a challenge for the utility. As a result, Hawaiian Electric has tried to slow solar growth by adding fees and limiting net metering for solar systems.
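The consumer economics above can be sketched as a quick back-of-the-envelope comparison. The rates come from the article; the monthly usage figure is an assumed illustration, not a figure from the source:

```python
# Comparison of Hawaii utility rates vs. a rooftop solar lease, using
# the article's per-kWh figures. Monthly usage is an assumed example.

UTILITY_RATE = 0.34      # $/kWh, Hawaii average (September)
SOLAR_LEASE_RATE = 0.15  # $/kWh, typical rooftop lease, $0 down

monthly_usage_kwh = 500  # assumed household usage for illustration

utility_bill = monthly_usage_kwh * UTILITY_RATE
solar_bill = monthly_usage_kwh * SOLAR_LEASE_RATE
monthly_savings = utility_bill - solar_bill

print(f"Utility bill:  ${utility_bill:.2f}")
print(f"Solar lease:   ${solar_bill:.2f}")
print(f"Savings/month: ${monthly_savings:.2f}")
```

At these rates a household cuts its per-kWh cost by more than half, which is why regulators have been reluctant to let the utility slow solar's growth.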
The UK’s new electricity capacity market suffered a potential setback on Thursday when it emerged that a demand response company is to launch a legal challenge against the mechanism in the European General Court.
Tempus Energy, which is a member of the UK’s Demand Response Association, said the mechanism acts as an “unlawful subsidy” that unfairly favours new generation assets over consumers capable of moving demand away from peak hours.
The market is set to open to auction later this month for delivery of power capacity from Winter 2018. However, the government said it remained “fully confident” in its timetable.
Some operators of new-build plant are eligible to enter this month’s auction for contract lengths of up to fifteen years, while customers offering demand response capabilities are only allowed to enter a separate subsidy scheme that will entitle them to maximum one-year contracts.
EnerNOC, Inc. (Nasdaq:ENOC), a leading provider of energy intelligence software (EIS), and Pulse Energy, a leader in customer engagement software for the utility industry, today announced that EnerNOC has acquired Pulse Energy to help utilities better engage all of their commercial and industrial customers, from small businesses to the largest enterprises.
“Forward-thinking utilities are striving to be trusted energy advisors to their customers by delivering tools that provide greater visibility and control over energy use. The combination of EnerNOC and Pulse Energy will offer utilities the only integrated platform purpose-built to engage utilities’ entire commercial and industrial customer base,” said Tim Healy, Chairman and CEO of EnerNOC. “This acquisition strengthens EnerNOC’s software product offerings for utilities and significantly increases our addressable market.”
Pulse Energy’s software enables utilities to deliver targeted energy saving recommendations in a branded environment, catered to each customer’s unique profile, including business type, location, and energy use. It has detailed analytics models for over 100 commercial customer market segments and is currently deployed by utilities in North America, Europe, and Australia, including BC Hydro, British Gas, Ergon Energy, FortisBC, and Pacific Gas & Electric. Together, EnerNOC and Pulse Energy serve 54 utilities worldwide.
The Bluetooth Special Interest Group (SIG) officially adopted version 4.2 of the Bluetooth core specification this week. Key updates in 4.2 improve privacy and increase speed, and a soon-to-be ratified profile will enable IP connectivity. Bluetooth 4.2 opens up new opportunities for developers, OEMs and the industry to build a better user experience for consumers while creating use cases never before imagined.
“Bluetooth 4.2 is all about continuing to make Bluetooth Smart the best solution to connect all the technology in your life – from personal sensors to your connected home. In addition to the improvements to the specification itself, a new profile known as IPSP enables IPv6 for Bluetooth, opening entirely new doors for device connectivity,” said Mark Powell, executive director of the Bluetooth SIG. “Bluetooth Smart is the only technology that can scale with the market, provide developers the flexibility to innovate, and be the foundation for the IoT.”
Privacy and Security
Bluetooth 4.2 introduces industry-leading privacy settings that lower power consumption and build upon the government-grade security features of the Bluetooth specification. The new privacy features put control back into the hands of the consumer by making it difficult for eavesdroppers to track a device through its Bluetooth connection without permission. For example, when shopping in a retail store with beacons, unless you’ve enabled permission for the beacon to engage with your device, you can’t be tracked.
Bluetooth 4.2 increases the speed and reliability of data transfers between Bluetooth Smart devices. By increasing the capacity of Bluetooth Smart packets, devices transfer data up to 2.5 times faster than with previous versions. Increased data transfer speed and packet capacity reduce the opportunity for transmission errors and cut battery consumption, resulting in a more efficient connection.
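The "2.5 times faster" claim follows from larger packets amortizing fixed per-packet costs. A rough sketch, using the LE data-channel payload limits (27 bytes before 4.2, 251 bytes with the 4.2 data length extension) and a simplified, assumed timing model for framing, inter-frame spacing, and acknowledgements at the 1 Mbps PHY:

```python
# Simplified model of why bigger packets raise effective throughput:
# each packet pays a fixed time cost (framing, inter-frame space, an
# acknowledgement exchange), so a larger payload wastes less airtime.
# The timing constants are simplifying assumptions, not a spec-accurate
# link-layer model.

BYTE_US = 8           # microseconds per byte at the 1 Mbps LE PHY
FRAMING_BYTES = 10    # preamble + access address + header + CRC
IFS_US = 150          # inter-frame space after the data packet
ACK_US = 80 + 150     # empty acknowledgement packet + its IFS

def throughput_kbps(payload_bytes):
    """Approximate application throughput for a given payload size."""
    packet_us = (FRAMING_BYTES + payload_bytes) * BYTE_US + IFS_US + ACK_US
    return payload_bytes * 8 / packet_us * 1000  # kilobits per second

old = throughput_kbps(27)    # pre-4.2 payload limit
new = throughput_kbps(251)   # 4.2 data length extension
print(f"Pre-4.2: {old:.0f} kbps, 4.2: {new:.0f} kbps, "
      f"speedup: {new / old:.1f}x")
```

Under these assumptions the model lands close to the advertised figure: roughly 2.5 times the effective throughput from packet-size growth alone.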
Building on the capabilities released earlier with Bluetooth 4.1 and the new features released in 4.2, the Internet Protocol Support Profile (IPSP) will allow Bluetooth Smart sensors to access the Internet directly via IPv6/6LoWPAN. IP connectivity makes it possible to use existing IP infrastructure to manage Bluetooth Smart “edge” devices. This is ideal for connected home scenarios that need both personal and wide area control. This profile will be ratified by the end of the year.
For the latest Bluetooth 4.2 and IPSP technical details, tools and other information including an FAQ and more, visit: www.bluetooth.com/bluetooth4-2
About Bluetooth® Wireless Technology
Bluetooth wireless technology is the global wireless standard enabling simple, secure connectivity for an expanding range of devices and serves as the backbone of the connected world. Bluetooth Smart technology, through an updatable platform and low power consumption, creates new application opportunities for the mobile phone, consumer electronics, PC, automotive, health & fitness and smart home industries. With nearly three billion devices shipping annually, Bluetooth is the wireless technology of choice for developers, product manufacturers, and consumers worldwide. Backed by industry leading companies, the Bluetooth SIG empowers over 24,000 member companies to collaborate, innovate and guide Bluetooth wireless technology. For more information, please visit www.bluetooth.com.
FirstEnergy Corp. has a traditional view of wholesale electricity markets: They’re a competition between iron-in-the-ground facilities that can put megawatts on the grid when those megawatts are needed. Think coal plants, nuclear reactors and hydroelectric dams.
Missing from the definition is a consumer’s promise to turn off the lights when the grid is stressed — so-called demand response.
Instead of creating energy during peak times, demand response resources conserve it, freeing up megawatts that don’t need to come from generators.
The idea is not new and has been expanding in the territory of PJM Interconnection, a Valley Forge-based grid operator that manages the flow of electricity to 13 states, including Pennsylvania.
FirstEnergy, which owns power plants and utility companies across several states, wants PJM to abandon the demand response concept.
The Ohio-based energy company says demand response, which doesn’t require any kind of capital commitment, is “starving” traditional generation out of its rightful revenue in wholesale markets.
“We feel that it’s going to lead to even more premature closures of power plants,” said Doug Colafella, a spokesman for the firm.
Specifically, FirstEnergy is fighting to get demand response kicked out of PJM’s annual capacity auction, which ensures there’s enough electricity resources to meet projected power demand three years in advance. The auction establishes a single clearing price that will be paid to all successful bidders, like a retainer fee, in exchange for their promise to be available to be called upon three years from now.
During the May auction, which set capacity prices for the 2017-2018 year, the clearing price was $120 a day for each megawatt of electricity bidders committed. About 6 percent, or about 11,000 megawatts, of the capacity secured came from demand response.
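The auction figures translate directly into revenue: the clearing price is paid per megawatt per day across the delivery year. A quick calculation from the numbers in the article:

```python
# Capacity revenue implied by the May auction results cited above:
# the clearing price is paid per MW per day for the delivery year.

CLEARING_PRICE = 120      # $/MW-day, 2017-2018 delivery year
DR_CAPACITY_MW = 11_000   # demand response that cleared (~6% of total)
DAYS = 365

revenue_per_mw = CLEARING_PRICE * DAYS                  # per MW-year
dr_total = CLEARING_PRICE * DR_CAPACITY_MW * DAYS       # all DR combined

print(f"Per MW-year:          ${revenue_per_mw:,}")
print(f"Demand response total: ${dr_total:,}")
```

That works out to about $43,800 per megawatt-year, or roughly $480 million flowing to demand response providers for the delivery year, which is the revenue FirstEnergy argues should be going to traditional generators instead.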
FirstEnergy’s Bruce Mansfield coal-fired power plant in Beaver County failed to clear the auction. The company has since postponed upgrades to the facility, which could jeopardize its functioning beyond 2016.
Capacity payments are a stable source of revenue for baseload generation plants, Mr. Colafella said, and a price signal to the market about which way demand is headed, giving generators some indication about whether new facilities will be necessary and profitable.
Demand response distorts that dynamic, he said.
As we wrap up 2014, it’s time we took a look at some of the biggest cloud technologies that made an impact over the course of the year and thought about cloud predictions for 2015. I’m most likely not going to list all of the technologies that were big this year, so if you feel I missed something, feel free to add it in the comments section!
That said, the focus on the user and on the information delivery model has allowed the modern data center, and cloud infrastructure in general, to really evolve. We’re seeing new methods of optimization, cloud control and entirely new ways of controlling the user experience. And so, what were some of the big technologies that impacted the cloud?
- APIs (cloud apps). This has been a big one. Platforms from VMware, OpenStack, CloudStack, Eucalyptus, and Amazon are all creating easier ways to connect via the cloud. APIs are creating intelligent infrastructure cross-connects to reduce the amount of resources required. APIs at the software and hardware layer will continue to make cloud communication easier on an application and infrastructure level.
- Software-Defined Everything (SDx). We’re really taking off with this whole virtualization concept. Software-defined platforms revolve around the virtualization of specific components. This can be storage, networking, security, or even a data center platform. With technologies like SDN, we’re able to create intricately connected data centers capable of greater resiliency and business continuity.
- The Hybrid Cloud. There is going to be a lot of blurring when it comes to cloud model definitions. The future of the cloud will see almost everyone adopt some type of hybrid cloud platform. Why? Firstly, most organizations are already in the cloud. Secondly, there are a lot of new options for connecting a private cloud with outside cloud resources. More companies are moving just a part of their environment into various cloud options. The reason it’ll all start to blur together is that the management framework is evolving. New cloud management solutions aim to control your cloud regardless of the platform. Hybrid, public, private and even community clouds can all be controlled from a single console.
Managing data centers to keep servers and their applications functioning properly is a core responsibility of every IT manager, as is keeping costs and energy usage to a minimum. CIOs and IT managers don’t have to be cooling experts, but they should know enough to make intelligent management-level decisions about the most cost effective way to manage energy usage in their data center.
Today’s robust servers allow data centers to operate at hotter temperatures. You don’t need to run iceboxes anymore, nor should you need to wear a sweater inside your data center. A smart approach, with simple adjustments or modest investments in the cooling system, can cut the electricity bill for cooling by up to 50 percent at many data centers.
The following questions and best practices will help you identify the best thermal management strategy for your data center, with the end goal of slashing energy costs without reducing availability of critical infrastructure.
Turn Up the Thermostat
Are you hot when you walk through your data center? You probably should be a little. The old standard was 72 degrees for return air (the mixture of air returning from computers to the cooling unit) and relative humidity at 50 percent. Today, you can push return air temperatures as high as 95 degrees (best to do this in small increments to avoid unexpected humidity trouble and to ensure all the IT equipment is functioning properly). This can be done over a few days with little risk to applications and IT equipment. Enlist your facilities manager or vendor partners to assess how to do so safely. Remember, for every 1 degree Fahrenheit increase in temperature you will save 1.5-2.0 percent of your energy costs.
Raise chilled water temperatures. For many years, 45 degrees was the standard for water in the chiller. That’s changing. Operating chillers at up to 55 degrees is possible today, reducing energy consumption by 20 percent. Every degree matters: each 1 degree increase in water temperature reduces chiller energy consumption by 2 percent. This can make a huge difference, since the chiller is the heart of the cooling system and consumes approximately 75 percent of the system’s electricity. Be sure to work with your facilities manager, because raising chilled water set points can reduce the cooling capacity of your data center cooling units, which is fine if you have some excess capacity.
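The two rules of thumb above can be combined into a rough savings estimate. This is an illustration of the stated per-degree rules, not an engineering model, and it ignores compounding between the two measures:

```python
# Rough savings from the article's rules of thumb: each 1F of return-air
# setpoint saves 1.5-2% of energy cost, and each 1F of chilled-water
# setpoint saves ~2% of chiller energy, with the chiller drawing ~75%
# of cooling-system electricity. Illustration only.

def return_air_savings_pct(degrees_f, per_degree=0.0175):
    """Midpoint of the 1.5-2% per degree Fahrenheit rule."""
    return degrees_f * per_degree * 100

def chilled_water_savings_pct(degrees_f, chiller_share=0.75):
    """2% of chiller energy per degree, scaled to the whole system."""
    return degrees_f * 0.02 * chiller_share * 100

# Example: raise return air from 72F to 80F, chilled water from 45F to 55F
air = return_air_savings_pct(80 - 72)
water = chilled_water_savings_pct(55 - 45)
print(f"Return air raise:    ~{air:.0f}% of energy cost")
print(f"Chilled water raise: ~{water:.0f}% of system electricity")
```

Even these modest setpoint changes put double-digit percentage savings on the table, which is how the "up to 50 percent" figure becomes plausible once airflow containment and economizer measures are added.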
Last week, the Federal Energy Regulatory Commission (FERC) released its annual report on enforcement. The report, prepared by FERC’s Office of Enforcement, provides FY2014 statistics on the investigative and enforcement activities conducted by its four divisions—Investigations, Audits and Accounting, Energy Market Oversight, and Analytics and Surveillance. The full report is available here.
In the report, FERC confirms that its investigation and enforcement priorities for FY2015 and the foreseeable future will continue to focus on matters involving (1) fraud and market manipulation, (2) serious violations of reliability standards, (3) anticompetitive conduct, and (4) conduct that threatens the transparency of regulated markets.
Specific statistics in the report include the following:
- FERC opened 17 investigations in FY2014, over half of which involved market manipulation;
- more than half of these 17 new investigations arose from referrals based on conduct observed by FERC surveillance staff or RTO/ISO Market Monitoring Units;
- nine notices of alleged violations were issued by FERC in FY2014, five of which involved alleged market manipulation (two of which have settled); and
- every settlement of an investigation in FY2014 included provisions requiring the subject to enhance compliance programs and report back to FERC on the results of those enhancements.
The report also discussed recent improvements to FERC’s surveillance capabilities, including FERC’s gaining access to data from the Commodity Futures Trading Commission (CFTC) Large Trader Report and FERC’s ongoing efforts to determine if market manipulation contributed to the historically high natural gas and electric prices that occurred during the 2014 “polar vortex” events.
Since 2007, FERC has issued annual enforcement reports that provide insight into FERC’s largely non-public investigation work. These annual reports provide summary statistics of FERC’s entire enforcement program, as well as descriptions of significant recent cases.
Interest in FERC’s investigation and enforcement program has increased since the passage of the Energy Policy Act of 2005 (EPAct 2005), which amended both the Federal Power Act and Natural Gas Act to enhance FERC’s authority to prohibit market manipulation and assess significant penalties where manipulation was determined to have occurred. FERC Chairman Cheryl LaFleur recently expressed a continuing commitment to investigation and enforcement activities, which are expected to receive a boost when FERC Commissioner Norman Bay, formerly the head of FERC’s Office of Enforcement, replaces LaFleur as chairman in April 2015.
The contours of FERC’s authority over market manipulation—as well as the types of activities that constitute fraud or manipulation—are still being defined. In 2013, the DC Circuit ruled that FERC lacks jurisdiction over manipulation of natural gas futures contracts, and that the CFTC instead has exclusive jurisdiction over the trading of derivatives. However, FERC recently signaled its intention to seek legislation to address this jurisdictional dispute and confirm FERC’s jurisdiction over these products. In addition, other investigative subjects currently are challenging FERC enforcement actions based on both due process issues and jurisdictional issues. For example, FERC’s jurisdiction over allegedly manipulative activity related to retail demand response programs appears to be questionable, given a 2014 DC Circuit decision that such programs are outside of FERC’s jurisdiction.
The Zigbee Alliance is consolidating its specifications spanning six application areas into Zigbee 3.0 at a time when competing standards based on Internet Protocol (IP) are expected to see rapid growth. The move aims to simplify the user’s job of finding compliant Zigbee products by requiring component vendors to pass a more rigorous certification process.
The Alliance expects to start certification testing for Zigbee 3.0 in the fall of 2015. Compliant products will need to support the standard’s application profiles in home and building automation, LED lighting, healthcare, retail, and smart energy.
Zigbee 3.0 does not include two Zigbee specs — Smart Energy 2, a profile based on IP; and RF4CE, a version of Zigbee geared for remote controls. It does cover all specs based on Zigbee Pro, the group’s overarching standard for how networks are formed and devices attach to them across different application areas.
“Underneath the covers we are accommodating these multiple applications in a single standard, so Zigbee thermostats, for example, can be used in either home or office buildings,” says Ryan Maley, director of strategic marketing for the Zigbee Alliance.
The upgrade is a natural consequence of advances in hardware, Maley says. “When Zigbee got started, everything was based on 8-bit MCUs and separate radios, but now devices are running on 32-bit cores in SoCs with many more capabilities.”
The Alliance has already sponsored about three plugfests to test out the feature-complete but still-evolving specification.
The move comes at a time when Zigbee is under threat. Market researchers expect a surge of competing IEEE 802.15.4 products using the emerging IP-based 6LoWPAN protocols will take a significant share of Zigbee’s market over the next few years.
Silver Spring Networks has been striving to define itself not just as a smart meter vendor — an increasingly tough business to be in — but as a provider of software and services to put the smart grid’s increasing number of devices to use. Last week, it won a contract to provide consumer engagement and efficiency software to Michigan utility Consumers Energy, an early example of how it might be making that transition.
First of all, Consumers Energy picked Silver Spring’s software suite over multiple competitors, including incumbent Opower, a big name in customer engagement and energy efficiency, which had worked with the utility in the pilot version of what the utility is now taking to all of its 2.7 million customers.
Second, it’s the first big contract that’s not built on Silver Spring’s own smart meter network. Instead, Silver Spring will deploy its software over Consumers’ cellular-enabled smart meters from Itron. That in itself isn’t so unusual — many smart meter deployments use different pieces of software from different vendors, both for core network and data management functions and for the consumer engagement portals and platforms they make available to their customers.
But Silver Spring is taking that integration role a step further by promising to put its SilverLink Sensor Network to use across the Itron cellular network. SilverLink is the technology platform Silver Spring launched earlier this year with the promise of fast and agile data management across networked smart grid devices, as well as to the cloud, and this is the company’s first foray into devices that aren’t using its own hardware.