

Fresh news on smart grid and green technologies

Intel takes ‘significant’ stake in Big Data startup Cloudera

Intel Corp said on Thursday it made a significant investment in Cloudera and will make the fast-growing startup its preferred distributor of software for crunching Big Data.

The top chipmaker said its equity investment in Cloudera makes it the company’s single largest strategic shareholder and it comes as Intel looks to expand its server business to make up for falling sales of personal computers.

Intel said the stake in Cloudera is its largest data center technology investment ever, although it did not disclose the size of the deal.

Cloudera, like rivals HortonWorks and Pivotal, focuses on helping corporate customers manage data through “Hadoop,” an open-source software system that can sort and handle the massive amounts of information, increasingly called Big Data, generated through the Internet and mobile devices.
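Hadoop's core programming model is MapReduce: a map step emits key/value pairs from raw records, and a reduce step aggregates all values sharing a key. A minimal in-memory sketch of that pattern in Python (illustrating the model only, not the Hadoop API itself):

```python
from collections import defaultdict

def map_phase(records):
    """Map step: emit (key, value) pairs -- here, one (word, 1) per word."""
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Reduce step: sum all values that share a key."""
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

# Toy "Big Data": in real Hadoop, records are sharded across many nodes
# and the map/reduce phases run in parallel on each shard.
logs = ["big data big insight", "data at scale"]
counts = reduce_phase(map_phase(logs))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

The point of the split is that the map phase can run independently on each chunk of input, which is what lets Hadoop distribute work across a cluster.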

Intel said it will promote Cloudera’s Hadoop platform and transition away from its own customized version that it had been promoting as optimal for Intel server chips.

Cloudera will now engineer its Hadoop offerings to work best with Intel’s server chip technology, which enjoys around 94 percent market share in data centers.

Intel will have a seat on the board of directors of Cloudera, which is widely expected to go public this year.

Shares of Intel were unchanged at $25.38.

More here.

Abstracting the Data Center: A look at the DCOS Platform

It’s time to take a step back and look at the data center model that’s impacting today’s business. It’s time to see just how far this platform has come and exactly where it’s going. It’s time to say hello to the truly agnostic data center. Almost every new technology is being pushed through some type of data center model.

Inside your current data center model, what do you have under the hood?

  • Storage, Networking, Compute
  • Power, Cooling, Environmental Controls
  • Rack and Cable Management
  • Building and Infrastructure Security

Although some of these underlying components have stayed the same, the requirements from the workloads that live on top have drastically evolved. Through it all, we’ve also seen an evolution of the physical aspect of the data center. We’re creating powerful multi-tenant, high-density platforms capable of handling users and the new data-on-demand generation. With all of these new technologies and demands, the modern data center has truly become a distributed node infrastructure.

So here is the real challenge: how do you control and manage it all? How do you control practically every aspect that is critical to data center functionality? Most of all, how do you do it on a distributed plane?

Data Center Abstraction

Data center abstraction is an emerging field where all physical and logical components are abstracted and inserted into a powerful management layer. This new model is sometimes referred to as the software-defined data center. However, today we’re focusing on the management layer. The data center of the future will be truly agnostic where all resources become presented to a powerful management layer, which can then be controlled anywhere and anytime. This is the data center operating system.
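The idea of a single management layer over every abstracted resource can be sketched as a registry that all components report into and are controlled through. This is a hypothetical illustration of the concept, not any vendor's actual API:

```python
class DataCenterOS:
    """Hypothetical DCOS management layer: physical and logical
    resources register here and are queried/controlled in one place."""

    def __init__(self):
        self.resources = {}

    def register(self, name, kind, status="online"):
        self.resources[name] = {"kind": kind, "status": status}

    def by_kind(self, kind):
        """List every registered resource of one kind (compute, cooling...)."""
        return [n for n, r in self.resources.items() if r["kind"] == kind]

    def set_status(self, name, status):
        self.resources[name]["status"] = status

dcos = DataCenterOS()
dcos.register("rack1-chiller", "cooling")   # "chip to chiller": facilities...
dcos.register("rack1-node01", "compute")    # ...and IT gear in the same layer
dcos.register("san-a", "storage")
dcos.set_status("rack1-chiller", "maintenance")
print(dcos.by_kind("compute"))  # ['rack1-node01']
```

The value is that cooling, compute, and storage live behind one interface, so a policy or an operator can act on any of them "anywhere and anytime" without caring about the underlying vendor.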

One example of this data center operating model is provided by IO and its IO.OS environment, which helps control many of the absolutely critical components, from chip to chiller. The great part is that this DCOS layer has visibility into every critical aspect that a data center has to present.

This conversation takes us far beyond standard DCIM. We’re now looking at an open data center and open cloud architecture. So what makes up a solid data center operating system? What has IO.OS done to really help organizations regain control of their global data center footprint? Let’s examine what it takes to create a powerful DCOS framework.

More here.

What lies beneath: Sensor analytics in the water system

The world’s growing water shortage has inspired a wide variety of solutions ranging from improving agricultural practices to reprocessing waste water. One often overlooked approach is to stem the enormous volumes of clean water that are lost in leaking water distribution systems.

“Water losses are becoming a huge problem,” says Andrew Whittle, the MIT Edmund K. Turner Professor of Civil and Environmental Engineering. “Cities in the developed world typically lose anywhere from 10 to 30 percent of water supplied through the underground pipe networks due to leaks and bursts, primarily caused by antiquated infrastructure.”

Cities are not only wasting water, but they’re also wasting the energy and money spent purifying the water in the first place. While the leaks are bad enough, bursts have additional side effects, releasing massive amounts of water that can cause considerable damage. “A few years ago in Boston, we had a water burst in which the water got into the gas mains and affected heating systems,” Whittle says. “It caused no end of problems.”

Replacing infrastructure is an obvious solution, but it’s expensive and “massively disruptive,” Whittle says. The more common solution is to find and repair the leaks. Utilities typically send crews of technicians around the city with acoustic detection devices to locate leaks hidden underground. This is costly and inefficient, however, and is further limited by the fact that the work must be performed at night when it’s quieter.

Now, a research group that emerged from the Singapore-MIT Alliance for Research and Technology (SMART) called the Center for Environmental Sensing and Modeling (CENSAM) is trying a different approach: monitoring water systems with a network of wireless sensors. Whittle has joined CENSAM as a geotechnical engineering advisor, helping the team work with the Public Utilities Board of Singapore to set up a sensor network in Singapore called WaterWiSe.
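A sensor network like WaterWiSe looks for leaks and bursts as anomalies in pressure or acoustic readings. One common, simple approach (a sketch, not the CENSAM team's actual algorithm) is to flag any reading that deviates sharply from a trailing window:

```python
import statistics

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag indices where a reading deviates from the trailing
    window's mean by more than `threshold` standard deviations."""
    flags = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mean = statistics.mean(hist)
        stdev = statistics.pstdev(hist) or 1e-9  # avoid divide-by-zero
        if abs(readings[i] - mean) / stdev > threshold:
            flags.append(i)
    return flags

# Steady pipe pressure (kPa) with a sudden drop at index 8 -- a burst signature.
pressure = [402, 401, 403, 402, 401, 402, 403, 402, 310, 402]
print(flag_anomalies(pressure))  # [8]
```

Real deployments add cross-correlation between neighboring sensors to localize the leak along the pipe, but the per-sensor trigger is often this simple.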

More here.

Can Bidgely or PlotWatt Compete With Opower in Home Energy Engagement?

The home energy management market has made plenty of promises, but has yet to broadly deliver deep insight and significant reductions in home energy usage.

Microsoft tried and failed with the Hohm energy platform. Google.org tried with the PowerMeter home energy suite — only to later abandon the effort and acquire the Nest thermostat brand instead. Hardware-centric energy management firms such as Tendril, EnergyHub and Onzo have had to make drastic pivots in product and channel strategy in order to stay in the hunt for the customer. Finding a profitable business plan that navigates utility spend cycles, regulatory shifts and energy customer behavior has proven challenging.

Nest has moved thermostat technology into the year 2008, while IPO-bound Opower saves residential utility customers an average 1.5 percent to 2.5 percent on their electric bill.

But “the disaggregation train is coming,” according to Bidgely CEO Abhay Gupta. Disaggregation could save the customer real money and significant amounts of energy, change the relationship of the utility to its customer, and ignite the home energy management market.

Appliance-level energy data from Bidgely

Bidgely’s CEO claims that its “deep pattern recognition” software can deliver appliance-level energy consumption data — using nothing but the signal of a home’s smart meter.

We met with Bidgely CEO Abhay Gupta earlier this month, and he spoke of moving energy feedback from “a motivational level like Opower to an empowerment level.”

Provided with Green Button or AMI data at hourly intervals, Bidgely can single out and detect heating, cooling and pump loads, as well as rooftop solar. Home area network data from the smart meter can provide enough information to identify additional appliances such as water heaters, stoves, dryers, hot tubs and EVs.
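The core of disaggregation is matching step changes in the whole-home load against known appliance signatures. This toy sketch (the signatures and tolerance are illustrative assumptions, not Bidgely's model) shows the idea:

```python
# Hypothetical appliance step signatures in watts (illustrative only).
SIGNATURES = {"pool pump": 1500, "water heater": 4500, "EV charger": 7200}

def detect_events(load, tolerance=200):
    """Match step changes in the whole-home load signal to the
    nearest known appliance signature (a toy disaggregation sketch)."""
    events = []
    for i in range(1, len(load)):
        step = load[i] - load[i - 1]
        for name, watts in SIGNATURES.items():
            if abs(abs(step) - watts) <= tolerance:
                events.append((i, name, "on" if step > 0 else "off"))
    return events

# Whole-home meter samples (W): a pool pump turns on at t=2 and off at t=5.
meter = [300, 310, 1820, 1815, 1810, 320]
print(detect_events(meter))
# [(2, 'pool pump', 'on'), (5, 'pool pump', 'off')]
```

Production systems layer pattern recognition on top of this (duty cycles, time of day, harmonics from HAN data) to tell apart appliances with similar wattages, which is where the "deep pattern recognition" claim comes in.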

The CEO said that “utilities are pulling us into new uses.” He gave an example of a utility pool pump program where Bidgely performs measurement and verification of the devices. He also suggested that the utility could pinpoint the homes in a territory that had inefficient pool pumps.

More here.

The Wireless, Networked LED Goes Standard

Smart, networked lighting may be the next big market for building energy efficiency and automation technology. But to date, that market has been a battleground of proprietary technologies: different networks linking up different lighting systems, in a landscape that leaves little room for future mix-and-match interoperability.

That’s what makes the new partnership between lighting networking startup Daintree Networks and LED maker LG Electronics USA a noteworthy one. On Tuesday, the two companies announced they’ve teamed up to sell LG’s LED lighting fixtures that run on Daintree’s smart lighting control platform. It’s not the first such combination of networking smarts embedded in LED lights. But it’s one of the first that comes in a standards-based flavor.

“We’re not actually embedding any proprietary technology in our product to make this happen,” Sean Lafferty, director of LED lighting for LG Electronics USA, said in an interview. Instead, “we’re providing a ZigBee wireless controller embedded in our products,” which is the wireless protocol that Daintree uses to network LED drivers and lighting ballasts, remote-control thermostats and other indoor smart devices.

In other words, “we’re embracing a model where we’ll have interoperability,” he said. That’s not ubiquitous interoperability, of course — ZigBee is one of several low-power wireless mesh networking standards vying for dominance in the commercial building space. But in the almost completely proprietary smart lighting landscape, it’s an outlier.

Consider the recent spate of announcements from big lighting vendors like Cree and Philips, both of which are promoting their own proprietary wireless mesh-embedded LED control platforms. Likewise, the majority of lighting network technology startups out there, such as Digital Lumens, Enlighted, Redwood Systems (bought by CommScope) and Adura (bought by Acuity), are offering technologies that, while sometimes based on standard wireless technologies, tend to be tweaked in one way or another that renders them non-interoperable.

More here.

Landis+Gyr awarded information security ISO 27001 in Europe

Landis+Gyr sites acquire prestigious certification for demonstrated excellence in information security management

Zug, Switzerland, March 27th, 2014 – Landis+Gyr has received ISO 27001 certification for its production facility in Corinth, Greece and for its three sites in the UK. Certification means that Landis+Gyr offers the highest levels of information security in developing and manufacturing its smart metering products and solutions, according to recognised, global standards.

As today’s markets demand the highest levels of information security and data protection for smart metering products and software, ISO 27001 certification is a crucial milestone in establishing standards in the area of information security management. ISO 27001 is part of the ISO/IEC 27000 family of standards. It is an information security management system (ISMS) standard, which specifies the requirements for establishing, maintaining and improving information security against the backdrop of an organization’s overall business risks. The standard can be applied to production and non-production sites.

Richard Mora, Landis+Gyr Chief Operating Officer comments, “ISO 27001 represents the latest best-practice thinking in information security. This milestone achievement means that we can continue to offer utilities the highest levels of data security from product and solution conception through completion – only now we have gone one step further by achieving the leading global benchmark in information security.”

To complete the ISO 27001 certification process, Landis+Gyr was required to successfully implement certain key criteria, including ongoing risk assessments, and the active prevention and detection of information security incidents.

Thomas Mosel, Head of Information Security at Landis+Gyr in Europe, says, “We have introduced heightened physical security across all certified locations, raised security awareness among Landis+Gyr employees regarding their roles and responsibilities and provide reporting guidelines in the event of a security incident at all sites. We have also added security requirements to contracts with external parties.”

The company’s production facility in Montluçon in France recently started the ISO 27001 certification process, with completion scheduled for mid-2015.

About Landis+Gyr

Landis+Gyr is the leading global provider of integrated energy management products tailored to energy company needs and unique in its ability to deliver true end-to-end advanced metering solutions. Today, the Company offers the broadest portfolio of products and services in the electricity metering industry, and is paving the way for the next generation of smart grid. With annualized sales of more than US$1.6 billion, Landis+Gyr, an independent growth platform of the Toshiba Corporation (TKY:6502) and 40% owned by the Innovation Network Corporation of Japan, operates in 30 countries across five continents, and employs 5,200 people with the sole mission of helping the world manage energy better. More information is available at landisgyr.com.


John Harris, Vice President and Head Governmental Affairs and Public Relations

Tel. +41 41 935 6439

How will the Software Defined Datacentre help modern-day IT teams?

IT is an integral part of businesses of any scale and any type today, because it is widely believed that IT improves productivity, ensures business continuity by bringing a degree of automation to routine tasks, enhances business processes and helps businesses grow. The core focus of IT is to cater to its end-users, help them adapt easily to the changes brought into their IT, and look out for new ways of improving the IT infrastructure to realise organisational goals.

Today, IT is changing rapidly. IT teams face an ever-increasing number of complexities to manage, and the issues of concern are growing faster than ever before. Businesses and enterprises rely heavily on IT and the data it handles. Whether IT is used for internal purposes, such as running CRM or certain legacy applications, or for external ones, such as an online bill payment portal that involves the customer directly, uninterrupted service delivery is the top priority for the IT administrator.

The role of the datacentre

Datacentres have an irreplaceable role in today’s IT scene, no matter the size of the organisation. Organisations leverage small in-house datacentres, small and medium-sized datacentre services offered by various providers, or public clouds such as Google or Amazon, depending on their requirements and budgets. However, the basic problems of managing IT are similar across all three. The primary concern for an IT team, whether at the datacentre or on the client side, is uninterrupted, 24-hour service delivery. That’s a golden rule, considering that dependency on the datacentre is so high and downtime can be an expensive problem.

With regard to providing IT services via a datacentre, the top three challenges faced by IT teams are visibility, scalability and keeping the end-user experience unaffected.

Gaining end-to-end visibility into all aspects of the datacentre is a major challenge for most datacentres. This can be achieved to a certain degree using the different tools available in the market. However, narrowing down and pinpointing the exact factors that affect service delivery is quite challenging. For example, a monitoring tool could raise an alarm when a critical server is down, but it may not have the intelligence to map the services affected by that server being down. Yet it is the data on which services are affected that matters more than knowing which device is down, because the end user cares about the service, not the on/off status of a device.
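The missing intelligence is essentially a dependency map from devices to the services built on them. A minimal sketch (the service and device names are hypothetical):

```python
# Hypothetical service-to-device dependency map.
DEPENDS_ON = {
    "billing-portal": ["db-server-1", "web-server-2"],
    "crm": ["db-server-1", "app-server-3"],
    "intranet": ["web-server-2"],
}

def impacted_services(down_device):
    """Translate a device outage into the list of affected services --
    the mapping a plain up/down alarm lacks."""
    return sorted(s for s, devs in DEPENDS_ON.items() if down_device in devs)

print(impacted_services("db-server-1"))  # ['billing-portal', 'crm']
```

With this mapping in place, the alarm "db-server-1 is down" becomes the actionable message "billing-portal and crm are degraded," which is what the end user (and the IT team) actually needs to know.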

End-user experience is one side of the problem. The other side is making sure that, in the process of improving the end-user experience, the hardware/software footprint is not widened. Take an instance where an IT team has to roll out a new service in the organisation. This can be an outlandish experience for many teams across IT, such as the network team, the server team, the apps team and the security team. There are several strings that need to be pulled from several ends, and the whole exercise gets increasingly complex depending on the nature of the service and the number of users it has to reach. All of this has to happen without the end-users knowing of it or any of the existing services getting affected. What are the chances?

More here.

Teradata and SAS Solution Accelerates Big Data Analytics

Teradata (TDC) has been selected by UniCredit Group and NTT Docomo for big data solutions. At the SAS Global Forum executive conference next week, Rick Andrews from the Center for Medicare and Medicaid Services (CMS) will discuss how the Teradata platform summarizes billions of rows of data in seconds, and how that platform enables the SAS system to immediately analyze data with incredible performance.

Combined Teradata and SAS solution aids UniCredit Group

Italian financial company UniCredit Group worked with SAS and Teradata to run advanced analytics on processes directly connected to the Teradata Data Warehouse. UniCredit was the first organization to adopt and deploy the Teradata Appliance for SAS, Model 720, running SAS Visual Analytics to fully integrate data and analytics in a streamlined business process. The SAS and Teradata combination enables UniCredit to explore and identify trends and patterns, helping the organization seize emerging opportunities, manage risks and make the right choices. The solution gives UniCredit the best answers to complex business questions in near-real time, providing new insights to analysts – and gaining productivity as well. Teradata BYNET interconnect adds analytic servers to provide rich new tools and algorithms to the Teradata analytic environment. The BYNET enables high-speed, fault-tolerant, warehouse-optimized messaging between nodes.

“Our customers asked for appliance-ready offerings with quick time to value and low total cost of ownership. This is exactly what UniCredit is reporting,” said Rick Lower, CA-AM, Teradata Alliance Director. “As our many shared customers realize increasing value to their organizations, we are confident that we represent the most formidable analytics alliance in the world.”

More here.

TI increases range up to 7x for low-power 2.4 GHz wireless networks

Texas Instruments (TI) (NASDAQ: TXN) today introduced the SimpleLink™ CC2592 range extender that provides up to a 7x improvement in range when paired with TI’s 2.4 GHz low-power RF solutions for ZigBee®, 802.15.4, 6LoWPAN and Bluetooth® low energy networks. The first launched pairing is the CC2592 with the SimpleLink ZigBee CC2538 wireless microcontroller (MCU) to speed time to market of ZigBee-enabled equipment. The combined solution benefits from -101 dBm sensitivity and +20 dBm output power, which raises the link budget by 17 dB and equates to up to a 7x range improvement. To order a CC2538-CC2592 evaluation kit, please visit the TI eStore.
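The 7x figure is consistent with simple free-space path-loss arithmetic: under inverse-square propagation, received power falls 20 dB per decade of distance, so the range multiplier from extra link budget is 10^(dB/20). A quick check of the quoted numbers:

```python
def range_gain(extra_db):
    """Range multiplier from extra link budget, assuming ideal
    free-space (inverse-square) path loss: 10 ** (dB / 20)."""
    return 10 ** (extra_db / 20)

# -101 dBm sensitivity plus +20 dBm output gives a 121 dB link budget;
# the quoted 17 dB improvement maps to roughly a 7x range gain.
print(round(range_gain(17), 1))  # 7.1
```

Real 2.4 GHz deployments see steeper-than-free-space loss indoors, so the realized gain varies, but the scaling explains where "up to 7x" comes from.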

As a low cost, industrial temperature solution, the CC2592 offers extended range for a variety of applications including gateways, electric and gas meters, home and building automation, lighting, safety and security systems. Additionally, the SimpleLink CC2592 range extender is a highly integrated solution with small external BOM, which reduces cost, simplifies layout and allows for smaller end-equipment form factors.

More here.

7 Tips To Successfully Select an Energy Management Solution

In today’s highly competitive business environments, finding ways of reducing costs is on everyone’s agenda. Energy Management has been proven to reduce energy costs for many different businesses and rising energy costs make the business case even more attractive today.

With a whole host of different energy management solutions on the market, how can you choose the one that best meets your business needs?

The following 7 tips are meant as a guide to help prospective users of energy management solutions select the technology that best matches their requirements.

You should understand the energy flows that you wish to monitor so that you can decide which type of solution to deploy. For example:

▶ Do you want to measure and record data on electricity consumption only or do you also want to monitor thermal fuel use such as gas or oil consumption?
▶ Do you want specific areas and circuits to be monitored?
▶ Do you want the solution to integrate existing monitoring systems?

You should ask the supplier what type of data will be available from the solution. Some solutions will provide only quarter-hourly energy data for your building or facility on a site-wide basis, as this is relatively straightforward to collect from energy utility meters. Other solutions offer far better granularity when it comes to electricity use, providing real-time data not only site-wide but also by area or by circuit/appliance. This type of information offers far better scope for identifying saving opportunities.

The energy solution should be able to offer much more than just the display of energy data on a daily, weekly, monthly or yearly basis. Is the solution cloud-based and accessible from any device? Are customer support and energy consultancy services provided? Can the solution automatically identify significant energy consumers and discover energy saving opportunities for you?

It is important to understand who will be using the solution. Are the users looking for a complete Building Energy Management System that offers control of equipment and appliances? These solutions can often be over-specified, very expensive and sometimes too complex for many people to use. A solution that is intuitive to use, lets you monitor energy use in the areas you want, and facilitates reporting & analysis may be a better option for your business.

More here.
