Wednesday, 30 July 2014

Predictable efficiency leveraging prefabrication

As Forrest Gump famously said, “Life is like a box of chocolates… You never know what you’re gonna get.”

One of the problems with building data centers in the traditional way (we should know, we have built more than anyone else) is that you have “designed performance” (efficiency, power and cooling capacity, density, availability, etc.), yet when you flip the “on” switch, you may not get what you expect.  Why is this?  For one, there are many different parties involved in the process: multiple manufacturers, distributors, designers, architects, contractors, electricians, HVAC installers, inspectors and so on.  Too many cooks, as they say.  Then there is assembly on site, in an uncontrolled environment without quality control, along with the field changes that pop up because the equipment delivered (IT, power, cooling) has gone through a generation change, the room measurements were wrong or have changed, or the installers had to do some “site reengineering”.  All of this is workable, but it may involve redesigns, reworks and optimization services that cost both time and money.

The performance of prefabricated data centers is like a box of chocolates with labels and pictures inside the lid – you always know what you are gonna get.  Prefabricated data centers’ performance is more predictable because the initial designs have had robust design time and effort, plus continuous updates based on actual field performance, with factory manufacturing and testing done by dedicated professionals utilising the latest automation tools and processes.  Because the solution is tested and validated in the factory (in most cases), you know what the performance will be before construction ever starts at the site.

The overall predictability benefits are obvious: 1) you can draw a reliable financial model for your data center project before you start it; 2) you can better plan your business and make decisions with more predictable information; 3) you can count on prefabricated data center designs to deliver the “designed performance”.

Guest blog by Steve Carlini, Senior Director of Data Center Marketing, Schneider Electric www.schneider-electric.com    



Tuesday, 22 July 2014

Microsoft Signs Wind Power Deal in Illinois

The latest green datacentre news out of America comes from none other than Microsoft, the blue-chip software-maker that everyone loves to hate, yet this little bit of news is one that all of us in Europe should appreciate.  The Redmond, Washington company recently announced the signing of a brand-new energy deal that will see it purchasing a considerable amount of wind power from a project currently under construction outside Chicago, Illinois.

Microsoft announced the deal through a post on the Microsoft Green Blog dated 15 July 2014.  According to the post's author, Robert Bernard, the company signed a 20-year deal to purchase as much as 675,000 MWh of renewable energy from the Pilot Hill wind project annually.  The new wind farm is currently under construction at a location approximately 60 miles from Chicago.

The Pilot Hill agreement is not Microsoft's first foray into renewable energy.  Late last year it signed a deal to purchase wind power from the Keechi project in north-central Texas.  Keechi will have a capacity of 110 MW when it is complete next year, as opposed to the 175 MW capacity at Pilot Hill.
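A quick back-of-envelope check shows the two Pilot Hill figures quoted above hang together; the capacity-factor calculation below is our own illustration, not something Microsoft or the developer has published.

```python
# Sanity check of the Pilot Hill figures quoted above:
# 175 MW nameplate capacity, up to 675,000 MWh purchased per year.
nameplate_mw = 175
purchased_mwh_per_year = 675_000

hours_per_year = 365 * 24                       # 8,760 hours
max_output_mwh = nameplate_mw * hours_per_year  # output if the farm ran flat out

capacity_factor = purchased_mwh_per_year / max_output_mwh
print(f"Implied capacity factor: {capacity_factor:.0%}")  # ~44%
```

A capacity factor in the low-to-mid 40s is plausible for a good onshore wind site, so the purchase volume is consistent with Microsoft taking essentially all of the farm's expected output.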

In addition, Microsoft's data centre in Quincy, Washington is run primarily by hydropower.  Another facility in Austin, Texas uses recycled wastewater for cooling purposes.  Moreover, in an innovative new project still being developed, the company is working on creating server racks with built-in fuel cells that could eventually enable the technology giant to generate its own energy in-house.

As for Pilot Hill, all of the energy it generates will be put into the local grid as supplemental power.  The funds collected through Microsoft's carbon tax are being used, in part, to build the wind project, completing a full circle of green energy creation and greenhouse gas emission reduction.  Microsoft hopes that signing the deal will encourage more green energy development in order to reduce dependence on fossil fuels.

Energy of the Future


Microsoft's willingness to utilise renewable energy sources to power its data centres is in keeping with other technology companies that have been doing likewise.  For example, Google has been one of America's biggest boosters of green energy initiatives in the technology sector for a number of years. It is something companies are embracing to help both their businesses and the environment.

Having said that, we find Microsoft's fuel-cell project to be even more intriguing.  Could this be the energy of the future for the world's data centres?  If the company can turn its proof-of-concept model into commercial reality, it could be the first to build a completely self-sustaining data centre with no need for power from the grid.  Transferring the technology to other data centres would require very little change in infrastructure, making it a very cost-effective transition.

The concept of having fuel cells installed side-by-side on server racks is intriguing to say the least.  Microsoft plans a demonstration of the technology later this year; we look forward to seeing what it does.



Saturday, 19 July 2014

New Website Tracks Progress of 'Right to Be Forgotten'

When the European Court of Justice ruled that Google has an obligation to honour an individual's right to be forgotten, it set in motion a process tasked with accomplishing what many think is impossible.  Google now has to remove references to online content from its European search results when requested by the individuals that content concerns.  Although it might seem like a simple task, it is anything but simple.

The search engine giant claims it has received 70,000 requests since the ruling came down this past May.  It says it is now receiving about 1,000 requests per day.  If you are interested in finding out how the search giant is doing, a brand-new website known as “Hidden From Google” is tracking Google's progress.  It is not going so well at this point.

The site currently shows only about 15 pieces of content that have been successfully removed from European search results and verified by outside sources.  It is not that the content has been removed from online publication, but rather it does not show up in Google's European search results.  Any user who wants to find the material can do so using the US version of the search engine.

From Google's perspective, the court's ruling is nearly impossible to fulfil to the letter of the law.  The automation employed by the company's search algorithm means removals cannot be carried out without direct human intervention.  Someone actually has to view the material in question and make the changes necessary to remove references.  Even so, Google has got things wrong a number of times.

Google officials say they are opposed to the ruling even though they are complying.  They maintain that their only responsibility is to handle the management of how search engines find material.  They do not believe it is their business to decide what is appropriate for listing and what is not.

Precedent-Setting Action


If nothing else, the Hidden From Google website proves that what is happening now is precedent-setting.  The world's largest search engine is tasked with being the judge, jury and executioner where search results are concerned, irrespective of website owners and the information they decide to publish.  Anyone involved in IT services and online marketing knows what an impossible task this is.

For example, the original author of a controversial blog post may want that post to be forgotten because it has damaged his or her reputation.  However, simply removing the reference from European search results does not change the fact that the blog post remains readily available online.  What's more, the website owner will continue to promote the post as long as it benefits his or her traffic.  This will undoubtedly generate more links Google will have to find and remove from its search results.  It is a never-ending battle that Google can never win.

It seems as though the European Court of Justice did not think this through before issuing its ruling.  If the right to be forgotten is one we truly need to preserve, perhaps it should be implemented at the site level rather than through search engine results.


Monday, 14 July 2014

Renewable Energy Output to Expand Substantially Over Next 10 Years

Proponents of renewable energy are thrilled by a new report from Bloomberg New Energy Finance (BNEF) suggesting that renewable energy output will expand substantially over the next 10 to 15 years.  BNEF projects that as much as 60% of the expected US $7.7 trillion investment in new power plants through to 2030 will go to plants utilising renewable sources.  The report says renewables are poised to become significantly more competitive with fossil fuels thanks to more efficient technology and lower production prices.

If the Bloomberg publication is correct, it could mean as much as US $5 trillion invested in worldwide renewables over the next 15 years.  Furthermore, if new power plant production creates the expected 1,100 GW of additional capacity over that period, renewable energy power plants should have a combined capacity of 3,000 GW.
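The "US $5 trillion" figure follows directly from the report's headline numbers; the arithmetic sketch below is our own, not BNEF's.

```python
# Deriving the renewables investment figure from the BNEF numbers quoted above.
total_investment_tn = 7.7   # projected new power-plant investment to 2030, US$ trillion
renewable_share = 0.60      # share BNEF expects to flow to renewable plants

renewable_investment_tn = total_investment_tn * renewable_share
print(f"Projected renewables investment: ~US ${renewable_investment_tn:.2f} trillion")
# ~US $4.62 trillion, which the post reasonably rounds to "as much as US $5 trillion"
```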

BNEF is convinced that renewable energy projects will outpace new fossil fuel plants by 7-to-1 by the year 2030.  Even though coal capacity currently stands at 64%, as opposed to 34% for renewables, the report projects a day when there will be greater balance between the two.  It turns out that levelling the playing field ultimately comes down to production costs.

Making Renewables Cheaper


To date, one of the biggest obstacles to large-scale renewable energy use has been the cost.  It has been extremely expensive to design and build new projects in relation to how much energy they produce.  Nevertheless, as time has gone by, the costs associated with building new plants operating on renewables have dropped.  BNEF sees no reason why that trend will not continue.

According to a study by the US Energy Department, the cost of producing renewable electricity has fallen some 99% since the late 1970s.  That has certainly made the transition from fossil fuels to renewables somewhat easier.  However, here's the catch: worldwide renewable energy efforts are supported by government subsidies.  Renewables are still not strong enough to stand on their own in the commercial environment.  Until they can, they will not be able to compete with fossil fuels at the same level.

So where does that leave us?  It leaves us in a position of having to continue to develop renewable energy sources and delivery systems while simultaneously continuing to build new fossil fuel power production, including clean coal.  We cannot simply allow fossil fuels to languish in the hope that renewables can adequately replace them.  Co-existing is not a solution everyone likes, but it is the only reality that makes sense.

There may come a day when renewables are capable of replacing fossil fuels 100%; however, that day is far from arriving.  Until then, proponents of both types of energy production will have to continue working together to supply the power needs of the global community.  The most important thing is to continue providing cheap, reliable energy that is capable of carrying us into the future – a future that is more dependent than ever on energy to drive modern technology.



Friday, 11 July 2014

IBM Launches Green Horizon Initiative in China

Technology giant IBM has announced plans for a brand-new initiative that will see the company concentrate the vast majority of its global research capabilities in an effort to help China meet its energy and air pollution goals.  The initiative, dubbed 'Green Horizon', is a 10-year programme to transform China's energy landscape and, hopefully, improve national health at the same time.

IBM will work with the Chinese government on three primary areas:

  • reducing environmental air pollution
  • increasing energy efficiency
  • developing renewable energy sources

The key to the initiative is to develop each of the three areas alongside China's growing economy.  Integrating an energy and air pollution policy with an economic policy will enable both IBM and the Chinese government to take full advantage of a number of resources simultaneously.  This will enable them to get more done with the available funds.

Environmental Air Pollution


The robust expansion of the Chinese economy over the last several decades has not come without a cost.  China's manufacturing sector is among the heaviest polluters in the world thanks to outdated technology and poor air quality control. This is the first area that the initiative will tackle.

Beijing's municipal government, one of the first partners to get on board with the Green Horizon initiative, plans to spend upwards of US $160 billion to meet air quality improvement goals in the city by 2017. IBM's contribution involves using cloud computing, data analytics and air quality monitoring to provide the information scientists need to improve air quality.  IBM's supercomputing capabilities make them the ideal candidate for this task.

Energy Efficiency


China's manufacturing sector accounts for nearly 70% of the nation's energy consumption.  IBM will work with government and private industry partners to develop management programmes for commercial enterprise that will increase energy efficiency across the board. IBM will use the knowledge it has gleaned from its own energy efficiency efforts to help Chinese industries do better.

Renewable Energy Sources


Chinese officials realise that continued economic growth in the 21st century will require greater dependence on renewable energy sources rather than continuing to rely solely on fossil fuels.  IBM will assist through a variety of analytical tools built around weather monitoring and forecast modelling.  The data these provide will help the Chinese design and build renewable energy projects that make the best use of sun, wind and water.

IBM's ambitious plan, if successful, will completely transform the way China does business.  We expect IBM's work to also lead to the development of brand-new technologies we are as yet unaware of.  Any success enjoyed in China will translate into better energy use and reduced air pollution across the globe, even as the world's economies continue to expand and grow together.

The only possible snag in the IBM plan is government interference. China is not known as a bastion of freedom, so it will be interesting to see how much leeway IBM is afforded.  We hope interference from Beijing is minimal.



Tuesday, 8 July 2014

Climate Change Agreement Implemented for Data Centre Industry

After four years of negotiations between industry representatives and the government, the UK's data centres now have an official climate change agreement (CCA) active and in place.  In addition to helping the sector meet its energy conservation and emission reduction goals, implementation of the CCA is official recognition of the importance of the data centre industry to the UK economy.

The UK has been seen for quite a while as a leader in colocation, networking and information technology in general.  Nevertheless, until now, the government has failed to officially recognise how important the sector is to the overall economic survival and viability of Britain.  The implementation of the new CCA changes that.  The government now sees that the UK economy can only continue leading the way in Europe if the data centre sector remains strong.

According to a report in Computer Weekly, there are now some 217 data centres around the UK.  Among them, 67 are located in the London area.  The data centres combine to offer more than 7.6 million square metres of data centre space covering both commercial and public needs.  The implementation of the new CCA should help to boost that space in the coming years by making companies involved in the sector more competitive.

Climate change agreements are government-industry agreements that establish clear benchmarks for reducing energy consumption and greenhouse gas emissions.  When companies meet their targets they are eligible for a reduction of, or full exclusion from, carbon taxes.  Right now, more than 50 energy-intensive industries are part of the CCA scheme.

A Logical Conclusion


The amount of energy necessary to run the average data centre is such that having more than 200 facilities in the UK automatically qualifies the sector as energy intensive.  It was only a matter of time before the government recognised the need for a CCA for colocation providers.

As for the new agreement, it requires data centres to reduce their non-IT energy consumption by 30%.  The use of certain energy efficiency strategies will result in significant cost savings which, when combined with reduced carbon taxes, make the UK data centre sector more competitive with the rest of the European market.  Moreover, a more competitive sector will ultimately result in a better business environment in the UK.
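To put the 30% target in perspective, here is a hedged illustration of its effect on PUE (power usage effectiveness, the ratio of total facility energy to IT energy).  The load figures below are our own hypothetical numbers, not figures from the agreement.

```python
# Hypothetical facility: what a 30% cut in non-IT energy does to PUE.
it_energy_mwh = 10_000      # assumed annual IT load
non_it_energy_mwh = 6_000   # assumed cooling, lighting, power losses (PUE 1.6)

pue_before = (it_energy_mwh + non_it_energy_mwh) / it_energy_mwh
pue_after = (it_energy_mwh + non_it_energy_mwh * 0.7) / it_energy_mwh  # 30% non-IT cut

print(f"PUE before: {pue_before:.2f}, after: {pue_after:.2f}")  # 1.60 -> 1.42
```

The IT load itself is untouched; only the overhead shrinks, which is exactly the kind of efficiency gain the CCA rewards with reduced carbon taxes.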

The CCA will also influence future data centres yet to be built.  Companies will have to take the standards set by the agreement and incorporate them into any new designed and built facility.  These new facilities will be more energy-efficient right from the start and, therefore, more competitive.

As you would expect, the news of the CCA has been well received by industry representatives and executives.  The general opinion is that the agreement has been a long time in coming and one that was necessary to keep the data centre industry strong in Britain.  Now we will have to see if the industry responds in the way everyone is hoping and expecting it will.



Thursday, 3 July 2014

Norwegian University Using Data Centre Heat to Keep Students Warm

When you are operating a high-tech university within the confines of the Arctic Circle, space heat is at a premium year-round.  Such is the case with Norway's Arctic University in Tromso. The town sits at a latitude of 70°, making it the perfect spot for the types of research they are doing up there.  To keep everyone warm, the university uses excess heat from its HPC (high-performance computing) data centre in combination with other heat sources.

Norway is already a country that manages to generate nearly all of its power by way of renewable sources, so what is happening at Arctic University is really no surprise.  It is also no surprise that the university plans to improve its heating capacity as it moves HPC operations to a new 2MW data centre in the near future.

The current data centre's cooling system is a combination of air and warm water cooling.  Air-cooling uses cold air to draw heat away from servers where it can be harnessed and used elsewhere.  Warm water cooling uses a dual loop system and heat exchanger to draw heat directly away from central processing units.  It is the warm water cooling system that harnesses most of the excess heat currently being used to provide warmth for campus buildings.

According to HPC team leader Roy Dragseth, the new data centre will be cooled entirely by the warm water systems.  These were chosen because they are more efficient than air-cooled systems and 1,000 times more capable of capturing excess heat for use in other applications.  With a new data centre that is 100% water-cooled, the university will be able to further reduce its dependence on other energy sources for heat.
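To get a feel for the scale involved, here is a back-of-envelope sketch of the water flow needed to carry that heat away.  Everything except the 2 MW load mentioned above is our own assumption, including the loop temperatures.

```python
# Q = m_dot * cp * dT  ->  m_dot = Q / (cp * dT)
heat_load_kw = 2_000    # the new data centre's 2 MW load, from the post
cp_water = 4.186        # kJ/(kg*K), specific heat capacity of water
delta_t = 15.0          # assumed loop temperature rise, e.g. 30 C in, 45 C out

flow_kg_s = heat_load_kw / (cp_water * delta_t)
print(f"Required water flow: ~{flow_kg_s:.0f} kg/s")  # ~32 kg/s
```

Roughly 32 litres of water per second, warmed by 15 degrees, is what it takes to move 2 MW of heat – a steady, usable stream for campus heating rather than waste air vented outdoors.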

No Small Feat

The university's new data centre will undoubtedly be the world's greenest, in terms of power and cooling, when it is finally up and running; however, this is no small feat given the harsh environment in which it is located.  Thankfully, the high-performance computing at the core of the data centre's operations translates into continuously heavy loads 365 days a year.  According to Dragseth, there is no downtime.

The data centre is used primarily to accommodate research facilities that are constantly collecting data from satellites in orbit.  The collected data is continuously accessed by those research facilities for download.  At the same time, researchers are uploading new data to satellites by way of the university data centre.  The constant stream of information keeps the facility near its peak of more than 250 Tflops per rack at all times.  That is a lot of computing power generating a lot of excess heat.

When the creators of the warm water cooling system were first designing their architecture, perhaps they never considered that the system might one day provide enough heat for campus buildings in the Arctic Circle.  Nevertheless, indeed it does, and Arctic University is more than happy to be one of the first to put it to use.  That is the way they do it in Norway.

Source:  Computer Weekly – http://www.computerweekly.com/news/2240223707/Arctic-University-warms-classrooms-with-waste-datacentre-heat

Tuesday, 1 July 2014

Hackers Hit European Bank and Steal €500,000

A group of sophisticated hackers using a mysterious piece of malware known as the 'Luuuk Trojan' hit a well-known European bank this past winter, getting away with more than €500,000.  The unidentified bank is believed to be an Italian institution, as most of the victims were Italian or Turkish.  Kaspersky discovered the seven-day attack on January 20.

The WHIR reports that Kaspersky first identified the raid through log files that showed what appeared to be bots reporting to a central command centre.  Company officials say the offending software was removed from the suspicious server on January 22, yet the attack did not immediately cease.  Kaspersky investigators believe the hackers simply moved to a new, undetected infrastructure.

The Luuuk Trojan is a piece of malware with unknown origins.  Experts are not sure whether it is a standalone programme built from the ground up or some sort of variation of another known Trojan.  One theory suggests Luuuk is a variation of the popular Zeus malware.

Zeus has a number of ways of compromising protected information, but the most common use of the software is one of man-in-the-browser keystroke recording.  Once the software is implanted on a Windows computer, it records every keystroke and reports that data back to a central station.  The data can then be culled for sensitive banking information.

Trojans like Zeus and Luuuk are traditionally spread by way of phishing campaigns and drive-by downloads.  In the January raid, approximately 190 victims were affected.  The software was implanted on their computers without their knowledge, recording each and every keystroke for the purposes of obtaining sensitive information.  The WHIR says that victims lost anywhere from €1,700 to €39,000 apiece.

Consumer Education Needed


We talk an awful lot about the need for better security at the data centre and IT services levels and, while that's true, this story demonstrates that more needs to be done at the consumer level as well.  The fact that consumers continue to be fooled by malicious software and phishing schemes is evidence that we are still far behind in the area of training consumers.  We can go a long way toward thwarting these types of attacks just by educating computer users.

Organisations like Kaspersky can take steps to end attacks from Trojans like Zeus and Luuuk, but not until they are well under way.  This suggests their power to eliminate such attacks is limited.  Once again, it goes back to training at the consumer level.  Individual computer users need to be trained in the architecture of Trojan horses and how these are surreptitiously passed along through attractive e-mail messages, fake websites and the like.

As for the consumers affected by the January attack, we hope some sort of financial protection was in place from the bank in question.  We hope that they got most, if not all, of their money back.  Undoubtedly, each of the victims will be more careful about their computer use in the future.