Monday, 29 September 2014

Predicting the Future of Densities

Nostradamus couldn’t predict the future and neither can we – which is why the data centre industry has never predicted future rack densities correctly

Just recently, yet another report was published predicting that within the next few years data centres will be running at 30, 40 or 50 kW per rack.  I remember a conversation a few years ago with someone (who shall remain nameless to protect the innocent) who insisted that we were only a few years away from 250 kW in a single rack.  I asked him if he had ever considered what the power distribution system for that solution would look like… of course he hadn't.  The practical reality of the space and size of the supporting power distribution system alone (and we didn't even discuss the cooling complexity) would overwhelm the rack.  So I asked him: why would you bother?  I never got an answer…

And he is not alone in these predictions.  For years, industry analysts – yes, the experts in predicting the future – have produced these same reports.  And they have always been wrong.  All the evidence I see today, and my own experience, suggests that data centres are running in the 3-5 kW per rack range, up from 1-2 kW ten years ago.  We have clearly not seen the increase that all the prognosticators predicted.

So I asked myself: what is the correct density per rack?  What are the trade-offs, and is there some reason the industry has stayed at relatively low rack densities?


Our findings are fairly straightforward:

1) When you analyse the cost of power and cooling infrastructure, there are significant economic advantages to achieving 5 kW per rack on average.  There continue to be some savings up to 15 kW per rack, and effectively none beyond that level.

2) Most of the prognostications look at server power consumption numbers and don't account for networking and storage racks, which have a different load profile.  When you count these, the average comes down.

3) Most people overestimate the actual power consumption of servers because they rely on nameplate ratings rather than measured draw; see the rough sketch below.
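
To make point 3 concrete, here is a minimal back-of-envelope sketch in Python.  Every figure in it (servers per rack, nameplate rating, measured draw) is an illustrative assumption rather than data from any particular facility; the point is simply how far apart a nameplate-based estimate and a measured estimate can land.

    # Back-of-envelope sketch using illustrative, assumed figures only.
    servers_per_rack = 20        # assumed rack population
    nameplate_watts = 750        # assumed PSU nameplate rating per server
    measured_watts = 350         # assumed typical measured draw per server

    nameplate_kw = servers_per_rack * nameplate_watts / 1000
    measured_kw = servers_per_rack * measured_watts / 1000

    print(f"Nameplate estimate: {nameplate_kw:.1f} kW per rack")   # 15.0 kW
    print(f"Measured estimate:  {measured_kw:.1f} kW per rack")    # 7.0 kW

Plan a facility around the first number and you will build power and cooling capacity that the second number never uses.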

Like most people, I can't resist playing Nostradamus and trying to predict what will happen.  My view is that the economics of a 5-15 kW sweet spot are a natural market force keeping densities below 15 kW.  I've talked to data centre operators who won't fully populate racks in order to stay under a 10 kW limit – yes, they purposely leave U space empty.

Additionally, there is a reinforcing trend coming from Intel around performance per watt.  The focus seems to be on reducing, or at least maintaining, power consumption while driving performance higher.  A cursory analysis of the performance per watt of an Atom chip shows the strides being made.  If I were a betting man, I would bet that this technology from low-end mobile chips will migrate to higher-end chips… and drive densities down.

We also found a web page with an analysis of power consumption per U: http://datacenterpulse.org/blogs/jan.wiersma/where_rack_density_trend_going

…so maybe we have been driving toward lower density per rack all along?

I don't deny that there are applications, such as HPC, which may require higher densities for IT performance reasons.  However, in most cases, I would argue that data centre operators should look closely at, and perhaps even enforce, a policy limiting the densities they will allow in their data centre.

So I am not Nostradamus, and from what I can tell he wasn't very good at predicting the future either.  (Check out this link if you are bored.)  However, I think there are some compelling arguments in favour of lower-than-projected densities.

And if you believe this, there are some practical (and I think quite fascinating) implications for how data centre designs and specifications can be simplified… more to come on this topic in future blogs.

Guest blog by Kevin Brown, VP Global Data Center Strategy and Technology, IT Business, Schneider Electric, United States

Wednesday, 24 September 2014

Google and Apple Battling over Encryption

If there were ever an example of how easily the tech wars can become petty, it is the recent announcements from Google and Apple regarding encryption.  The dual announcements are meant to alleviate consumer concerns about security by revealing new encryption methods that both companies will be offering on upcoming devices.

Apple devices running iOS 8 are already equipped with the new technology out of the box.  Users will not have to bother turning it on, nor will they have to worry about Apple having access to data stored on a tablet or iPhone.  Apple says it will not hold the encryption key, which prevents the company from gaining access to the data.

Google's plan is similar, though it will not kick in until the next version of the Android operating system.  As with Apple, it will be enabled by default.  Customers purchasing new Android devices will enjoy out-of-the-box encryption with no outside access to their personal information.

It is important to note that the new encryption from both companies applies only to data stored on mobile devices.  It is not applicable to information stored in the cloud or on servers located at third-party data centres.  Both government officials and hackers can still access those other locations in the same way they always have.

Security, Privacy or Marketing?


Sceptics question whether the encryption moves by Apple and Google are really about security or privacy.  After all, Apple already has a very public policy stating that it does not attempt to retrieve data from mobile devices in order to fuel marketing efforts.  As Apple's Tim Cook wrote in a blog post explaining the new encryption, “we don't 'monetise' the information you store on your iPhone or in iCloud.”

It could be that the move is more about marketing than anything else.  By making this one small enhancement to their encryption, both Google and Apple can claim they are looking out for the customer.  It is good for reputation management when stories such as the NSA spying scandal break.  In the end, however, the new encryption model is likely to do very little to enhance safety and security for mobile devices.

Any benefit realised from the updated encryption would come by way of making it more difficult for hackers to steal personal information; even so, encryption will not make theft impossible, just harder.

Government Intrusion


Both Google and Apple were quick to mention government intrusion in their remarks.  The mentions are the strongest evidence suggesting that enhanced encryption is being used for marketing purposes, if not designed solely for that purpose.  Both companies are letting it be known that they cannot hand over personal data located on a mobile device to government officials because they will not have encryption keys.

Were the announcements from Google and Apple really necessary?  Probably not.  They could have simply implemented their encryption strategies without making a big fuss.  Nevertheless, in the tech wars, image is everything.  If they have to create a story where none exists, they have no problem doing so.




Thursday, 18 September 2014

Facebook Runs Stress Test – Shuts Down Entire Data Centre

We have a rare piece of data centre news to share with you today courtesy of the people over at Facebook.  Apparently, the social media giant recently ran a system-wide stress test by completely shutting down an entire data centre for a full day.  The idea was to see how the remainder of the company's systems would respond in the event of a complete data centre failure.

Facebook is not saying which of its data centres was shut down for the test.  That said, it operates facilities in Sweden and five US states: California, Iowa, North Carolina, Oregon and Virginia.  Had the shutdown resulted in a massive failure, it could have made for one of those data centre events that make headlines the world over; however, the apparent success of the test has largely kept it off the news radar.

According to Facebook's global head of engineering Jay Parikh, the company did undertake a few ‘fire drills’ to prepare for the eventual test.  He spoke about the plans and results of the test at a recent San Francisco conference.  He told the assembled crowd that, when the day finally came to pull the plug, they shut down an entire region by turning off tens of megawatts of power.

All signs indicate that the test was successful.  Although there were some minor glitches, all of the important components of Facebook remained active around the world.  The most users may have noticed is that some of their favourite applications were not working.  By and large, however, any disruption was not significant enough for people to suspect there was a problem.

With the test now behind it, Facebook has developed a number of improvements that will be implemented in the future.  The improvements are designed to address the shortcomings observed during the shutdown.  Parikh says Facebook was pleased enough with the results that it is planning to do more stress tests in the future.

Will Others Follow?


To our knowledge, the complete shutdown of a major data centre for the purposes of conducting a stress test is not normal industry practice.  The fact that Facebook was willing to do it demonstrates the confidence it has in the integrity and management of its data systems.  We would be lying if we said we were not impressed by it.

Parikh told the San Francisco conference that Facebook's philosophy is to embrace both risk-taking and potential failure.  He encourages the company's engineers to take risks in order to push the company further, as long as those risks are not unnecessarily reckless.  Company officials believe this to be necessary in order to remain a dominant industry player.  At this point, we have no reason to argue.

Our question now is one of whether or not other Internet giants will follow suit.  Could Facebook have started a landslide of major data centre shut downs in the future?  We will know soon enough…



Monday, 15 September 2014

Yahoo! Reveals US Government Threats

The unsealing of important documents relating to the NSA data spying scandal reveals details that support claims from Yahoo! and others about threats made against them by the US government.  Yahoo! has said that the National Security Agency (NSA) threatened them with fines of up to $250,000 per day for failing to hand over data.

Since news of the scandal broke last year, we have learned that the NSA siphoned data from nine different US firms including Yahoo!, Facebook, Microsoft, Google and others.  Now Yahoo! is petitioning the courts for further publication of details relating to the scandal that have not yet been released.

Yahoo! general counsel Ron Bell said his company was pleased with the judge's decision late last week to unseal nearly 1,500 pages of documents that were previously considered classified.  He said making information public was ‘an important win for transparency’.  None of the other affected companies had any official comments.

Nothing New


One of the most important revelations to come from the unsealing is that the NSA spy programme is nothing new.  The agency has been gathering personal information from US data centres ever since a 2007 change in the law broadened its data-gathering authority.  What was revealed by now-infamous NSA contractor Edward Snowden last year has apparently been going on for some seven years.

Yahoo! maintains that it originally did not comply with NSA orders because it believed them to be unconstitutional; however, it lost its initial court battle when the NSA pressed the issue.  Yahoo! has since complied with NSA requests, though reluctantly.  The company says it continues to look for ways to fight what it still believes are unconstitutional intrusions by the NSA.

Big Brother Watching


It is important to continue keeping the story at the forefront of technology news.  What is happening with the US government is a clear example of big brother watching every move citizens make.  When the government has unfettered access to data communications on the basis of national security, nothing is sacred.  It needs only invoke a perceived danger to invade the privacy of anyone it chooses.

Make no mistake: the spying scandal is not limited to the US.  The NSA was caught trying to gather information here in the UK and elsewhere in Europe, but it goes further than that.  If the US is engaged in such activity, it is reasonable to expect other world governments are doing the same.  Perhaps the only difference between Europe and the US is that no one has blown the whistle on us yet.

Data communications are supposed to be private unless there is a legitimate and provable threat to worry about.  In the absence of any provable threat, a government demanding private data for the purposes of protecting against a perceived future threat is dangerous in and of itself.  Under such conditions, we all become threats that need to be dealt with in whatever way the government sees fit.  This is not good by any measure.



Thursday, 11 September 2014

UK Government Sets Forth Ambitious Plan for Global Climate Deal

The UK Government has staked out its position for the climate change summit scheduled for Paris next year.  Through a recently released document, officials have made it clear that they are pursuing an ambitious plan to make a global climate deal a reality at the 2015 talks.  The document, a publication entitled Paris 2015: Securing Our Prosperity Through a Global Climate Change Agreement, explains why a global deal is necessary and how the entire world can benefit from it.

Document authors say that taking action now to reduce greenhouse gas emissions will help the world avoid the most severe consequences of climate change.  The document makes the case that all countries, both large and small, would benefit from greenhouse gas reductions, and it puts forth language intended to make implementing those changes more politically appealing.

According to Energy and Climate Change Secretary Ed Davey, going green and operating a profitable business are not mutually exclusive.  He has gone on record as saying it is possible to do both.  That is one of the messages he will be taking when he attends the conference next year. 

Davey also claims that both governments and businesses support the latest climate change initiatives saying, “There is an increasing political will from big and small countries alike to tackle climate change both through domestic action and in the international negotiations.  And it is not just governments who want a deal, there is wide spread support from businesses, NGOs and campaign groups both in the UK and internationally.”

If Davey and his supporters have their way, the Paris conference will result in a legally binding treaty committing all participating countries to enact specific changes to curb climate change.  The UK will expect commitments from all participating countries, though the commitments will differ from one country to the next.  Whether or not every country represented at the summit will agree to the deal remains unknown.

Powering the Future


While the rationale for a climate change deal hinges mainly on climate change itself and its relation to the use of fossil fuels, much of the impetus for what is now being discussed can be attributed directly to the data centre industry.  Few other industries have such enormous power needs.  Moreover, as the world becomes ever more connected, new data centres are needed in greater numbers around the world.

In order for modern society to push forward, we must ensure the commercial success of data centres and other similar enterprises.  For that to happen, we need to think seriously about how future energy needs are going to be met.  The UK Government is determined to meet those needs in a way that reduces fossil fuel consumption and limits greenhouse gas emissions.  The UK will lead the way in Paris and beyond; the only remaining question is who will follow.  We will know the answer next year…



Friday, 5 September 2014

Geodesic Dome Data Centre Goes Outside the Norms

When Oregon Health and Science University (OHSU) IT guru Perry Gliessman set about designing and building the university's new data centre, he knew he had a brilliant plan that was well outside the norms of data centre construction.  Rather than building a concrete box as so many others do, he decided to make his new facility a geodesic dome.  Convincing partners that his design was a good idea was no easy task.

Geodesic domes have been used for all sorts of commercial applications, ranging from planetariums to sports facilities. However, until Gliessman came up with his design, the idea of a geodesic dome data centre was virtually unheard of.  For Gliessman, the shape and structure of the geodesic dome was perfect for his needs.

According to Gliessman, his number one concern was being able to provide economical cooling for a data centre destined for extreme power consumption.  He wanted to employ free cooling as much as possible, but that requires an incredible amount of airflow.  It turns out that the geodesic dome is the perfect shape for producing the kind of airflow Gliessman was after.  His design has proved more than capable of cooling high-performance servers averaging 25 kW per rack.

Gliessman designed a system of air intakes, fans and louvres that work together to constantly move the air.  The fans bring cool air in from outdoors, funnelling it into a series of rooms and corridors within the dome.  From there, the air moves through the IT space, beginning at floor level and rising through the interior spaces of the dome.  The hot air then escapes through the louvres to complete the cycle. Gliessman even designed a way to capture some of that hot air for recycling through the building.

The success of the design lies in the fact that the IT space has no ceilings or hard corners, so very little air is trapped anywhere in the space.  It flows, rises and escapes in a continual cycle that is constantly changing the air.  The facility's equipment can also regulate the mix of cool exterior air and recycled hot air to adjust for seasonal temperature fluctuations.

The most amazing thing about this system is that it requires none of the traditional elements of a modern data centre.  There are no raised floors, chillers, air ducts or air handlers.  Everything is handled by fans and physics.
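
As a rough illustration of what free cooling at that density implies, the sketch below estimates the airflow needed to carry away 25 kW per rack.  The air properties and the temperature rise across the rack are assumptions chosen for the example, not figures from OHSU's design.

    # Rough airflow estimate for free cooling a 25 kW rack (assumed values).
    heat_load_w = 25_000       # per-rack load cited in the article, in watts
    air_density = 1.2          # kg/m^3, approximate for ambient air
    specific_heat = 1005.0     # J/(kg*K) for air
    delta_t = 12.0             # assumed temperature rise across the rack, in K

    flow_m3_s = heat_load_w / (air_density * specific_heat * delta_t)
    flow_cfm = flow_m3_s * 2118.88   # convert m^3/s to cubic feet per minute

    print(f"~{flow_m3_s:.1f} m^3/s (~{flow_cfm:.0f} CFM) per rack")  # ~1.7 m^3/s, ~3660 CFM

Numbers of that order explain why Gliessman cared so much about an unobstructed path for air to enter, rise and escape.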

Working with Vendors


Next to the design itself, the biggest challenge for Gliessman was convincing his vendors to think outside the box.  “Most people have embedded concepts about data centre design,” Gliessman told Data Center Knowledge, “and, like all of us folks, [they] are fairly religious about those.”  The only way Gliessman could convince them was to prepare the data to prove his ideas beforehand.  Through a lot of research and extensive modelling, he was able to pull it off.