Wednesday, 28 January 2015

Data Centre Energy: Is Efficiency More Important Than Source?

A great deal of time and energy is being put into developing renewable energy sources capable of driving the data centre industry of the future.  This is not without good reason.  Indeed, the power and cooling needs of the modern facility are such that the industry is now among the largest combined consumers of energy in most Western markets.  Figuring out ways to produce more renewable energy is paramount to ensuring that global networking remains a viable enterprise.

The question then becomes how important the source of energy is in relation to the productivity it generates.  It is a fascinating question in light of an excellent piece published in the January edition of The Economist.  The unnamed author of the piece delves deeply into the idea of energy efficiency as the ‘invisible fuel’ of the future.

In simple terms, the author asserts that the best way to meet our ever-increasing energy demand is to eliminate waste.  It is a reasonable thesis on which to build the argument that the sources of energy we use to power the 21st century economy are less important than the need to use energy efficiently.  When energy is used as efficiently as possible, generation needs are reduced in relation to productivity.

The Economist offers, as an example, the thousands of housing units now being built by the UK's Circle Housing.  By knocking down older homes that are not energy-efficient and replacing them with new ones, the organisation is able to reduce annual energy bills from £2,000 to somewhere in the region of £450.  Bills can be further reduced to around £350 by using prefabricated components to build extremely efficient housing.
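
As a rough illustration of the scale of those savings, the short Python sketch below works out the percentage reductions implied by the bill figures quoted above; the figures are the article's, and only the percentages are derived from them.

# Back-of-the-envelope savings based on the annual bill figures quoted above (GBP).
old_bill = 2000       # typical bill for an older, energy-inefficient home
rebuilt_bill = 450    # after replacement with a new energy-efficient home
prefab_bill = 350     # using prefabricated, highly efficient components

def saving(before, after):
    """Percentage reduction from 'before' to 'after'."""
    return (before - after) / before * 100

print(f"Rebuilt homes:       {saving(old_bill, rebuilt_bill):.1f}% lower bills")   # 77.5%
print(f"Prefabricated homes: {saving(old_bill, prefab_bill):.1f}% lower bills")    # 82.5%

On those numbers, the rebuilt homes cut annual bills by about 77.5%, and the prefabricated versions by 82.5%.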

The author points out that these energy-efficient buildings accomplish a couple of goals.  Firstly, they use less energy through more efficient building materials and designs.  Secondly, they are capable of generating power - through solar and other means - that can be returned to the grid.  The housing is a perfect example of combining waste-saving technologies with the ability to generate power on a small scale.

Creating Commercial Adaptations

Organisations such as Circle Housing have already demonstrated that they can design and build structures capable of eliminating nearly all energy waste.  Now the challenge is to adapt those designs to commercial structures that tend to use and waste more energy.  Data communications facilities are prime targets.

As The Economist explained, the architects and designers of the past were rarely interested in investing extra funds in energy efficiency.  They were fine with wasting energy as long as they could direct design budgets to other things they found more interesting.  That kind of thinking had to change and, fortunately for everyone, that change is already taking place.

The energy roadmap of the future begins where efficiency and renewable power generation intersect.  We will not succeed if either component is neglected in favour of the other.  We must create new data centre facilities that are as efficient as humanly possible if we are ever to realise the goal of making renewable power sources the de facto standard.



Thursday, 22 January 2015

Cloud Providers Pushing the Outsourcing Industry Forward

If IT outsourcing is an indicator of technology sector health, things appear to be very good in Europe.  According to Information Services Group (ISG), the numbers for both IT outsourcing and business process outsourcing were up in 2014 across Europe, Africa, and the Middle East.  Pushing the outsourcing industry forward is increased competition from cloud providers looking to expand their business models.

ISG says that IT outsourcing spending in the three regions was up some 7% in 2014, to £7.37 billion, while global spending on IT outsourcing reached £15.2 billion.  That marks the regions' biggest year for IT outsourcing in the last eight years.  According to Computer Weekly, German companies Vorwerk and Lufthansa are excellent examples of large companies awarding sizeable infrastructure contracts.
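
For context, a quick calculation using only the ISG figures quoted above shows the share of global IT outsourcing spend that the three regions account for; the variable names are ours, purely for illustration.

# Share of global IT outsourcing spend accounted for by the regional figures above.
emea_spend = 7.37     # 2014 spend across Europe, Africa and the Middle East (GBP billions)
global_spend = 15.2   # 2014 global IT outsourcing spend (GBP billions)

print(f"Regional share of global IT outsourcing spend: {emea_spend / global_spend:.0%}")   # roughly 48%

That works out at a little under half of all IT outsourcing spend worldwide.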

Computer Weekly says Vorwerk signed a six-year deal for IT infrastructure with Cognizant, while the German airline signed a seven-year contract with IBM worth some £950 million.  It is evident from these contracts that IT outsourcing is one of the hottest commodities in the current technology industry.

Competing with the Cloud

Where the IT outsourcing market goes in 2015 is anyone's guess, yet one thing is certain: companies such as IBM and Cognizant will be competing with cloud service providers for more contracts this year.  Cloud providers already have extensive experience in providing IT services on a small scale, and some of the larger players are being encouraged to pursue bigger contracts offering complete outsourcing solutions.

It should take cloud providers some time to become firmly established as IT outsourcing and business process outsourcing providers; however, there is little doubt that it will happen.  This makes it even more important for traditional outsourcing providers to solidify their positions so that they can respond appropriately to the increased competition.  Could it be that we are on the verge of an outsourcing war between established providers and competitors from the cloud?  Indeed, it looks as though we are.

Data Centre and Colocation

Experts say the most important competition in the coming years will be in the infrastructure-as-a-service (IaaS) arena.  This could bode well for the data centre and colocation industry, especially among companies willing to invest significantly in server and storage hardware.  A company operating several established data centres might be able to parlay some of its revenues into developing a successful IaaS model perfectly suited to outsourcing.

Falling hardware prices across the board are making all of this possible.  When equipment is cheaper, it is much more profitable to invest revenues in new servers and storage solutions that can be used for any number of purposes.  Where one data centre operator may seek to expand hosting options, another may decide to get its foot in the door of IT outsourcing.  We expect more will choose the latter option this year than we have seen in the past.

Few would have thought a few years ago that the cloud sector would be pushing outsourcing forward; nevertheless it is.  Now we will have to wait and see who comes out on top for 2015…



Tuesday, 20 January 2015

How Secure Is Your Rack?

Data centre security is a big issue, especially for co-location centres hosting multiple racks for multiple, often competing, clients.  Given the number of people passing through a data centre on a near-daily basis, poor rack-level security creates unnecessary risk.

Rack security
While the government, banks and police authorities now demand Intrusion Level 3, 4 and even 5 for anti-terrorist systems, the vast majority of data centre environments are failing to impose adequate controls over physical access to individual data centre racks.

Most co-location centres rely on the use of locked cages to separate the IT equipment of each client. But how robust is this model?  What happens when an engineer requires access to a server or rack?  Simply unlocking the cage provides access to the entire suite.  If a problem arises, how can the data centre manager determine the what, when and who?

Change control
At best, racks are secured only with standard handles using a manual key, which can easily be broken or bypassed.  These locks provide minimal protection, and standard keys are simply not practical: key management is time-consuming and the risk of loss is high.

Instead, organisations can deploy network-enabled electronic keypads that can be opened remotely or via HID proximity card access.  The model is inherently flexible, enabling organisations to impose different levels of control that reflect the risk or data value of each client or individual rack.
At the simplest level, cards can be configured for specific periods of time, e.g. to cover the visit of an engineer.  At a higher level, where two people are required to access the rack, it will only unlock when two approved access cards are presented simultaneously.  The system will automatically raise an alert to security if the doors are opened without approval, or if they are left open and unlocked after the engineer has completed the work.

To create an even more robust model, access can be linked to the change control system: no rack can be opened unless the correct change control request has been made and authorisation received.  Indeed, in some cases organisations do not even permit the co-location provider to enter the racks and undertake any work without change control in place; if access is required, a request is made by telephone, a change control is issued for a specific time of day and a named individual, and the door is opened remotely.
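
To make the combined rules concrete, the short sketch below shows how the dual-card and change-control checks described above might fit together in software.  It is purely illustrative: the function, field and card names are our own assumptions, not any particular vendor's access-control API.

# Hypothetical sketch of the rack-access decision logic described above.
# Names and data structures are illustrative only; not any vendor's actual API.
from datetime import datetime

def may_open_rack(rack_id, presented_cards, approved_cards, change_requests, now=None):
    """Unlock only if two approved cards are presented together AND an
    authorised change-control request covers this rack at this time."""
    now = now or datetime.now()

    # Dual-card rule: at least two distinct approved cards presented simultaneously.
    valid_cards = {card for card in presented_cards if card in approved_cards}
    if len(valid_cards) < 2:
        return False

    # Change-control rule: an approved request must cover this rack right now.
    for request in change_requests:
        if (request["rack_id"] == rack_id and request["approved"]
                and request["start"] <= now <= request["end"]):
            return True
    return False

# Example: an engineer's visit booked through change control for rack "R42".
window = {"rack_id": "R42", "approved": True,
          "start": datetime(2015, 1, 20, 9, 0), "end": datetime(2015, 1, 20, 11, 0)}
print(may_open_rack("R42", ["card-A", "card-B"], {"card-A", "card-B"}, [window],
                    now=datetime(2015, 1, 20, 9, 30)))   # True

In practice, the same decision point could also trigger the alerts mentioned earlier whenever a door is opened without a matching approval or left unlocked once the work is complete.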

Cost benefits
Rack-level security also releases a significant amount of space.  Most co-location centres use cages to provide separation between client installations but, in addition to being unattractive, these cages take up a lot of space that could be generating additional revenue.

Opting for rack-level security creates a more flexible data centre model that enables co-location providers to be far more agile in the way racks are reallocated to new business.  Furthermore, combining network-enabled security with video surveillance reduces the costs associated with physical security guards.

Guest blog by Jason Preston, Director, 2bm Ltd
T: 0115 925 6000




Thursday, 15 January 2015

Data Centre Spending Expected to Rise 1.8% in 2015

The latest research from Gartner indicates that spending in the data centre sector should rise by as much as 1.8% over the next 12 months.  Although the increase is smaller than previously projected, it is still good news for those in the IT industry.  Increased data centre spending means more equipment, more infrastructure and more opportunity.

Gartner projects global IT spending to be somewhere in the region of US $3.8 trillion for 2015.  Of that total amount, it expects some $143 billion to be spent on data centre systems.  Previous projections suggested an increase of 3.9% in global IT spending this year; current projections are closer to 2.4%, representing a reduction of 1.5 percentage points.
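
For anyone who wants to see the arithmetic behind those Gartner figures, here is a minimal sketch using only the values quoted above (the variable names are ours, purely for illustration):

# Simple arithmetic on the Gartner figures quoted above (values in US dollars).
total_it_spend = 3.8e12       # projected global IT spending for 2015
data_centre_spend = 143e9     # projected spend on data centre systems

share = data_centre_spend / total_it_spend * 100
print(f"Data centre systems' share of IT spend: {share:.1f}%")   # about 3.8%

previous_growth, revised_growth = 3.9, 2.4   # projected growth rates (%)
print(f"Downward revision: {previous_growth - revised_growth:.1f} percentage points")

On those figures, data centre systems account for a little under 4% of total IT spend, and the revised growth forecast sits 1.5 percentage points below the previous one.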

Gartner explains that the rising US dollar and an improving economy are affecting the numbers on paper; however, when accounting for currency movements, the real reduction in predicted global expenditure is only 0.1%.  In other words, Gartner expects a very healthy year for the IT sector in general and the data centre industry specifically.

In addition to strong growth for data centre equipment and infrastructure, December research from Mordor indicates a very strong future for both colocation and enterprise software.  They expect the worldwide colocation market to grow to as much as $45 billion over the next four years.  Spending on enterprise software could go as high as $335 billion over the same period.  Both projections are good for the IT services sector.

Healthy IT, Healthy Economy

Those of us in the IT industry should greet the news with plenty of optimism.  Any time the IT market is healthy, it provides an undercurrent for a healthy economy across the board.  Why?  Because so much of the modern economy is linked to global Internet use.  Without the proliferation of the World Wide Web, the modern marketplace would look vastly different; a healthy IT industry therefore means that the global network now sustaining modern business will continue moving along.

The increased spending on data centres in 2015 is likely to be primarily focused on building new centres and upgrading existing ones in order to keep up with increased demand.  As we move ever more quickly into the age of connectivity, older facilities will simply not have the muscle to compete.  These will be upgraded or replaced entirely.

Another area to keep an eye on is green technology for the IT sector.  This year should be a big year for green energy, more efficient cooling systems and projects that are able to harness data centre heat for reuse in other ways.  This could be the year that we see the development of a completely self-sustaining data centre that utilises renewable energy sources while also contributing to the grid.

It is an exciting time to be part of the IT industry.  We look forward to seeing which advancements will make our industry better, and which old technologies and methodologies are left behind.  What we do in 2015 will set the stage for many years to come.



Wednesday, 7 January 2015

Australian Internet Outage Blamed on Hot Weather

Thousands of Internet users in Australia were without Internet access for up to 6½ hours after extreme heat forced the country's second-largest ISP to shut down its Perth servers.  The outage occurred on 5 January 2015, with external temperatures at the time in excess of 44°C.

According to chief technology officer Mark Dioguardi, the iiNet data centre in Perth began experiencing problems with its primary cooling system, forcing a switch to a secondary system to keep servers cool.  When that system also failed, the decision was made to shut down some of the servers as a precautionary measure.  Internet access and IT services were disrupted for about 2% of iiNet's total customer base.  Needless to say, those customers were not happy.
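
The reported sequence (primary cooling fails, the load moves to a secondary system, and when that also fails some servers are shut down as a precaution) can be summarised in a few lines of illustrative code.  This is purely a sketch of the decision path described above, not iiNet's actual control logic.

# Illustrative sketch of the failover sequence described above; purely hypothetical.
def respond_to_cooling_status(primary_ok, secondary_ok):
    """Return the action a facility might take as its cooling systems fail."""
    if primary_ok:
        return "normal operation on the primary cooling system"
    if secondary_ok:
        return "switch the load to the secondary cooling system"
    # Both cooling systems are down: shed load before hardware overheats.
    return "shut down some servers as a precautionary measure"

print(respond_to_cooling_status(primary_ok=False, secondary_ok=True))
print(respond_to_cooling_status(primary_ok=False, secondary_ok=False))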

Frustrated customers took to social media to make their thoughts known; however, many seemed resigned to the fact that unusual occurrences such as this happen regularly in Australia.  It seems to be part of living there.  We assume iiNet customers are keeping their fingers crossed that the company will make the necessary improvements to its infrastructure to prevent future shutdowns.

Back here in the UK, the same type of scenario is unlikely to happen.  For starters, we rarely see temperatures as high as those recorded in Perth; we also have the added benefit of infrastructure developed with built-in safeguards to protect against our reliably unpredictable weather.

Internet Service Providers' Association (ISPA) secretary general Nicolas Lansman told the BBC, “Data centres and networks are designed with resilience in mind.  Whilst we wouldn't expect 44 degrees in the UK, ISPs and data centre operators are very much prepared for the unpredictable British weather.”

Greatest Challenge Is Being Prepared

The data centre industry faces many challenges every single day.  None of those challenges is as difficult as the challenge of being prepared for any potential service-interrupting events.  Seriously dangerous weather notwithstanding, extreme heat and cold are things that have to be planned for in the early stages of infrastructure design.  What happened in Perth is understandable, given the nature of Australia's summertime heat, but you can bet iiNet will be working overtime to make sure it never happens again.

Today's Internet user is an individual with very little patience.  Among young people especially, there is little experience of having to wait days or weeks to communicate outside the local area.  This lack of experience makes 6.5 hours without Internet access seem like an eternity to younger consumers who have no idea what life was like before the Internet age.  As for businesses, a 6.5-hour outage has become simply unacceptable, given how heavily we rely on Internet access these days.

This is a perfect example of how competing for the modern Internet user is all about uptime and reliability, and of how these factors are driving the development of Internet technology well beyond what we could have envisioned even a decade ago.  The challenges we now face are keeping the data centre industry on its toes as we all seek better, faster and more reliable services.



Monday, 5 January 2015

US Energy Efficiency Bill Dies in Senate

During the final hours of the 113th Congress, the US Senate attempted to finish some last-minute legislation, including an energy efficiency bill that proponents claimed would make the federal government a more responsible user of energy in its data centres.  The bill was blocked on a procedural motion by retiring Senator Tom Coburn of Oklahoma.

Although specific details of the bill were never made available to the public, the most important provision called for Washington to reform the way the Department of Energy (DOE) compiles and reports real energy consumption among government data centres.  Right now, the data used as the benchmark comes from an EPA report dating back to 2007.  Not only have things changed over the last seven years, but past investigations into EPA practices show that many of the data centre reports used to compile the 2007 numbers cannot be trusted.

According to the EPA numbers, government data centres consumed roughly 10% of the 61 billion kWh of energy used by the entire industry in 2006.  The EPA used that number as a basis for forecasting future energy consumption among government facilities.  In fact, the EPA said the government's energy consumption would double by 2012.

The problem with the EPA report is that a good number of the government data centres had no way of reliably reporting energy consumption.  For example, it was estimated that half the power used by data centre facilities was dedicated to power and cooling infrastructure rather than to actually running IT equipment; however, without an accurate benchmark for measuring real consumption, some data centre officials merely guessed.
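
For reference, that ‘half the power’ estimate maps directly onto the industry's standard Power Usage Effectiveness (PUE) metric: if IT equipment accounts for only half of a facility's total draw, the PUE works out at roughly 2.0.  A minimal sketch of the calculation, using illustrative numbers of our own:

# PUE (Power Usage Effectiveness) = total facility energy / IT equipment energy.
# If half of a facility's power goes to cooling and power infrastructure,
# the IT equipment accounts for the other half, giving a PUE of about 2.0.
def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

print(pue(total_facility_kwh=100.0, it_equipment_kwh=50.0))   # 2.0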

Why Updates Are Necessary

Senator Coburn officially ceased being a senator at the end of 2014.  With him in retirement and the 114th Congress now underway, it is assumed that the energy efficiency bill will be reintroduced to the new Congress.  It is expected to pass and go to the President's desk.

Regardless of one's political persuasion, the essence of the bill is important on a number of fronts.  First and foremost, the 2007 numbers are still being used for everything from setting sustainability goals to establishing environmental policy.  Working with inaccurate numbers will achieve less-than-expected results at best. It could even lead to detrimental decisions in a worst-case scenario.

Coburn has not said why he blocked the bill on a procedural motion, but it's widely speculated that he could not reach a compromise with his Democratic counterparts regarding how money allocated for the bill would be spent.  Coburn's reputation as a fiscal conservative has led him to stand in the way of multiple bills that were good in principle, but lacked the proper spending controls to keep costs from spiralling out of control.

Now that the new year has begun and the new Congress has convened, it will likely move on a broader form of the legislation.  The only question is whether the energy consumption reporting by the US government will be any more accurate.