Friday, 31 January 2014

London Test Achieves World's Fastest Broadband Speeds

In a recent test of technology developed under a joint venture between BT (British Telecom) and Alcatel-Lucent, the world's fastest broadband speeds were recorded at 1.4 Tbits per second.  The incredibly fast speeds allowed researchers to send the equivalent of 44 uncompressed HD films over a 255-mile link – in just one second!
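
As a quick back-of-the-envelope check of what that headline figure implies, the short Python sketch below converts 1.4 Tbit/s into bytes and derives the per-film size implied by the claim (the per-film figure is inferred from the report's numbers, not stated in the source):

```python
# Rough check of what the 1.4 Tbit/s figure implies.
bits_per_second = 1.4e12   # 1.4 terabits per second
films = 44                 # "44 uncompressed HD films" per the report

total_gigabytes = bits_per_second / 8 / 1e9   # data moved in one second, in GB
per_film = total_gigabytes / films

print(f"Data moved in one second: {total_gigabytes:.0f} GB")
print(f"Implied size per film:    {per_film:.1f} GB")
# Roughly 175 GB in one second, or about 4 GB per film at the stated rate.
```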

The test was conducted between the BT Tower in London and a location in Ipswich, using existing infrastructure.  The fact that the two companies were able to generate such high speeds without building anything new is as important to data communications as it is impressive.  Achieving speeds of 1.4 Tbits per second without the need for new infrastructure opens the door to great opportunities.

The results of the BT/Alcatel-Lucent test are especially important to web hosting companies and Internet service providers.  The technology could eventually allow them to offer faster service without the need for large capital investment.  Nevertheless, before you get too excited, it will be a while before the technology makes it into commercial applications.

BT and Alcatel-Lucent began working on their project in response to the ever-growing need for higher bandwidth.  According to the BBC, that demand grows by roughly 35% every year, dictating the need for faster speeds over existing optical fibre.  Because alternative technologies require new infrastructure, they cannot meet the growing demand quickly enough.  For now, companies like BT and Alcatel-Lucent need to do more with what they already have at their disposal.

Everyone Is Doing It


While BT and Alcatel-Lucent may be the first to achieve such impressive data speeds, they are by no means the only ones working on the problem.  Companies like Google and Virgin Communications are pushing the boundaries as well.  Today's networking and cloud computing environments demand no less.

Right now, the main question seems to be whether to stick with fibre optic technology and try to make it better, or to shift to new technologies like laser.  Given the history and cost of building new infrastructure, it seems reasonable that technology companies will drain every last ounce from optical fibre before moving on to other things.

That's good news in the sense that there is still more to be gained from the infrastructure already in place.  The prospect of faster speeds without enormous financial investment lets businesses grow and expand their networking capabilities fairly easily.  It is definitely a huge step in the right direction.

As companies like BT and Alcatel-Lucent learn to pack greater amounts of information into fibre optic networks, they will continue increasing data transfer speeds throughout the UK and the world.  It should be no surprise that London was the site of their most recent test, given the fact that the UK leads the world in high-speed Internet communications.  As long as we continue to make it possible for technology companies to succeed in the UK, there are no limits to what they can achieve.

Sources:


1.     BBC - http://www.bbc.co.uk/news/technology-25840502#story_continues_2

Monday, 27 January 2014

Microsoft Ready to Launch Experimental Data Centre

In the race to be the first major computing company to build a reliable and 'totally green' data centre, it appears as though Microsoft might now have a slight advantage over the competition.  The company recently announced plans to launch an experimental data centre that also doubles as a green power plant.  The facility is scheduled to go online within a month or so.

Microsoft's Sean James recently wrote about the plans on the company's TechNet blog.  Among other things, James identified the site of the new data centre as Cheyenne, Wyoming, in the western United States.  The region's low population density and abundance of natural resources make it a perfect fit.

According to James, all of the data centre’s power and cooling needs will be provided by electricity generated from biogas.  For this particular application, municipal waste will be treated to produce methane for the electrical generation.  James states that nothing in the cycle will be wasted, including the heat the data centre produces.  That heat will be harnessed and returned to the sewage treatment facility to aid in the process of producing more methane.  Any excess electricity can be sold back to the grid.

The actual power generation process involves pumping methane directly into fuel cells that produce the electricity.  Microsoft plans to run the centre for the next 18 months in order to measure how well the system works.  If it is successful, we expect Microsoft to take the centre full-scale shortly thereafter.  They might even use it as a model to build additional facilities.

Increasing Power Consumption


As Sean James explained in his blog post, the experimental Microsoft station is about more than just creating green, zero emission energy for data centres.  It is about keeping up with the ever-increasing power demand presented by networking communications.  That demand continues to grow worldwide.

In the United States alone, power consumption among all data centres equals about 2% of the total energy used throughout the entire country.  Experts expect that percentage to only grow as cloud computing and virtualisation become more widespread.  What's more, new data centre construction continues to accelerate in order to keep up with worldwide demand.

Common sense dictates that the world's data centres cannot keep pulling more and more electricity off the grid without affecting everything else; there is therefore an absolute necessity to develop alternative power sources that allow data centres to be both self-contained and green.  The bonus with the Microsoft site is the real possibility of generating excess power that can be used by other industries or sold for residential applications.


We should note that the Wyoming data centre is not Microsoft's first foray into green energy.  The company has also invested heavily in both wind and hydroelectric power generation; however, the advantages of the biogas project that should be realised during the 18-month testing period could drastically alter where the company puts its future green energy investments.  It should be fun to watch.

Wednesday, 22 January 2014

Power Outage Severely Damages US Data Centre

A New Year's Day power outage in the United States is being blamed for extensive damage done to a data centre operated by the US National Park Service (NPS).  The centre, located in Denver, Colorado, houses nearly all of the data the NPS needs to administer America's national parks and monuments.

According to sources, the power went out at the data centre at approximately 3pm local time.  The uninterruptible power supplies at the facility kept things going for about 30 minutes, but the local power company was unable to get the power restored until 5pm.  In the interim, the data centre ended up shutting down.

The NPS reports that a significant amount of data was corrupted as a result of the shutdown.  When power was eventually restored, an improper restart procedure damaged hardware as well.  No estimate has been provided for when the data centre will be back to normal operations.

For the software portion, the NPS immediately contacted Microsoft, which set about restoring as much data as it could from off-site backups; however, Microsoft said significant portions of the system would need to be rebuilt from scratch.  As for the hardware, it is simply a matter of repairing what can be fixed and purchasing replacements for the remainder.

Backup Generators


Because data centre reports demonstrate how costly shutdowns can be, nearly every facility in the world has backup generators running on diesel fuel.  These enable data centres to remain in full operation for as long as it takes to restore power.  As long as a facility has enough fuel on hand, backup generators can provide power indefinitely.

It seems rather curious that no backup generators were mentioned in relation to the NPS data centre.  One possibility is that there were no generators to speak of, which, by industry standards, would be inexcusable.  Another possibility is that generators were on site but not working properly.

In either case, the proper management of a data centre requires that every precaution be taken to ensure this sort of thing does not happen.  Power outages can never be eliminated entirely, which is exactly why uninterruptible power supplies, backup generators and other off-grid energy sources are necessary.

It remains to be seen how long it takes the US National Park Service to get their data centre up and running again.   The rest of us would be wise to learn a lesson from this incident.  No matter how non-critical a data centre might appear, it will instantly become critical should it shut down due to a power outage.

In a day and age where backup technologies are extremely affordable, this is something that should never happen.  In light of this knowledge, it will be interesting to see whether the management of the NPS data centre is called into question or not.  It will also be interesting to note whether anyone in a position of policymaking will be subject to disciplinary action.

Thursday, 16 January 2014

NSA Spying Possible Without Active Internet Connection

A continually developing story out of the United States is serving as a stark reminder to organisations involved in networking and data communications that security can never be taken lightly.  The story comes by way of the US National Security Agency (NSA) and its global espionage efforts.

According to a report published by the BBC, the NSA has acknowledged using secret technology for global espionage purposes – technology that allows it to monitor computer activity even when targeted machines are not actively connected to the Internet.  While we've known for a while that this technology exists, the new revelations mark the first time a government agency has publicly acknowledged using it to spy on others.

This should be troubling to those of us in the networking and data communications sectors on several fronts: first and foremost is the real potential that the NSA might be monitoring individual computers and data centres right here in this country.  Moreover, where the potential exists, extra vigilance is demanded.

Secondly, if the NSA can effectively use this technology for spying purposes, so can criminals.  According to the BBC, US whistleblower Edward Snowden has already produced documents showing at least 100,000 computers being monitored by the NSA.  It is easy to imagine hundreds, if not hundreds of thousands, of computers also being compromised by criminals who get hold of the technology.

In order to spy without an active Internet connection, the NSA has been installing small radio transmitters via computer circuit boards or USB cards.  As long as the affected computers are running, all sorts of information can be monitored via radio transmissions.

The NSA has tried to quell criticism of its programme by assuring the world it does not use the data collected for any purpose other than identifying and dealing with foreign intelligence threats; however, the record of the current Administration suggests there is little comfort in those assurances.  The public revelations surrounding the NSA's activity need to be taken seriously by the entire networking and data communications community.

How to Respond


At this time, there are no opportunities for a direct response among companies in Europe that may have been affected.  Nevertheless, we can respond indirectly by adjusting security training and management to take into account the actions of the NSA.  Security experts need to be briefed on the technology, how to detect it, and what to do if it is found.  Software tools capable of thwarting the spying efforts of anyone who would use the technology also need to be developed.

The world is becoming an increasingly insecure place in terms of worldwide data communications.  It is the responsibility of those of us involved in the industry to do whatever is necessary to ensure the privacy and security of our clients.  We can never let our guard down where Internet security is concerned; the moment we do, we open the door to individuals and organisations seeking to take advantage of any vulnerabilities they can identify.

Sources:

BBC - http://www.bbc.co.uk/news/technology-25743074

Tuesday, 14 January 2014

NextGen early stage smoke detection reduces risk & prevents DC downtime

Outages, the constant threat

Tasked with providing a 24/7/365 service, data centre managers are well aware of fire as a potential hazard. Providing effective fire protection for a data centre poses some interesting challenges. In a large data centre, up to 40 kW of heat can be generated in some server racks, so an overheat condition can develop extremely rapidly if there is a failure in the heat-removal equipment. If an automatic extinguishing system is activated, the clean-up requirements can be as disruptive as the fire it has put out.

In a recent study of US data centres by the Ponemon Institute, the cost of downtime was found to have increased significantly over the last three years. Each unplanned outage now costs an average of US$7,900 per minute and lasts an average of 86 minutes. A typical incident costs around $690,200 once damage to mission-critical data, the impact of downtime on productivity, damage to equipment, legal and regulatory repercussions and lost confidence among key stakeholders are taken into account.
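
For context, the per-minute cost and the average duration are reported separately in the Ponemon study, so multiplying them is only an approximation of the per-incident figure; a quick check shows the two are nonetheless broadly consistent:

```python
# Cross-check of the Ponemon averages quoted above.
cost_per_minute = 7_900    # US$ per minute of unplanned downtime
average_duration = 86      # minutes per incident

print(f"Per-minute cost x average duration: ${cost_per_minute * average_duration:,}")
# ~$679,400, close to the separately reported $690,200 average cost per incident.
```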

The problem: the ever-present danger of fire

The data centre is not a friendly environment for traditional methods of smoke detection. With high currents, overheating cabling is a very real issue. Computer room air conditioners (CRAC) create very high airflows, often causing stratification and smoke dilution. The large temperature variations between the air in the hot and cold aisles also cause smoke detection issues.

A solution: earliest warning detection from next generation aspiration systems

The ultra-high sensitivity of an aspiration system is the most effective technology to give maximum warning time for the impacted equipment to be taken offline and data to be moved; however, historically, false alarms have been a problem. Typically, the initial response to a fire alarm is to power down the affected equipment, making a false alarm as problematic as a real fire.

Next generation aspiration detection systems, such as FAAST, are able to respond to an overheating cable hours before smoke becomes visible. The latest systems include sophisticated filtration to deliver ultra-high sensitivity (as much as 0.0015% obs/m) without the danger of false alarms.

Systems also now have TCP/IP connectivity, enabling remote interrogation and monitoring. Alerts can be broadcast automatically by email, a real benefit to data centre managers. Modbus protocols are also embedded, enabling seamless integration with DCIM systems without any additional hardware or software and delivering comprehensive communications with the fire control panel.
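
As a rough illustration of the kind of DCIM-style integration this enables, the sketch below polls a single holding register from a Modbus TCP device using only the Python standard library. It is a minimal, generic example: the IP address, unit ID and register number are placeholders, and the actual register map will depend on the detector in question rather than anything described in this post.

```python
import socket
import struct

# Hypothetical detector address and register - consult the device's
# Modbus register map for real values.
DETECTOR_IP = "192.168.1.50"
MODBUS_PORT = 502
UNIT_ID = 1
REGISTER = 0          # e.g. a smoke-obscuration or alarm-state register
REGISTER_COUNT = 1

def read_holding_registers(ip, register, count=1, unit=1):
    """Send one Modbus TCP 'Read Holding Registers' (function 0x03) request."""
    transaction_id = 1
    # MBAP header: transaction id, protocol id (0), length, unit id,
    # followed by the PDU: function code 0x03, starting address, quantity.
    request = struct.pack(">HHHBBHH",
                          transaction_id, 0, 6, unit,
                          0x03, register, count)
    with socket.create_connection((ip, MODBUS_PORT), timeout=5) as sock:
        sock.sendall(request)
        response = sock.recv(256)
    # Response: 7-byte MBAP header, function code, byte count, register data.
    byte_count = response[8]
    return struct.unpack(f">{byte_count // 2}H", response[9:9 + byte_count])

if __name__ == "__main__":
    values = read_holding_registers(DETECTOR_IP, REGISTER, REGISTER_COUNT, UNIT_ID)
    print("Register values:", values)
```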

Standards

The latest aspiration systems are required to meet EN54-20 Class A, B and C requirements. They can protect up to 2,000 m² in Class A applications such as data centres, making them more scalable and better able to address the key concerns in these areas:

• Potential revenue loss
• Smoke, fire and water damage
• False alarms / downtime
• False discharge of extinguishing systems
• High air velocity environments

To find out more

Aspiration detection providers such as Honeywell offer free-of-charge, CPD-approved seminars on the latest aspiration technologies. These are aimed at consultants, specifiers and installers working in the fire system design, architectural, mechanical and electrical contracting and facilities management sectors. Go to www.faast-detection.com to find out more.

DATACENTRE.ME guest blog by Tim Checketts, Honeywell


Monday, 13 January 2014

Spain First Country Where Wind Is the Leading Source of Power

The data still has to be analysed and confirmed, but preliminary reports suggest that Spain is the first country to generate the largest share of its annual electricity from wind.  According to raw data provided by Red Eléctrica de España (REE), it appears that wind power accounted for 21.1% of the electricity put into the Spanish grid for the 2013 calendar year, more than any other single source.

The data must first be verified by Spain's wind energy association, Asociación Empresarial Eólica (AEE), before any official claim can be made; however, should the REE data prove accurate, Spain will be the first country to achieve something nations all over Europe have been striving for: having a renewable source out-produce every other individual source of electricity.

REE says that wind power's 21.1% production just barely eclipsed the 21% generated by nuclear.  Coal was the third-largest generation source at 14.6%, while hydropower came in at 14.4%.  Combined renewable sources provided 42% of the country's electricity needs.

Equally important is the fact that Spain has been able to reduce wholesale power prices by increasing renewable energy generation.  AEE claims the wholesale price on the day of the highest renewable output was €7.69/MWh, compared to a price of €93.11/MWh on the day of the lowest output.  In addition, wind energy appears to have saved the country more than €2.7 billion in foreign energy imports.
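
Pulling the quoted REE percentages together gives a quick picture of the 2013 generation mix; note that the "all other sources" line below is simply the remainder, not a figure from the report:

```python
# 2013 Spanish generation mix as quoted from the REE data above (percent).
mix = {
    "wind": 21.1,
    "nuclear": 21.0,
    "coal": 14.6,
    "hydropower": 14.4,
}
mix["all other sources"] = round(100 - sum(mix.values()), 1)  # remainder, not quoted above

for source, share in sorted(mix.items(), key=lambda item: item[1], reverse=True):
    print(f"{source:18s}{share:5.1f}%")
# Wind edges out nuclear by just 0.1 percentage points - the margin behind the headline.
```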

The Future of Wind Power


Spain and its wind power consortium should be extremely proud of what they accomplished in 2013.  Even if the final analysis shows nuclear did better, they have still come a long way in terms of commercial wind power.  The country needs to take every opportunity to continue harnessing what it has available in order to reduce its dependence on coal, nuclear and other non-renewable energy sources.

In the future, we would expect Spanish power companies to design and build wind generation technologies even more sophisticated than what is being used today.  It may still be a long way off, but the 2013 accomplishment makes it conceivable that Spain will one day be the first country to all but abandon coal and nuclear.

As we've said in previous posts, the high-speed, high-powered data communications of today demand nothing less.  It is critical that we develop as many renewable energy sources as possible if we are to keep up with advances in computer technology without shortchanging the energy needs of the rest of society.  Renewable energy must lead the way.


Perhaps now that Spain has proved it can be done on a large scale, there will be further opportunities to create self-contained data centres and colocation facilities that have no need for power from the grid.  Perhaps we are closer than ever to the day when such facilities generate their own power, to the point of having excess they can return for general use.  If you are a fan of renewable energy, it is an exciting time to revel in what Spain has accomplished.

Thursday, 9 January 2014

Australian Data Centre Offers 100% Renewable Energy Option

In the first offer of its kind “Down Under”, an Australian data centre is now offering customers the option to use 100% renewable energy for their servers.  The option is available thanks to a brand-new AUS $1.2 million solar array installed on the data centre's roof.  The array was designed and installed by Energy Matters, a Melbourne-based company and one of the largest solar technology enterprises in Australia.

The 401 kW solar electric system needs 3000 m² of roof space to generate approximately 550 MWh of electricity annually, according to reports. The company says its solar generation will offset approximately 670 tonnes of carbon dioxide emissions every year; the only downside is that it covers just a fraction of the facility's total power and cooling needs.

Company officials say the solar array will generate only about 5% of the energy needed to run the data centre facility.  As a result, only a limited number of customers will be able to take advantage of the 100% renewable energy option.  For those who do, however, it is a great selling point they can turn around and use for their own marketing purposes.
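
The quoted figures also let us infer a couple of numbers the reports do not state directly, namely the array's capacity factor and the facility's implied annual consumption; the sketch below derives both, treating the 5% figure as exact for simplicity:

```python
# Figures quoted above for the NEXTDC rooftop array.
capacity_kw = 401          # installed solar capacity
annual_output_mwh = 550    # expected annual generation
share_of_load = 0.05       # the array covers roughly 5% of the facility's needs

hours_per_year = 8760
capacity_factor = (annual_output_mwh * 1000) / (capacity_kw * hours_per_year)
implied_annual_load_mwh = annual_output_mwh / share_of_load

print(f"Capacity factor:             {capacity_factor:.1%}")
print(f"Implied annual facility use: {implied_annual_load_mwh:,.0f} MWh")
# Roughly a 15-16% capacity factor and around 11,000 MWh of annual demand.
```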

Energy Matters' Nick Brass was quoted by The Fifth Estate as praising data centre owner NEXTDC for making the move into solar energy. He said the company “has not only taken a leadership role by being the first data centre in Australia to make such a meaningful commitment, its foresight means it has effectively locked-in a sizeable portion of its energy bills for the next 25 years and beyond.”

Another Small Step


Anyone interested in seeing renewable energy advanced around the world will welcome the news regarding the NEXTDC data centre.  Although it may seem like only a small step, multiple small steps add up to very large ones.  What starts out as a project to generate 5% of the data centre's total energy needs may someday grow to 100% as technology allows, and that's very much the goal here.

The ever-increasing speed and raw power of today's computer systems dominate data centre news around the globe.  But never forget that the supercomputers of today and tomorrow need far more energy to operate at maximum capacity, and that energy has to come from somewhere.

Making a concerted effort to develop renewable energy sources to feed future computer needs is not only wise; it is the moral responsibility of everyone involved in the industry.  From data centres to web hosting companies to equipment manufacturers, everyone needs to continue pressing forward into a future where more renewable energy is developed.

As for the Australian project, it is still so new that we are unable to draw any concrete conclusions about its success; however, we trust things will look different a year from now.  We are fairly confident the solar array will deliver as promised and that customers of NEXTDC will be thrilled to take advantage of the option to use 100% renewable energy.


Monday, 6 January 2014

Office 365 Bug Leaves SharePoint Users Vulnerable

Just when you thought you were safe and sound in your SharePoint cocoon, it emerged just before Christmas that there was a major bug in Microsoft Office 365 that could expose your organisation to a serious security breach.  The bug was discovered and tested by Noam Liran, chief software architect at Adallom.  Once the bug was revealed, Adallom produced a video demonstrating how it worked.  Fortunately, Microsoft quickly came out with a patch to solve the problem.

According to numerous sources, the bug resides in the way SharePoint servers authenticate users.  Because both Microsoft Office 365 and SharePoint are essentially cloud-based, anyone can attempt to log into a SharePoint account from just about anywhere.  By using a fake server and an Office document, Liran was able to fool a user's computer into sending a SharePoint security token.  The process turned out to be quite simple.

When a Microsoft Office 365 user attempts to download a document from a SharePoint account, the SharePoint server verifies the authentication credentials of the Office 365 account holder and returns a security token.  That token is only supposed to be sent to a computer already on a SharePoint domain; however, Liran's fake server was able to return the responses the Office 365 computer expected, allowing the token to be sent anyway.
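
To make the nature of the flaw a little more concrete, here is a much-simplified sketch of the trust failure: the client releases its token based on the content of the server's reply rather than on a verified server identity. All names, fields and checks below are invented for illustration and do not reflect the actual Office 365 or SharePoint protocol, or Microsoft's eventual fix.

```python
# Simplified illustration only - names, fields and checks are invented and do
# not reflect the real Office 365 / SharePoint protocol or Microsoft's patch.

TRUSTED_HOST = "contoso.sharepoint.com"

def looks_like_sharepoint(reply: dict) -> bool:
    # Weak check: any server can fabricate these fields in its reply.
    return reply.get("service") == "SharePoint" and "realm" in reply

def release_token_vulnerable(reply: dict, token: str):
    # Vulnerable behaviour: the token goes to whoever answers "correctly".
    return token if looks_like_sharepoint(reply) else None

def release_token_fixed(host: str, reply: dict, token: str):
    # Safer behaviour: the token is released only to a verified, expected host
    # (in practice enforced through TLS certificate and domain validation).
    return token if host == TRUSTED_HOST and looks_like_sharepoint(reply) else None

# A fake server that merely mimics the expected reply obtains the token...
fake_reply = {"service": "SharePoint", "realm": "attacker-controlled"}
print(release_token_vulnerable(fake_reply, "SECRET-TOKEN"))                  # SECRET-TOKEN
# ...but not once the responding host's identity is actually checked.
print(release_token_fixed("evil.example.net", fake_reply, "SECRET-TOKEN"))   # None
```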

In principle, this bug could be used to exploit everything from documents to managed services.  Essentially anything on a SharePoint server could be accessed, downloaded and manipulated in any way the hacker saw fit.  The fact that it was so easy should be alarming to anyone involved in cloud computing and data centre operations.

The Cloud Computing Quagmire


This latest security bug from Microsoft Office 365 perfectly illustrates why the cloud computing concept has not enjoyed the breakout success companies like Microsoft have been hoping for.  For better or for worse, the entire cloud concept is one rife with potential security breaches that far too many business executives and IT professionals are uncomfortable with.

True, there will always be security risks as long as we continue to use networked data communications.  Nevertheless, there is a long-standing axiom that cannot be ignored: the more complex a system, the greater its vulnerabilities.  Cloud computing is, by its very nature, a complex system. Moreover, regardless of whether you are talking about Microsoft Office 365, SharePoint, virtual servers or any other related technologies, there will always be more security vulnerabilities than the experts can keep up with.


That's not to say we should consider abandoning cloud computing.  It is simply to suggest that the cloud is not a one-size-fits-all solution for every business and networking need.  We need to keep as many options open as possible to meet as many needs as possible, and businesses and cloud providers need to cut their cloth accordingly.  In the meantime, security must always be the number one priority of all network communications.