Thursday, 28 November 2013

IBM Testing Disaster Intervention Cloud Solution

IBM and Marist College have announced a joint project – now in the testing stage – that could change the way data centres prepare for impending natural disasters.  The solution involves using a cloud-computing environment to re-provision data centres in minutes, using only a mobile device and a wireless connection.

When a natural disaster is imminent, data centre operators immediately set about the process of re-provisioning.  Simply put, re-provisioning moves data and online applications to servers at another site, out of harm's way.  Unfortunately, the process can take days to complete with the technology currently available.  Often, data centres do not have that much advance warning.

Superstorm Sandy a Good Example


Last year's superstorm Sandy is just one example of what can happen when there isn't enough time to re-provision.  As the storm bore down on the north-eastern United States, it destroyed communications networks and put a number of data centres literally underwater.  A year later, some of those data centres still haven't recovered.  For days and weeks after the storm, millions of businesses and individual consumers in the North-east were left without access to network data communications.

IBM's new system could be a significant game-changer should the test at Marist College prove successful.  The system takes advantage of what is known as software-defined networking (SDN) in a cloud-computing environment.  The software makes it possible to manage data across both physical and virtual networks more efficiently, and to make immediate changes to network resources, even from remote locations.

Disaster Intervention


The system, in theory, changes the game from disaster prevention to disaster intervention.  As soon as data centre operators know a potentially damaging storm is imminent, they could begin re-provisioning right away.  Just bring up the software on any mobile device and they are off and running.  Systems engineers don't even need to be present to get the job done.  What's more, there is no interruption in service for the millions of customers who might be using a series of data centres in the storm's path.
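
IBM has not published the interface it is testing, but conceptually the workflow described above amounts to a single authenticated call from a mobile device to an SDN controller.  The sketch below is purely illustrative: the controller address, endpoint, payload fields and credentials are our own assumptions, not IBM's actual API.

```python
# Hypothetical example only: what triggering an SDN-driven re-provisioning
# job from a phone or tablet might look like. The controller URL, endpoint
# and payload are invented for illustration; they are not IBM's interface.
import requests

CONTROLLER = "https://sdn-controller.example.com/api/v1"   # assumed controller address
AUTH = ("operator", "secret")                              # placeholder credentials

def evacuate_site(source_site: str, target_site: str) -> str:
    """Ask the controller to shift workloads and re-route traffic from a
    threatened data centre to one that is out of harm's way."""
    payload = {
        "action": "re-provision",
        "source": source_site,       # data centre in the storm's path
        "target": target_site,       # safe destination site
        "preserve_sessions": True,   # aim for no interruption in customer service
    }
    response = requests.post(f"{CONTROLLER}/failover", json=payload,
                             auth=AUTH, timeout=30)
    response.raise_for_status()
    return response.json()["job_id"]

# A duty engineer on any wireless connection could then run, for example:
# evacuate_site("nyc-01", "chi-02")
```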

In a world increasingly dependent on virtualisation and cloud computing, the system cannot be ready soon enough.  IBM is working to make the software commercially available sometime next year.  It is likely to become an instant success should it deliver what IBM is promising.

For the record, the SDN lab at Marist College is sponsored by IBM and also provides testing facilities for a number of other important technologies.  The lab is heavily involved in helping IBM develop cloud-computing technologies by providing an environment for testing with real-time scenarios.  IBM has used the lab to develop an open-source SDN controller, to build software that can predict and prevent Internet traffic congestion, and to improve control of video streaming within the cloud.


The success of the lab suggests that the test of IBM's new disaster intervention system will go well.  We are looking forward to the commercial release of the software next year.  Once it becomes mainstream, it could mean the end of service interruptions caused by natural disasters and other unforeseen events.

Monday, 25 November 2013

Japanese Supercomputer Enjoys Boost from Cooling Fluid

The TSUBAME supercomputer from Japan's Global Scientific Information and Computing Centre has long been one of the most powerful supercomputers in the world.  What's more, the research and development conducted at the centre has enabled Japan to be very influential in multiple areas, ranging from technical education to computer-based enterprise.  So what are the chances of TSUBAME being improved upon?

Pretty good, especially if you run a company specialising in cooling fluids for data centres.  Such is the case for Green Revolution Cooling, an American technology company that improved the performance of TSUBAME by submerging the system's CPUs and GPUs in the company's dielectric cooling liquid.

The computer, dubbed the TSUBAME-KFC, pairs two Intel Xeon E5 processors with NVIDIA Kepler K20-series GPUs in each node.  By submerging the system in the speciality cooling fluid, the researchers improved on the roughly 1.0 gigaflops per watt achieved by TSUBAME 2.0, reaching about 4.5 gigaflops per watt.  That makes it the most energy-efficient supercomputer in the world, topping the Green500 list.

What makes this breakthrough so important is the fact that the power and cooling needs of supercomputers go well beyond those of the average data centre.  Increasing compute output per watt more than fourfold, without significantly increasing power consumption, opens the door to more practical uses of supercomputers in the future.
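
To put those figures in perspective, here is our own back-of-the-envelope arithmetic (not a figure from the announcement): the power needed to sustain one petaflop of computation at each efficiency level works out as

\[
\frac{1\,\text{PFLOPS}}{1.0\,\text{GFLOPS/W}} = 1{,}000\,\text{kW}
\qquad\text{versus}\qquad
\frac{1\,\text{PFLOPS}}{4.5\,\text{GFLOPS/W}} \approx 222\,\text{kW},
\]

which is the difference between needing a dedicated megawatt-class power feed and fitting within the budget of a large server hall.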

The Tokyo Institute of Technology is now touting this research breakthrough as yet another reason why Japan leads the world in supercomputer research.  The TSUBAME 2.5 is poised to play a major role in computer education, advanced computer simulation, further research and development, and competitions designed to develop future computer technology experts from within Japanese schools.

We should expect to see Japan design and build better versions of the TSUBAME in years to come.  As for where the architecture will take us, there is no way to know; however, it is easy to assume that things can only get better from here.

Data Centre Applications


The technology advancements afforded by TSUBAME have great potential for the data centre and collocation market too.  If you think of it in terms of cloud computing and on-demand Internet services, the benefits should be obvious.  A supercomputer capable of far greater computational throughput could increase both data transfer speeds and data crunching capabilities.

Those of us in the business already know that hardware is not keeping pace with software and theoretical advancements. We know there is so much more that could be done if we had equipment fast enough and powerful enough to keep up with the data. That's the promise TSUBAME holds.

Imagine being able to design and build a supercomputer at every data centre.  Imagine the possibilities for hosting large-scale computer systems that would dwarf even the largest mainframes of decades past.  Imagine the potential for high-speed data transfers, entire systems existing in a virtual environment and clouds that would span the entire globe.


There is an old American baseball movie with a famous line that says, “If you build it, they will come.”  We think it applies to supercomputing as well.

Thursday, 21 November 2013

Researchers Break Record for Quantum Computing

Researchers from British Columbia's Simon Fraser University have done the unthinkable.  They have managed to maintain a quantum memory superposition state at room temperature longer than any previous attempt.  What's most impressive is the fact that the record-setting superposition lasted nearly 100 times longer than the previous record of 25 seconds.

So how long did they maintain the superposition?  An astounding 39 minutes!

If this doesn't sound impressive to you, consider the fact that quantum computing usually requires extremely cold temperatures to be successful.  For example, a superposition held near absolute zero (−273°C) would be the norm for maximum stability; however, the Simon Fraser University researchers accomplished their feat at a relatively balmy 25°C.  This is a major breakthrough no matter how you look at it.

If they are able to take their test results and use them to double their time, imagine what they'll be able to do with the next jump... Is 80 minutes out of the question?  There's no doubt that quantum computing is the future of high-speed algorithms and computing functions.  As such, it is also the future of high-speed data communications and IT services.

Quantum computing uses an entirely different paradigm, working with data held in quantum states rather than expressed as binary bits.  The fascinating thing about quantum computing is that a quantum bit, or qubit, can exist in many different states simultaneously.

In theory, a register of qubits can represent an enormous number of values at once, and a quantum algorithm can operate on all of them in a single step.  That is what would allow massive amounts of data to be handled at lightning-fast speeds, at least for the classes of problems quantum computers are suited to.
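
In textbook notation (a standard illustration rather than anything specific to the Simon Fraser experiment), a single qubit is a superposition of the two binary values, and a register of n qubits spans all 2^n combinations at once:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,\qquad |\alpha|^2 + |\beta|^2 = 1,
\qquad\qquad
|\Psi\rangle = \sum_{x=0}^{2^n - 1} c_x\,|x\rangle .
\]

Keeping those amplitudes intact is precisely what 'maintaining a superposition' means, which is why the 39-minute result matters.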

Being able to build a quantum memory that can maintain a superposition for 39 minutes shows, at least in concept, that a future machine could be built to hold its state indefinitely.  It certainly gives new meaning to IT power and cooling design.

Untapped Potential


For those of us involved in the data centre and collocation sector, the untapped potential of quantum computing is virtually limitless.  Just think about how much data we can currently crunch and manipulate using a binary system and state-of-the-art equipment, then imagine multiplying that hundreds of times over with a quantum-based supercomputer.

We've seen it in sci-fi films already: visions of instant communication, transfers of incredibly large volumes of data, futuristic translator devices for intergalactic communication, and near-instant analysis of datasets that a traditional binary system could not touch.  And you know what they say about sci-fi film technology: it's only a matter of time before it becomes reality.


It's true that there are still plenty of hurdles to overcome before quantum computing is the norm.  Nevertheless, what the researchers at Simon Fraser University accomplished is nothing less than phenomenal.  It is also nothing less than historic.  Their accomplishment clearly sets the stage for the first quantum computer with an indefinite shelf life.

Monday, 18 November 2013

American Retailer Considering Jump to Data Centres

Throughout most of the 20th century, the Sears Roebuck Company was one of America's premier retailers, offering consumers everything from clothing to household items to automotive care.  In fact, Sears pioneered many aspects of retail marketing that are now standard practice.  However, by the turn of the century a weak economy saw the struggling company looking at every alternative to keep its doors open, including merging with competing retailer Kmart.

Earlier this year the company went one step further by forming Ubiquity Critical Environments.  The new company's mission is to find ways to transform underperforming retail locations into 21st-century technology enterprises.

Ubiquity's first major proposal is to convert a number of the Sears Auto Center stores into mission-critical data centres serving smaller markets.  It is an idea born out of the realisation that secondary markets tend to be at the mercy of larger markets for data centre and IT services.  The Sears locations offer a number of unique advantages that make them perfectly suited for such an enterprise.

The company originally looked at the possibility of converting underperforming Sears and Kmart stores, but then realised that most of those stores were close enough to established data centres to make the project financially unviable.  The auto centre stores are a different matter.

Sears Holdings operates about 50 standalone auto centres located on the outer edges of mall property.  The stores offer up to 50,000 ft² of floor space and ceilings at least 16 feet high.  This would enable the company to move quickly by building modular units that could easily take advantage of current infrastructure and architecture.  Moreover, because the sites are already approved for industrial or commercial use, Ubiquity could set up the data centres with very little government interference.

As for power and cooling needs, company officials say that should not be a problem either.  Mall properties are typically on the receiving end of high power transmission lines already.  Furthermore, many of the sites include stranded power not being put to use in any other way.

It should be noted that many of the auto centres already have a fairly impressive power capacity due to the installed automotive equipment.  In some cases, very few modifications would be needed to accommodate the power and cooling needs of a data centre.

Should Ubiquity decide to go through with the plan, they will be collaborating with Schneider Electric, a company that is already an established leader in data centre equipment and services.  They are the perfect partner for transforming this portfolio of nearly-ready properties into mission-critical data centres for small markets.


Undoubtedly, the data centre sector in America will be watching closely to see what Ubiquity does here.  The economic crash of 2007/2008 has left in its wake a much smaller retail sector and plenty of under-performing and under-utilised properties.  A successful project from Ubiquity could pave the way for other struggling enterprises to pursue similar projects, and if enough companies got involved, they could drastically change the data centre landscape in the States.

Friday, 15 November 2013

Japanese Wind Turbines Offer Both Hope and Fear

Ever since the 2011 earthquake and tsunami that devastated the Fukushima nuclear plant in north-eastern Japan, the country's nuclear power capacity has been reduced significantly.  The Japanese government is looking at other sources of energy in the hope of abandoning nuclear altogether.  According to the New York Times, one potential source is wind.

The Times says Japan has enough offshore wind to power the entire country, and then some.  Some estimates suggest the country could produce up to eight times more power than it needs under normal circumstances.  Moreover, as a renewable power source, wind could produce that energy without creating a huge carbon footprint.  Unfortunately, it is not as simple or easy as that.

In order to replace just one nuclear reactor, Japan would have to build 140 offshore wind turbines.  Yet the country does not have enough shallow water in which to build traditional turbines anchored to the sea floor.  One solution is to design and build wind turbines that float in the same way oil rigs do.  Floating turbines could be placed virtually anywhere Japan wanted.

If initial tests of three floating turbines are successful, it is likely the government will go ahead with plans to build more.  How many they build remains to be seen; however, with 50 nuclear reactors to replace, you are talking about nearly 7,000 turbines scattered throughout Japanese waters.  It is a great source of renewable energy, but one that could have a negative impact as well.

Japanese Fishing Industry


It's no secret that the Japanese rely heavily on commercial fishing for a significant portion of their economy.  Some fear that thousands of floating turbines would drastically alter fishing habitats, to the point of severely damaging the industry.  No one really knows what impact the floating turbines would have, because nothing on this scale has been attempted before.

The hope is that the turbines would actually help fishing by encouraging colonies of fish and seaweed to accumulate; however, even if they do, the largest fishing trawlers would not be allowed anywhere near the wind farms for obvious reasons.  To those who rely on fishing for their livelihood, there is no reasonable way the fishing industry and large-scale floating wind farms can co-exist.

One last thing to consider is the cost of the floating turbines.  Just one is expected to cost as much as eight times more than a traditional anchored turbine in shallow waters or on land.  Wind farm management costs and commercial development would add to the overall price tag as well.

While the floating turbine idea is an interesting one worth pursuing, there are plenty of hurdles to overcome to make the project viable.  And that's just to provide for the general power needs of the Japanese islands.  It says nothing about using the renewable energy to power future data centres, collocation centres and major manufacturing plants.

If it works, it will be a definite coup for the Japanese government.  On the other hand, it could end up being a colossal failure.  Only time will tell what the outcome will be.

Monday, 11 November 2013

EC Publishes Big Data Fact Sheet in Recent Memo

The European Commission took the opportunity presented by a November 7 memo to release a fact sheet explaining what ‘Big Data’ is.  Although no official reason for the memo has been offered, the language of the fact sheet suggests that the Commission intended to educate the average consumer about Big Data.

The definition of Big Data offered by the Commission is as follows:

“Every minute the world generates 1.7 million billion bytes of data, equivalent to 360,000 standard DVDs.  More digitised data was created in the last two years than in the rest of human history.  This trend and the mountains of data it produces is what we call ‘Big data’.”

The authors of the fact sheet went on to pose the question of why Big Data is important.  Their explanation was that the vast amount of data collected on a daily basis is now being used to improve everything from farming to public health.  The Commission cited efficiency gains of 5-6% in the business sector as sufficient reason to embrace Big Data in every area of life.

An example of how Big Data affects the average consumer can be found in the agricultural sector.  The memo speaks of a farmer whose future operation is plugged into the system to glean insights from the mountains of data collected around the world.  That data would be used to determine what crops to grow, when to plough and sow, and which markets will be most suitable at harvest.  It could even be used for equipment automation.

The Good and The Bad


The European Commission's Big Data memo did a very good job of presenting the topic in a positive light; however, true objectivity requires us to look at both sides of the equation.  Yes, Big Data is capable of improving efficiency and effectiveness on many levels, but it could also be incredibly harmful.

One example cited in the memo explained how Big Data could be used to avert a worldwide epidemic by identifying trends on social networking sites like Facebook.  Seemingly innocuous statements, like someone mentioning that they are in bed with the flu, apparently allow those in the Big Data community to establish where future problems might arise.  On the surface that sounds good... but is it really?

If a government health organisation could put that type of data to use for the benefit of citizens, it could put other types of data to use to everyone's detriment.  Moreover, simply assuming that the government would never do such a thing is to turn a blind eye to history.


Big Data does have many positive implications for everything from IT services to networking to worldwide data communications.  It has many positive implications for humanitarian efforts and education; however, it is something that must be treated with the utmost respect and caution.  Without proper safeguards and effective enforcement, Big Data could be a digital disaster waiting to overwhelm the world.

Friday, 8 November 2013

Floating Data Centres Powered by Waves – is this the way forward?

Finding new ways to harness the earth's natural resources for energy production has long been part of the human landscape.  Over the centuries, we've used the sun, wind and water as a means of making our lives better.  More recently, these natural resources are being put to use for electrical generation in ways most of us would never have imagined.  For example, there is speculation that Google is working on floating data centres to be powered by ocean waves.

Pelamis Wave Power Ltd, makers of a revolutionary electrical generation system driven by ocean waves, recently speculated about Google's possible plan based on a combination of events.  They note that the search engine giant filed a patent for using a Pelamis wave power machine to power a floating data centre, and that the company is also responsible for the construction of a number of mysterious barges off the coasts of Portland and San Francisco.

Although Google has been silent about its links to the barges, the US Coast Guard has publicly stated the barges belong to the Mountain View, California, company. It's quite possible Google is working on plans to design and build data centres that will float offshore and be run entirely on electricity generated from waves.

The Pelamis wave machine consists of long strings of semi-buoyant sections joined to one another by hinges.  As waves move along the machine, the individual sections flex back and forth at the hinged joints, and that movement is converted into electricity by hydraulic rams that pump fluid through motors coupled to generators.
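
For a sense of how much energy is actually on offer, a standard deep-water approximation (a textbook estimate, not a Pelamis or Google figure) puts the power carried by waves, per metre of wave crest, at roughly

\[
P \;\approx\; \frac{\rho g^2}{64\pi}\,H_s^2\,T_e \;\approx\; 0.5\,H_s^2\,T_e \ \text{kW/m},
\]

where \(H_s\) is the significant wave height in metres and \(T_e\) the wave energy period in seconds.  A modest two-metre swell with an eight-second period therefore carries around 16 kW for every metre of wave front.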

Pelamis deployed their pilot offshore system in 2004, becoming the first to connect to the UK grid with a wave-based device.  Since then, they have built and tested five additional systems at various locations.  The second generation of their technology is proving to be very promising for power production.

The Future of Data Centres


The implications of this technology for future data centres are very significant.  To understand why, one need only look at the state of the industry in the UK.  While the UK clearly leads Europe in data centre and IT innovation, finding the land on which to build new data centres, and the green energy to power them, is challenging.  A floating data centre powered by ocean waves solves both problems in one fell swoop.

The main advantage of wave power over wind and solar is the fact that it is nearly constant.  Even on days when the water appears relatively calm at the shore, just a few miles offshore there is enough wave action to generate the electricity a floating data centre would need.  With the right technology, one of the systems could even generate surplus electricity on stormy days.  That surplus could be sent through the grid to supplement current brown energy.


Without a doubt, floating data centres would certainly open entirely new doors for collocation, web hosting and digital communications.  In the highly competitive world of IT, the company that manages to master this first will be at the cutting edge. 

Monday, 4 November 2013

LED Lights Show Promise for High-Speed Data Transmission

There's no doubt that LED lights are the latest craze for everything from household to decorative to automotive lighting.  LEDs are attractive because they are efficient, cost-effective and very powerful; however, there may be a new use for LED lighting on the horizon: high-speed data transmission.

BBC News recently reported on new research being undertaken in a joint project between the universities of Cambridge, Edinburgh, Oxford, St Andrews and Strathclyde.  The project has enabled researchers to achieve data transmission speeds of 10 Gbit per second using specially designed LED lights and a technique known as orthogonal frequency division multiplexing (OFDM).

Researchers explained their technique as being similar to what a showerhead does.  Just as a showerhead splits water into dozens of individual streams, the modulators in the customised LED lights split the red, green and blue parts of the light spectrum into dozens of individual streams, each of which can carry binary data.

By controlling the modulation of each of the light streams, researchers have been able to transfer data to computers with a wireless system they are now calling ‘Li-Fi’.  In theory, the technology should make it possible to increase wireless network capabilities exponentially.  A commercially viable implementation of the technology will most certainly revolutionise data communications.
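
To illustrate the OFDM idea itself, here is a minimal, generic NumPy sketch of how bits are spread across parallel sub-carriers and combined into a single waveform; the parameters are arbitrary and this is not the research team's actual modulator.

```python
# Generic OFDM illustration: map bits onto parallel sub-carriers (the "streams"
# in the showerhead analogy), then combine them into one time-domain signal
# with an inverse FFT. Parameters are arbitrary, not the Li-Fi project's.
import numpy as np

N_SUBCARRIERS = 64      # number of parallel streams
CP_LEN = 16             # cyclic prefix length, guards against echoes

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=2 * N_SUBCARRIERS)   # 2 bits per carrier (QPSK)

# QPSK mapping: each pair of bits becomes one complex symbol on one sub-carrier
symbols = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

# The inverse FFT sums every sub-carrier into one waveform that would modulate the LED
time_signal = np.fft.ifft(symbols)

# Prepend a cyclic prefix so a receiver can tolerate reflections and echoes
ofdm_symbol = np.concatenate([time_signal[-CP_LEN:], time_signal])

# A receiver reverses the steps: strip the prefix, FFT, demap each sub-carrier
received = np.fft.fft(ofdm_symbol[CP_LEN:])
recovered = np.empty_like(bits)
recovered[0::2] = (received.real < 0).astype(int)
recovered[1::2] = (received.imag < 0).astype(int)
assert np.array_equal(bits, recovered)   # loopback check (no noise added)
```

In a real Li-Fi system each colour of the LED would carry its own set of sub-carriers, which is how aggregate rates like the 10 Gbit per second figure can be reached.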

The idea behind Li-Fi goes back a few years to the University of Edinburgh's Prof Harald Haas, one of the current project's most influential leaders.  Since his first demonstration in 2011, other researchers have been able to design and build LED bulbs capable of handling data communications at high rates of speed.

In Germany for example, researchers were able to achieve data transfer rates of up to 1 GBit per second, per light frequency, under optimal conditions.  Chinese researchers have achieved speeds of 150 MBit per second with a single bulb, providing data communications for four computers.

Other Advantages


Besides the obvious increase in speed, Li-Fi offers other advantages as well, amongst which is a huge improvement in local network security.  Because light waves cannot penetrate solid objects, it would be entirely possible to have a secure Li-Fi connection inside a single room that could not be accessed externally.  That capability could be extended throughout an entire office building with only a few modifications.

Another advantage is that light, when properly focused and positioned, does not suffer the same degradation associated with radio signals.  Where traditional Wi-Fi signals may drop out, Li-Fi may remain strong.  The potential here seems obvious in a world where radio-based Wi-Fi can be problematic.

Li-Fi technology promises to change how we think about wireless data communications.  Combined with fibre optics and other superfast technologies now emerging, we could be on the verge of a completely new era of digital communications.


Is it any wonder that the current research is being undertaken in the UK?  Not when you realise the UK leads the way in digital technology developments in Europe.  This project is yet another example of what the UK technology sector has to offer and it's all good.