Thursday, 19 December 2013

IBM Weather Forecasting Technology Helping to Make Wind Farms More Efficient

In 1997, IBM's Deep Blue computer made history by beating world chess champion Garry Kasparov in the second of their two six-game matches. Now, the same deep computing technology that made Deep Blue possible is being put to use to make renewable energy generation more efficient.

Say hello to IBM's Deep Thunder project, an effort that dates back to the mid-1990s and the same deep computing push that produced Deep Blue. The goal of Deep Thunder has always been to provide short-term, local weather forecasting and modelling for customisable local purposes. It is a goal that is now seeing some success when applied to wind farms.

According to news reports, IBM has developed a new technology they have called Hybrid Renewable Energy Forecasting (HyRef).  The technology combines hardware, software and local weather data to predict how much energy a wind farm will generate in a given amount of time and how local weather patterns will affect that generation.

For example, sensors in and around a wind farm will measure things like wind speed, wind direction and air temperature. That data will be combined with information regarding cloud movement, barometric pressure and other meteorological data points to create real-time weather forecasts, allowing better management of wind farms.
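
As a rough illustration of the link between those sensor readings and expected output, here is a minimal Python sketch using the standard wind-power equation. It is not IBM's actual HyRef model; the rotor size, efficiency factor and wind speed below are assumed values for illustration only.

    import math

    def turbine_power_kw(wind_speed_ms, rotor_diameter_m=100.0,
                         air_density=1.225, power_coefficient=0.40):
        # Standard wind-power equation: P = 0.5 * rho * A * v^3 * Cp
        swept_area = math.pi * (rotor_diameter_m / 2.0) ** 2
        watts = 0.5 * air_density * swept_area * wind_speed_ms ** 3 * power_coefficient
        return watts / 1000.0

    # A forecast wind speed of 9 m/s across a hypothetical 50-turbine farm.
    farm_output_mw = 50 * turbine_power_kw(9.0) / 1000.0
    print(f"Indicative farm output: {farm_output_mw:.0f} MW")

A real forecasting system layers far more on top of this - measured power curves, cut-in and cut-out speeds, wake effects and the weather model itself - which is exactly where HyRef's value is claimed to lie.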

If the technology proves to be as successful as IBM has promised, it could theoretically be used for solar applications as well.  The thing to remember is that the success or failure of the system will depend entirely on its accuracy.  If HyRef is inaccurate in producing real-time forecasts, there will be no real benefit to renewable power generation.

As things currently stand, the biggest impediment to generating a significant amount of energy through renewable sources, like wind and sun, lies in the unpredictability of weather events.  We all know how inaccurate a 24-hour weather forecast can be; IBM will have to overcome greater obstacles to accurately produce real-time forecasts capable of making renewable energy sources more efficient.

Hoping for Success


IBM is to be applauded for identifying a problem with renewable energy production and being willing to invest in a solution. Proponents of renewable energy are hoping for its success, for obvious reasons. If HyRef meets expectations, it could be a major player in reaching a worldwide goal of producing 25% of our energy from renewable sources. If the system is not successful, it will at least give researchers more data to work with.

As the world becomes a smaller place through the Internet, high-speed data communications and superfast computers, our need for more energy will only increase.  Computers require a tremendous amount of electricity for both operation and cooling needs.  If we can produce more of that energy through renewable sources, it will be good for everyone involved.


There may be a day when the latest state-of-the-art data centre is completely energy independent thanks to a combined wind and solar system... when that day comes, the HyRef weather technology may be an important component in the proper management of the data centre's systems. 

Tuesday, 17 December 2013

Inmarsat Launches First Ka Spacecraft

In the race to dominate global mobile communications, the company with the most advanced technology usually comes out ahead. It should therefore be no surprise that London-based Inmarsat was thrilled with the recent launch of the first satellite in its new Ka-band project. The satellite was carried into orbit by a rocket launched from Baikonur, Kazakhstan, late last week.

Inmarsat will have to launch at least two more satellites to complete its Ka-band Global Xpress system, a data communications system aimed squarely at mobile customers. The marine and aviation industries will be the biggest winners when the £1 billion system is operational by late 2014 or early 2015. For now, however, the company is concentrating on making sure the Inmarsat-5 F1 satellite launched last week manoeuvres itself into the correct orbit.

The Emerging Ka-Band


At the heart of the Global Xpress system is data and telecommunications equipment that operates on the Ka-band. Inmarsat has invested heavily in the band for two primary reasons, the first being that the L-band has just about reached its capacity. According to company officials, there isn't a lot of space left in the spectrum to increase capacity in a way that would benefit the company's customers.

The second reason for choosing the Ka-band comes by way of its higher frequency. That higher frequency allows for higher bandwidth connections and much faster throughput speeds. Inmarsat estimates it will be able to deliver 50 Mbps download and 5 Mbps upload speeds. However, there is one problem: weather.

The Ka-band spectrum can suffer severe signal degradation in poor weather.  It is a condition known in the industry as ‘rain fade’. However, Inmarsat has been able to design and build a technology solution that allows its satellites to automatically switch between Ka and L-band in the event of bad weather.  The same technology has been adapted for use on ships linked to the system.

A specially designed router integrated into a ship's data communications system automatically chooses between the two bands, based on weather conditions and current throughput speeds.  As long as all of the equipment is functioning as intended, switching between the bands will be seamless from the customer's perspective.  This will ensure maximum uptime and the best speed possible regardless of weather.
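
The decision logic could be as simple as the following sketch (hypothetical Python with an assumed signal-quality threshold; Inmarsat has not published the router's actual algorithm):

    def choose_band(ka_signal_quality_db, l_band_available=True,
                    ka_threshold_db=6.0):
        # Stay on the high-throughput Ka-band while its signal quality holds up;
        # fall back to L-band when rain fade pushes it below the threshold.
        if ka_signal_quality_db >= ka_threshold_db:
            return "Ka-band (high throughput)"
        if l_band_available:
            return "L-band (rain-fade fallback)"
        return "Ka-band (degraded service)"

    print(choose_band(12.0))  # clear sky  -> Ka-band
    print(choose_band(3.5))   # heavy rain -> L-band fallback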

Inmarsat plans to design and build a fourth satellite for backup purposes. It will be available if the company experiences launch failures or problems reaching orbit with the second and third satellites next year. If the fourth satellite is not needed as a backup, it will likely be launched eventually and added to the system to boost performance.


In addition to the marine and aviation sectors, Inmarsat plan to make their communications system available to anyone who wants to participate.  It will allow for all sorts of communications including phone calls, high-speed communications between data centres and remote servers, managed services for mobile applications and so on.  The company hopes to make the Global Xpress system the standard in mobile satellite communications for years to come. 

Monday, 16 December 2013

Green networks go deeper than just the energy savings

Over the last six years, there has been a constant focus on carbon reduction around IT systems, both inside and outside the data centre. Most of that attention has been on power, cooling, servers and storage. 

According to Ian Wilkie, supply chain director at Brand-Rex, the next phase is to look at the network and the supply chain:

Supply Chain

Look at any large company and you will find they have publicly embraced the need for a "green" policy.

One of the ways that many companies can make quick gains around their green credentials is in the supply chain. Not everything, for example, can be green. If you are commissioning data centres you have to use concrete, and every cubic metre of concrete poured releases around 410 kg of CO2e.

Few CIOs know the carbon footprint of each item in their IT infrastructure. Instead, the enterprise focuses on what it can control: its own carbon footprint.

That is changing. Vendors such as Brand-Rex are now talking openly about the carbon footprint of some of their products. Customers can begin to factor the carbon footprint of the products they purchase into their own supply chains. It is a major step forward for any company that wants to understand its ethical and environmental responsibilities, and the first time any major IT vendor has done this.

Examples of green project investments include a hydro-dam project in Turkey, where the power used to purify copper is generated, plus a project to improve transport energy efficiency through the introduction of regenerative braking technology. These projects, along with other carbon reduction measures - such as pulp inserts, reduced manufacturing plant conversion energy and unbleached boxes - helped Brand-Rex lower its products' carbon footprint by over 1,000 tonnes in a year.

Such investments and savings might not be applicable for every customer. What they do give customers, though, is the knowledge that they are buying from a carbon-sensitive company.

Inside the data centre and comms room

Savings in the network start with simple things such as virtualisation, better cable management and choice of cabling.

Before any choice of cable is made, good management and practice is essential. Basics such as proper aisle containment, fitting blanking plates, sealing air leaks around racks and keeping hot and cold air from mixing deliver key gains and should come first.

Switches

A big problem for older data centres and comms rooms - and one that still exists in new builds - is the location and cooling of switches. They have historically been mounted at the back of the rack, which is exactly where most of the heat collects in today's higher-powered racks.
A good solution is to use ducts to channel cold air from the front of the rack and feed it to the input side of the switch, while another duct vents the hot air to the back of the cabinet. This increases switch reliability, prolongs its life and helps to ensure that hot and cold aisles stay properly separated without air short-circuits.

Ports mean power

Every active port on every switch consumes power. On older switches that's a lot of power, because the ports are always on and always transmitting at a level high enough to drive a full 100-metre link – even if the link is only 20 m. Even more power is consumed day in, day out if the port supports Power over Ethernet.

A switch technology refresh can often be justified simply by the power savings that can be achieved across the estate. Newer, more advanced switches contain smart power control. Often referred to as EEE (energy efficient Ethernet), these devices switch off the transmitters in the PHY (physical interface) when not needed, detect when far-end devices like PCs are inactive and instigate sleep-mode. They even measure the length of the link when active – turning down their power draw to match.

The operating systems of these switches have the facility to completely de-activate the PHYs of individual ports so that they consume no power at all.

Even though switch ports individually use only a watt or two, multiply that by 10,000 or so around the estate, each burning power for 8,760 hours a year, and you will understand why - as far back as 2005 - it was estimated that the combined pool of switch ports in the USA devoured 5.3 terawatt-hours (TWh) of energy a year. Imagine what that figure would be were it not for technology like EEE.
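
The arithmetic behind those numbers is straightforward; here is the back-of-the-envelope version for a single estate (the per-port wattage and port count are illustrative assumptions only):

    ports = 10_000          # active switch ports across the estate
    watts_per_port = 1.5    # 'a watt or two' per always-on port (assumed midpoint)
    hours_per_year = 8_760  # 24 hours x 365 days

    kwh_per_year = ports * watts_per_port * hours_per_year / 1_000
    print(f"{kwh_per_year:,.0f} kWh per year")  # ~131,400 kWh for this one estate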

Make space for air

10 Gbit/s cabling is required by the data centre standards and is being installed in an increasing proportion of enterprise projects. But some standard UTP Category 6A cables are bulky and seriously impede both in-rack and under-floor airflow, so cooling fans have to work harder and draw a lot more power.

Shielded Cat6A cables are smaller in diameter, while some designed specifically for the data centre, such as Brand-Rex zone cable, are as thin as Cat5e, freeing up airflow and allowing fans to slow down and use less power.

Fibre or copper

If 10 Gbit/s is the highest speed your network will ever need, then copper is probably the answer. You can lay in Cat6A cables well in advance of need but fit only EEE Gigabit switches. In fact, you can save power by delaying the roll-out of more power-hungry 10 Gbit/s switches until they are absolutely needed.

If fibre is an option for your application, then consider the power consumption for single-mode versus multi-mode versus copper and factor this into your business case. Bear in mind that one pair of single mode fibres will handle 1Gb/s, through 10 and 40 right up to 100Gb/s with no changes to the cabling. Multi-mode fibre goes from needing two fibres for 10Gb/s to eight for 40Gb/s and 20 for 100Gb/s.

Add into your thinking that it is now possible to purchase Zero-Carbon footprint copper cabling – but so far only from Brand-Rex.

Virtualisation

The use of virtualisation is well understood for servers inside modern data centres. What is not done well - and this has been due to a technology lag - is network virtualisation. Replace 1GbE copper with 10GbE copper or fibre, or even 40GbE, and then virtualise the capacity.
This has several benefits:
· Less space required for the cabling.
· An option to increase network capacity per link should traffic demand it.
· Easier creation of redundant links by running two fibres rather than 20 copper cables.
· Longevity, as single-mode fibre supports higher speeds now and will not need to be replaced as switches are replaced.
There are some drawbacks as well:
· Fibre is inherently more difficult to patch or replace onsite, although this can be overcome by training.
· Fibre ports generally draw more power than copper, but this can be offset by the ability to virtualise multiple copper cables.

Conclusion

The network is often overlooked as an opportunity to improve power efficiency and reduce carbon footprint. Yet network vendors such as Brand-Rex are able to deliver carbon savings as part of the supply chain and as part of the equipment in the data centre. With environmental concerns rising up the corporate governance chain, it's time to take a hard look at your network and how it can deliver unexpected benefits.

Guest blog post by Ian Wilkie, Supply Chain Director at Data Centre Networking Specialists Brand-Rex

Friday, 13 December 2013

BT Announces Next Phase of Devon and Somerset Project

Officials in south-west England are thrilled with an announcement from BT that another 31 local communities will be added to the company's fibre network by next spring.  The addition of these communities pushes the total number of homes and businesses in the south-west with access to fibre broadband up to 44,000.

The project is part of an organised effort made possible by a joint venture between BT and local authorities – the project known as “Connecting Devon and Somerset”.  The venture aims to connect more than 90% of the region's homes and businesses to the high-speed optical fibre network by the end of 2016.

Funding for the expanded network is being provided through a £32 million central government grant, with an additional £21 million from the Devon and Somerset councils and a £42 million contribution from BT.  When complete, customers will have access to high-speed Internet at speeds of up to 24 Mbps.

In a prepared statement, minister Ed Vaizey welcomed the BT announcement as good for both local residents and the regional economy.  He noted that the UK is already the European leader in doing online business and that adding such a large area to existing optical fibre networks would boost local economies even further.

Between both public and private high-speed Internet projects, the UK easily leads Europe in providing high-speed Internet access to businesses and residences.  The country is also a leader in developing new Internet and networking technologies that are shaping the course of the digital future. The UK is definitely the place to be for anyone involved in the tech sector in Europe.

Economic Dividends Realised


Connecting Devon and Somerset has gone to great lengths to study the economic impact of high-speed Internet service when developing plans for the project's roll-out. What it has observed suggests the potential for measurable economic dividends for the local economy.

For example, studies show that Internet users who do not enjoy a pleasant experience with a given website are not likely to stay on it long enough to make a purchase or access a service.  Likewise, consumers with slow Internet speeds are less likely to be involved in online commerce.  When high-speed access is available to both consumers and business owners, online commerce is encouraged.  Officials expect to see that exact scenario play out as more communities are added to the fibre network.

From an administrative standpoint, the faster speeds also make implementing IT services and commercial data communications a lot more attractive.  We would expect to see more technology companies courting local businesses that might be willing to purchase a suite of managed services for online business operations.


Connecting Devon and Somerset will continue working to get more communities online even as the current plans are implemented over the next few months.  At the conclusion of the rather ambitious project, the south-west of England should be more than capable of competing in the online marketplace.  Hopefully, their success will be replicated in other regions of the UK.

Tuesday, 10 December 2013

Steady Winds Lead to Record Power Production

Steady winds in late November led to an all-time record for wind-generated power, peaking on the morning of November 29. According to London-based National Grid plc, wind turbines feeding the grid generated a peak of 6,010 MW, allowing gas-fired plants to temporarily drop output. The previous best was recorded on September 14 of this year.

The numbers from November 29 represent approximately 14% of the total amount of electricity supplied to the grid that day, replacing more than 7,800 MW that would have otherwise been generated by gas-powered plants.  No matter how you look at it, 14% is very impressive.  It offers a real incentive to continue pushing for more wind power generation both onshore and off.

To that end, government and private entities are hoping to design and build enough new wind projects to triple capacity by 2020. They have set a goal of supplying 15% of the UK's total power needs from renewable sources between now and then. Plans call for the installation of new turbines capable of generating 18 GW offshore and an additional 13 GW onshore.

As part of those plans, the government recently announced a restructuring in the way energy subsidies are awarded to companies involved in the renewable power sector.  In coming years they plan to funnel more money to offshore projects under the assumption that many existing onshore projects can get by without such large subsidies.  The hope is to encourage more investment in offshore wind projects as quickly as possible.

Moving in the Right Direction


While it is true that renewable energy sources are not yet mature enough to overtake gas, coal and other fossil fuels, things are moving in the right direction.  If proper development and management practices are applied, there is no reason to believe we cannot reach the goals set for 2020.  However, it will require the cooperation of public, commercial and individual efforts to make it happen.

Along the way, it is important that the UK not put all of its investments exclusively into wind power.  We need to continue developing solar power, especially in the area of solar thermal – a strategy already showing incredible results in other parts of the world.  Solar thermal uses the power of the sun to heat water, or other liquid, that can be used to drive turbines that generate electricity.  Solar thermal power can also be utilised to provide space heat.

Also in the pipeline are new technologies involving wave power and biomass electrical generation.  However, neither of these two options is yet ready to be discussed on a large scale.  Research and development needs to continue in order to bring them up to speed.


There may eventually be a day when the UK produces the majority of its electricity from renewable sources.  How long it will take to reach that day is anyone's guess.  For now, we will have to be content with those brief windows when steady winds allow us to break new records for wind power generation. 

Friday, 6 December 2013

Liquid Fuel from CO2 and a Volcano

In an attempt to reduce carbon emissions as much as possible, creative companies around the world are coming up with innovative solutions for alternative fuels.  One of the latest innovations comes by way of an Icelandic company known as Carbon Recycling International (CRI).

CRI's latest project was to design and build a plant adjacent to an existing geothermal power plant in southern Iceland.  That power plant has been generating electricity by using the energy from the region's volcanic landscape.  CRI might have been inspired by local residents who have been using the hot water discharged by the power plant for a health spa since the mid-1970s.

The CRI project aims to do something similar in terms of recycling some of the waste generated by the power plant.  The only difference is that they are not interested in hot water; they are interested in carbon dioxide.  They have figured out a way to convert that carbon dioxide into methanol for use as a liquid fuel.

Methanol is created by reacting carbon dioxide with hydrogen over a copper oxide catalyst, a process that needs a reliable energy source. It is easier to recycle waste CO2 from a geothermal power plant than a fossil fuel plant because the emissions are easier to capture and separate. The Icelandic site is perfect for the process because CRI can harness the local volcanic energy.
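
For reference, the overall chemistry is the well-established hydrogenation of carbon dioxide, with the hydrogen itself produced by electrolysis of water powered by that cheap geothermal electricity:

    CO2 + 3H2 → CH3OH + H2O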

From a carbon footprint standpoint, making methanol widely available on the commercial market would have a measurable impact on greenhouse gas emissions. It is a cleaner fuel that can be used for any number of purposes, including manufacturing. There's only one problem: producing methanol requires extremely cheap electrical power if it is to be profitable.

Iceland is an obvious choice for this type of methanol production because electricity costs so little there. Companies like CRI can purchase electricity for about a third of what it costs across much of Europe. In places where electricity costs more, something else needs to be done to encourage companies to spend the money.

In the meantime, CRI will continue to develop its Icelandic plant by harnessing geothermal energy to create methanol.  Company officials say that, while the plant is not yet profitable, it will be when it is fully operational sometime next year.  Right now, the company is still ironing out whatever minor problems remain.

Where to Go from Here


CRI is to be congratulated for its efforts in Iceland. However, the question of where to go from here remains. Despite the fact that the company is planning to build additional plants around Europe, it's not likely large-scale methanol production will be commercially viable in the near future. There just isn't enough cheap electrical power available to make it viable.


Right now, the focus needs to be on better management of current renewable power sources in order to achieve maximum benefits from them.  Until such time as renewable energy can compete with fossil fuels, we must approach alternative fuel sources as a supplement rather than a replacement. 

Tuesday, 3 December 2013

Prepare for a Record-Breaking Success

The launch of video game Grand Theft Auto 5 was undoubtedly a runaway success, racking up more than $1 billion in sales in its first three days and setting six Guinness world records in the process. However, simple server monitoring problems have left the game's creator, Rockstar North, in a vulnerable position.

The company has admitted that it didn't anticipate such popularity, and it is working “around the clock” to buy and add more servers. With the prospect of “servers crashing under demand,” Rockstar North is typical of many businesses that may have overlooked the double-edged sword of success - as demand for data increases, so does energy consumption within the data centre. In this instance the company was not prepared for the unexpected scale of its popularity, and now its reputation hangs in the balance.

Given that demand appears to be the culprit here, it is worth remembering that energy is still a scarce resource, and accurate measurement of consumption (which tracks closely with CPU utilisation) could have helped Rockstar North to identify its servers' true activity and efficiency. Higher CPU utilisation reflects heavier processing or an increasing workload, which in turn requires more energy.

An easy solution

Monitoring and managing the energy use of individual servers or groups of servers is easily undertaken with today's infrastructure technology, so the solution to this type of problem already exists. More education and awareness are needed to ensure that businesses take full advantage.

Top of the range intelligent energy metering PDUs (power distribution units) and environmental sensors sit behind the rack and can actively monitor the data centre environment. Continually looking for threats from electrical circuit overloads, and any conditions which might place critical IT computing loads at risk, the PDU is a small but vital part of the energy supply chain. This technology enables data centre and facilities managers to make informed capacity planning decisions, improve uptime, measure PUE (power usage effectiveness) and support green initiatives. 
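
As a simple example of the kind of metric this metering enables, PUE is just total facility power divided by the power delivered to IT equipment. The readings below are made-up figures of the sort a metered PDU and a facility meter would report:

    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        # Power usage effectiveness: 1.0 is perfect; higher means more energy
        # is going to cooling, power conversion and other overheads.
        return total_facility_kw / it_load_kw

    print(f"PUE: {pue(total_facility_kw=850.0, it_load_kw=500.0):.2f}")  # 1.70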

Organisations, from the gaming industry and beyond, need to change the way they plan and measure resources. Rather than sitting back and reacting when the worst happens, organisations should continuously monitor resources and take a proactive approach by seeking out threats before they ever happen. They need to plan for the threat of extreme success or it could be game over.

Guest Blog by Eddie Desouza, Global Marketing & Communications Director, Enlogic


Monday, 2 December 2013

Arctic Fibre to Link London & Tokyo

A Canadian technology company is betting there is more to the Arctic Circle than just cold and snow. It is planning to take advantage of a short polar route to link London and Tokyo with an optical fibre network that should be up and running sometime in 2016. Arctic Fibre plans to start construction on the 24 Tbit/s network next spring.

The plan has the network travelling through the Northwest Passage as it links London with Tokyo. The company will spend US$620 million to build the network, including a number of 100 Gbit/s spurs that will provide high-speed Internet access to remote parts of Alaska. The main emphasis of the network, however, is to decrease latency and so speed up financial transactions between Europe and Asia.

In a world where milliseconds can mean the difference between financial loss and gain, the lower latency provided by the shorter and more direct route will have investors lining up to use the network when it's complete.  Arctic Fibre is expecting big-name stock traders to be willing to pay whatever cost necessary for access to the network.
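
The physics behind the advantage is simple: light in optical fibre travels at roughly two-thirds of its speed in a vacuum, or about 4.9 microseconds per kilometre, so every kilometre shaved off the route is latency saved. A minimal sketch follows; the route lengths are illustrative assumptions, not Arctic Fibre's published figures.

    MICROSECONDS_PER_KM = 4.9  # approx. one-way delay per km of glass fibre

    def one_way_latency_ms(route_km: float) -> float:
        return route_km * MICROSECONDS_PER_KM / 1_000

    print(f"~15,600 km polar route:        {one_way_latency_ms(15_600):.0f} ms one way")
    print(f"~24,000 km conventional route: {one_way_latency_ms(24_000):.0f} ms one way")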

In addition to its business benefits, the new network will also have some strategic benefits for the US Defence Department.  Network communications between the department and strategic positions in Alaska are somewhat limited due to the inherent weaknesses of satellite technology.  In latitudes above 70°, satellite communications are too slow and undependable for defence purposes.

The installation of fibre optics will change the game by allowing for consistent networking at a much higher rate of speed.  The US is especially interested in the service that the network will provide to Eareckson Air Station.

Eareckson is a US Air Force base and home to an advanced radar system used to detect both space debris and potential missile launches originating from China or Russia.  Having high-speed data communications through the network makes the base that much more effective for both Canadian and US defences.

As far as the spurs are concerned, there are plenty of Alaskan locations with no Internet communications to speak of.  The network will bring access to places like Barrow, Prudhoe Bay, Wainwright, Kotzebue, and Nome.  The Aleutians are also part of the plans.

Pushing the Limits


The harsh Arctic environment is the last place many people would think to install a high-speed fibre-optic network. However, in order to overcome the limits of satellite technology, there is a need to push the limits of what is currently possible with terrestrial networks. It's not enough to rely on relatively slow satellite communications that can be dependent on weather and other conditions.


A successful installation and deployment could set the stage for similar networks in other remote parts of the world.  It appears to be just the next step in eventually linking the entire world, even in the most out-of-the-way places.  At any rate, the network is an ambitious project that will certainly yield benefits for generations to come in Europe, North America and Asia. 

Thursday, 28 November 2013

IBM Testing Disaster Intervention Cloud Solution

IBM and Marist College have announced a joint project – now in the testing stage – that could change the way data centres prepare for impending natural disasters.  The solution involves using a cloud-computing environment to re-provision data centres in minutes, using only a mobile device and a wireless connection.

When a natural disaster is imminent, data centre operators immediately set about the process of re-provisioning.  Simply put, re-provisioning moves data and online applications to another server not in harm's way.  Unfortunately, the process can take days to complete using technology currently available.  Often, data centres do not have that much advance warning.

Superstorm Sandy a Good Example


Last year's Superstorm Sandy is just one example of what can happen when there isn't enough time to re-provision. As the storm bore down on the north-eastern United States, it destroyed communications networks and put a number of data centres literally underwater. A year later, some of those data centres still haven't recovered. In the days and weeks after the storm, there were still millions of businesses and individual consumers in the North-east without access to network data communications.

IBM's new system could be a significant game-changer should the test at Marist College prove successful.  The system takes advantage of what is known as software-defined networking (SDN) in a cloud-computing environment.  The software allows for more efficient management of data in both physical and virtual networks, while also allowing immediate changes to network resources, even from remote locations.

Disaster Intervention


The system, in theory, changes the game from disaster prevention to disaster intervention.  As soon as data centre operators know a potentially damaging storm is imminent, they could begin re-provisioning right away.  Just bring up the software on any mobile device and they are off and running.  Systems engineers don't even need to be present to get the job done.  What's more, there is no interruption in service for the millions of customers who might be using a series of data centres in the storm's path.

In a world becoming increasingly more dependent on virtualisation and cloud computing, the system cannot be ready soon enough.  IBM is working to make the software commercially available sometime next year.  It is likely to become an instant success should it deliver what IBM is promising.

For the record, the SDN lab at Marist College is sponsored by IBM and also provides testing facilities for a number of other important technologies. The lab is heavily involved in helping IBM develop cloud-computing technologies by providing an environment for testing real-time scenarios. IBM has used the lab to develop an open-source SDN controller, to build software that can predict and prevent network traffic congestion, and to better control video streaming within the cloud.


The success of the lab suggests that the test of IBM's new disaster intervention system will go well. We are looking forward to the commercial release of the software next year. Once it becomes mainstream, it could hopefully mean the end of service interruptions as a result of natural disasters and other unforeseen events...

Monday, 25 November 2013

Japanese Supercomputer Enjoys Boost from Cooling Fluid

The TSUBAME supercomputer from Japan's Global Scientific Information and Computing Centre has long been one of the most powerful supercomputers in the world.  What's more, the research and development conducted at the centre has enabled Japan to be very influential in multiple areas, ranging from technical education to computer-based enterprise.  So what are the chances of TSUBAME being improved upon?

Pretty good, especially if you run a company specialising in cooling fluids for data centres. Such is the case for Green Revolution Cooling, a US-based technology company that improved the performance of TSUBAME by submerging the system's many CPUs and GPUs in the firm's revolutionary cooling liquid.

The computer, dubbed the TSUBAME-KFC, uses Intel Xeon E5 processors and K20 Kepler GPUs from NVIDIA. By submerging the system in the speciality cooling fluid, the researchers were able to improve on the roughly 1.0 gigaflops per watt delivered by TSUBAME 2.0, reaching around 4.5 gigaflops per watt. That makes it the most energy-efficient of all supercomputers.

What makes this breakthrough so important is the fact that the power and cooling needs of supercomputers go well beyond those of the average data centre. Improving performance per watt more than fourfold, without significantly increasing power consumption, opens the door to more practical uses of supercomputers in the future.

The Tokyo Institute of Technology is now touting this research breakthrough as yet another reason why Japan leads the world in supercomputer research.  The TSUBAME 2.5 is poised to play a major role in computer education, advanced computer simulation, further research & development and competitions designed to breed future computer technology experts from within Japanese schools.

We should expect to see Japan design and build better versions of the TSUBAME in years to come. As for where the architecture will take us, there is no way to know; however, it is easy to assume that things can only get better from here.

Data Centre Applications


The technology advancements afforded by TSUBAME have great potential for the data centre and collocation market too.  If you think of it in terms of cloud computing and on-demand Internet, those benefits should be obvious.  A supercomputer capable of exponentially greater computations could increase both data transfer speeds and data crunching capabilities.

Those of us in the business already know that hardware is not keeping pace with software and theoretical advancements. We know there is so much more that could be done if we had equipment fast enough and powerful enough to keep up with the data. That's the promise TSUBAME holds.

Imagine being able to design and build a supercomputer at every data centre.  Imagine the possibilities for hosting large-scale computer systems that would dwarf even the largest mainframes of decades past.  Imagine the potential for high-speed data transfers, entire systems existing in a virtual environment and clouds that would span the entire globe.


There is an old American baseball movie with a famous line that says, “If you build it, they will come.”  We think it applies to supercomputing as well.

Thursday, 21 November 2013

Researchers Break Record for Quantum Computing

Researchers from British Columbia's Simon Fraser University have done the unthinkable.  They have managed to maintain a quantum memory superposition state, at room temperature, longer than any previous attempts. What's most impressive is the fact that the record-setting superposition lasted nearly 100 times longer than the previous record of 25 seconds.

So how long did they maintain the superposition?  An astounding 39 minutes!

If this doesn't sound impressive to you, consider the fact that quantum computing usually requires extremely cold temperatures to be successful. For example, a superposition held near absolute zero (-273°C) would be the norm for maximum efficiency. However, the Simon Fraser University researchers accomplished their feat at a relatively balmy 25°C. This is a major breakthrough no matter how you look at it.

If they are able to take their test results and use them to double their time, imagine what they'll be able to do with the next jump... Is 80 minutes out of the question?  There's no doubt that quantum computing is the future of high-speed algorithms and computing functions.  As such, it is also the future of high-speed data communications and IT services.

Quantum computing uses an entirely different paradigm that works with data contained in various quantum states rather than expressed as binary data.  The fascinating thing about quantum computing is that data can exist in many different states simultaneously.

In theory, quantum computing allows data to be represented and crunched in any number of states, depending on the devices that will use and manipulate the data.  Different devices could use different states of data without disturbing the others.  Moreover, quantum states allow massive amounts of data to be handled at lightning fast speeds.
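
In standard quantum-computing notation (general background, not specific to the Simon Fraser experiment), a single qubit in superposition is written as:

    |ψ⟩ = α|0⟩ + β|1⟩,   with |α|^2 + |β|^2 = 1

A register of n qubits spans 2^n such amplitudes at once, which is where the 'many states simultaneously' advantage over binary storage comes from - provided the superposition can be kept alive long enough to compute with, which is exactly what the 39-minute result speaks to.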

Maintaining a quantum superposition for 39 minutes shows, at least in concept, that a future machine could be built to hold its quantum state more or less indefinitely. It certainly gives new meaning to IT power and cooling design.

Untapped Potential


For those of us involved in the data centre and collocation sector, the untapped potential of quantum computing is virtually limitless.  Just think about how much data we can currently crunch and manipulate using a binary system and state-of-the-art equipment then imagine multiplying that hundreds of times over with a quantum-based supercomputer.

We've seen it in sci-fi movies already.  We've seen imaginations of instant communications and transfers of incredibly large chunks of data, futuristic translator devices for intergalactic communications, and incredibly fast analysis of large chunks of data that could not be touched by a traditional binary system.  And you know what they say about sci-fi film technology: it's only a matter of time before it becomes reality.


It's true that there are still plenty of hurdles to overcome before quantum computing is the norm.  Nevertheless, what the researchers at Simon Fraser University accomplished is nothing less than phenomenal.  It is also nothing less than historic.  Their accomplishment clearly sets the stage for the first quantum computer with an indefinite shelf life.

Monday, 18 November 2013

American Retailer Considering Jump to Data Centres

Throughout most of the 20th century, the Sears Roebuck Company was one of America's premier retailers, offering consumers everything from clothing to household items to automotive care. In fact, Sears pioneered many aspects of retail marketing that are standard practice today. However, by the turn of the century a weak economy saw the struggling company looking at every alternative to keep its doors open, including merging with competing retailer Kmart.

Earlier this year they went one step further by forming Ubiquity Critical Environments.  The new company's mission is to find ways to transform underperforming retail locations into 21st century technology enterprises.

Ubiquity's first major proposal is to convert a number of the Sears Auto Center stores into mission-critical data centres serving smaller markets.  It is an idea born out of the realisation that secondary markets tend to be at the mercy of larger markets for data centre and IT services.  The Sears locations offer a number of unique advantages that make them perfectly suited for such an enterprise.

The company originally looked at the possibility of converting underperforming Sears and Kmart stores, but then realised that most of those stores were close enough to established data centres to make the project financially unviable.  The auto centre stores are a different matter.

Sears Holdings operates about 50 standalone auto centres located on the outer edges of mall property.  The stores offer up to 50,000 ft² of floor space and ceilings at least 16 feet high.  This would enable the company to move quickly by building modular units that could easily take advantage of current infrastructure and architecture.  Moreover, because the sites are already approved for industrial or commercial use, Ubiquity could set up the data centres with very little government interference.

As for power and cooling needs, company officials say that should not be a problem either.  Mall properties are typically on the receiving end of high power transmission lines already.  Furthermore, many of the sites include stranded power not being put to use in any other way.

It should be noted that many of the auto centres already have a fairly impressive power capacity due to the installed automotive equipment.  In some cases, very few modifications would be needed to accommodate the power and cooling needs of a data centre.

Should Ubiquity decide to go through with the plan, they will be collaborating with Schneider Electric; a company that is already an established leader in data centre equipment and services.  They are the perfect partner for transforming this portfolio of nearly-ready properties into mission-critical data centres for small markets.


Undoubtedly, the data centre sector in America will be watching closely what Ubiquity does here. The economic crash of 2007/2008 has left in its wake a much smaller retail sector and plenty of under-performing and under-utilised properties. A successful project from Ubiquity could pave the way for other struggling enterprises to pursue similar projects, and if enough companies got involved, they would drastically change the data centre landscape in the States.

Friday, 15 November 2013

Japanese Wind Turbines Offer Both Hope and Fear

Ever since the 2011 earthquake and tsunami that devastated the Fukushima nuclear plant in north-eastern Japan, the country's nuclear power capacity has been reduced significantly. The Japanese government is looking at other sources of energy in hopes of abandoning nuclear altogether. According to the New York Times, one potential source is wind.

The Times says Japan has enough wind offshore to power their entire country, and then some.  Some estimates suggest that they could produce up to eight times more power than they need under normal circumstances.  Moreover, as a renewable power source, the wind could produce that energy without creating a huge carbon footprint.  Unfortunately, it is not as simple or easy as that.

In order to replace just one nuclear reactor, Japan would have to build 140 offshore wind turbines. Yet the country does not have enough shallow water in which to build traditional turbines anchored to the sea floor. One solution is to design and build wind turbines that float the same way oil rigs do. Floating turbines could be placed virtually anywhere they are needed.

If initial tests of three floating turbines are successful, it is likely the government will go ahead with plans to build more. How many they build remains to be seen. However, with 50 nuclear reactors to replace, you are talking about nearly 7,000 turbines scattered throughout Japanese waters. It is a great source of renewable energy, but one that could have a negative impact as well.

Japanese Fishing Industry


It's no secret that the Japanese rely heavily on commercial fishing for a significant portion of their economy.  Some fear that thousands of floating turbines would drastically alter fishing habitats to a point of severely damaging the industry.  No one really knows the impact that these floating turbines would have because it has never been done before.

The hope is that the turbines would actually help fishing by encouraging colonies of fish and seaweed to accumulate. However, even if they do, the largest fishing trawlers would not be allowed anywhere near the wind farms for obvious reasons. To those who rely on fishing for their livelihood, there is no reasonable way the fishing industry and large-scale floating wind farms can co-exist.

One last thing to consider is the cost of the floating turbines.  Just one is expected to cost as much as eight times more than a traditional anchored turbine in shallow waters or on land.  Wind farm management costs and commercial development would add to the overall price tag as well.

While the floating turbine idea is an interesting one worth pursuing, there are plenty of hurdles to overcome to make the project viable.  And that's just to provide for the general power needs for the Japanese islands.  It says nothing about using the renewable energy to power future data centres, collocation centres and major manufacturing plants.

If it works, it will be a definite coup for the Japanese government.  On the other hand, it could end up being a colossal failure....  time will tell what the outcome will be.

Monday, 11 November 2013

EC Publishes Big Data Fact Sheet in Recent Memo

The European Commission took the opportunity presented by a November 7 memo to release a fact sheet explaining what ‘Big Data’ is. Although no official reason has been given for why the memo was necessary, the language of the fact sheet suggests that the Commission intended to educate the average consumer about Big Data.

The definition of Big Data offered by the Commission is as follows:

“Every minute the world generates 1.7 million billion bytes of data, equivalent to 360,000 standard DVDs.  More digitised data was created in the last two years than in the rest of human history.  This trend and the mountains of data it produces is what we call ‘Big data’."

The authors of the fact sheet went on to pose the question of why Big Data is important.  Their explanation was one of how the vast amount of data collected on a daily basis is now being used to improve everything from farming to public health.  The Commission cited the 5%-6% efficiency gains in the business sector as sufficient reason to embrace Big Data in every area of life.

An example of how Big Data affects the average consumer can be found in the agricultural sector.  The memo speaks of a farmer whose future operation is plugged into the system in order to glean from the mountains of data collected from around the world.  It would be used to determine what crops to grow, when to plough and sow, and what markets will be most suitable at harvest.  It can even be used for equipment automation.

The Good and The Bad


The European Commission's Big Data memo did a very good job of presenting the topic in a positive light. However, true objectivity requires us to look at both sides of the equation. Yes, Big Data is capable of improving efficiency and effectiveness on many levels, but it could also be incredibly harmful.

One example cited in the memo explained how Big Data could be used to avert a worldwide epidemic by identifying trends on social networking sites like Facebook. Seemingly innocuous statements, like someone mentioning that they are in bed with the flu, apparently allow those in the Big Data community to establish where future problems might arise. On the surface that sounds good... but is it really?

If a government health organisation could put that type of data to use for the benefit of citizens, it could put other types of data to use to everyone's detriment.  Moreover, simply assuming that the government would never do such a thing is to turn a blind eye to history.


Big Data does have many positive implications for everything from IT services to networking to worldwide data communications. It has many positive implications for humanitarian efforts and education. However, it is something that must be treated with the utmost respect and caution. Without proper safeguards and effective enforcement, Big Data could be a digital disaster waiting to overwhelm the world.

Friday, 8 November 2013

Floating Data Centres Powered by Waves – is this the way forward?

Finding new ways to harness the earth's natural resources for energy production has long been part of the human landscape.  Over the centuries, we've used the sun, wind and water as a means of making our lives better.  More recently, these natural resources are being put to use for electrical generation in ways most of us would never have imagined.  For example, there is speculation that Google is working on floating data centres to be powered by ocean waves.

Pelamis Wave Power Ltd, makers of a revolutionary electrical generation system using ocean waves, recently speculated about Google's possible plan based on a combination of events.  They say the search engine giant filed a patent for using a Pelamis wave power machine to power a floating data centre.  They say the company is also responsible for the construction of a number of mysterious barges off the coasts of Portland and San Francisco.

Although Google has been silent about its links to the barges, the US Coast Guard has publicly stated the barges belong to the Mountain View, California, company. It's quite possible Google is working on plans to design and build data centres that will float offshore and be run entirely on electricity generated from waves.

The Pelamis wave machine consists of long strings of semi-buoyant sections joined to one another by hinges. As waves move through the water, the individual sections move back and forth on the hinges. The energy of these movements is converted into electricity using highly sensitive, state-of-the-art equipment.

Pelamis deployed their pilot offshore system in 2004, becoming the first to connect to the UK grid with a wave-based device.  Since then, they have built and tested five additional systems at various locations.  The second generation of their technology is proving to be very promising for power production.

The Future of Data Centres


The implications of this technology for future data centres are very significant.  To understand why, one need only look at the state of the industry in the UK.  While the UK clearly leads Europe in data centre and IT innovation, finding the property to build and the green energy to power new data centres is challenging.  A floating data centre powered by ocean waves solves both problems in one fell swoop.

The main advantage of wave power over wind and solar is the fact that it is nearly constant.  Even on days when the water appears relatively calm at the shore, just a few miles offshore there is enough wave action to generate the electricity a floating data centre would need.  With the right technology, one of the systems could even generate surplus electricity on stormy days.  That surplus could be sent through the grid to supplement current brown energy.


Without a doubt, floating data centres would certainly open entirely new doors for collocation, web hosting and digital communications.  In the highly competitive world of IT, the company that manages to master this first will be at the cutting edge. 

Monday, 4 November 2013

LED Lights Show Promise for High-Speed Data Transmission

There's no doubt that LED lights are the latest craze for everything from household to decorative to automotive lighting. LED lights are very attractive because they are efficient, cost-effective and very powerful. However, there may be a new use for LED lighting on the horizon: high-speed data transmission.

BBC News recently reported on new research being undertaken in a joint project between the universities of Cambridge, Edinburgh, Oxford, St. Andrews and Strathclyde.  The project has enabled researchers to achieve data transmission speeds of 10 Gbit/s using specially designed LED lights capable of utilising what is known as orthogonal frequency division multiplexing (OFDM).

Researchers explained their technique as being similar to what a showerhead does.  Just as a showerhead splits water into dozens of individual streams, the modulators in the customised LED lights split the blue, green and red light spectrums into dozens of individualised streams that can carry binary data.

By controlling the modulation of each of the light streams, researchers have been able to transfer data to computers with a wireless system they are now calling ‘Li-Fi’.  In theory, the technology should make it possible to increase wireless network capabilities exponentially.  A commercially viable implementation of the technology will most certainly revolutionise data communications.
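
The multiplexing idea itself is easy to demonstrate. The minimal Python/NumPy sketch below modulates random bits onto 64 parallel subcarriers with an inverse FFT and recovers them with a forward FFT; it is purely illustrative and ignores the optical details a real Li-Fi link needs, such as keeping the LED drive signal real-valued and non-negative.

    import numpy as np

    rng = np.random.default_rng(0)
    n_subcarriers = 64                            # the parallel 'streams'
    bits = rng.integers(0, 2, size=2 * n_subcarriers)

    # Map bit pairs to QPSK symbols, one symbol per subcarrier.
    symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

    # OFDM modulation: the inverse FFT combines all subcarriers into one
    # time-domain waveform that would drive the LED's intensity modulator.
    waveform = np.fft.ifft(symbols)

    # OFDM demodulation: the receiver's FFT separates the subcarriers again.
    recovered = np.fft.fft(waveform)

    rx_bits = np.empty_like(bits)
    rx_bits[0::2] = (recovered.real < 0).astype(int)
    rx_bits[1::2] = (recovered.imag < 0).astype(int)
    assert np.array_equal(bits, rx_bits)          # all bits recovered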

The idea behind Li-Fi goes back a few years to the University of Edinburgh's Prof Harald Haas, one of the current project's most influential leaders.  Since his first demonstration in 2011, other researchers have been able to design and build LED bulbs capable of handling data communications at high rates of speed.

In Germany, for example, researchers were able to achieve data transfer rates of up to 1 Gbit/s per light frequency under optimal conditions.  Chinese researchers have achieved speeds of 150 Mbit/s with a single bulb, providing data communications for four computers.

Other Advantages


Besides the obvious increase in speed, Li-Fi offers other advantages as well; amongst which is a huge improvement in local network security.  Because light waves cannot penetrate solid objects, it would be entirely possible to have a secure Li-Fi connection inside a single room that could not be accessed externally.  That capability could be extended throughout the entire office building with only a few modifications.

Another advantage is that light, when properly spaced and focused, does not suffer from the same degradation that affects radio signals. Where traditional Wi-Fi signals may drop out, Li-Fi may remain strong. The potential here seems obvious in a world where radio-based Wi-Fi can be problematic.

Li-Fi technology promises to change how we think about wireless data communications.  Combined with fibre optics and other superfast technologies now emerging, we could be on the verge of a completely new era of digital communications.


Is it any wonder that the current research is being undertaken in the UK?  Not when you realise the UK leads the way in digital technology developments in Europe.  This project is yet another example of what the UK technology sector has to offer and it's all good.