Thursday, 31 October 2013

Samsung’s New African Digital Villages

Samsung Electronics has announced an ambitious plan to connect remote villages on the African continent with resources in larger cities by way of new digital villages.  The company plans to use solar power as the main source of electricity for each village.  According to company officials, the project is aimed at improving standards of living through better healthcare delivery, education and economic independence.

A ceremony to mark the official launch of the project was hosted recently in Johannesburg, South Africa.  Samsung was joined by representatives from various African countries and the UN, along with a number of social welfare groups and international organisations working on the African continent.

What It Looks Like


A typical Samsung digital village will start with a facility designed to generate electricity from solar power.  That electricity will then be used to power the rest of the facilities in the village, which will include the following:

  • Medical Centre – The tele-medical centre will use the Internet to link patients in remote villages with doctors at major city hospitals.  The centre will utilise the latest networking and conferencing technologies, including video conferencing.

  • Portable Health Centre – To enable greater access to healthcare delivery, the village will include vehicles that will take the medical centre resources out to those who need them.  These vehicles will also be solar powered.

  • Internet School – Samsung will take advantage of the Internet to provide online schooling, with curricula and programmes tailored to each culture and geographic area.  Curriculum materials will all be stored at a central data centre and accessed by students during the day.

  • Solar Lanterns – Solar powered lanterns will be used to provide lighting inside the digital village as well as in homes, businesses and offices in the general vicinity.  Samsung believes its lighting systems can provide adequate lighting for as long as ten years before needing replacement.

Building the entire infrastructure necessary to sustain this digital village is quite an undertaking. If Samsung can demonstrate a successful model in the first village they build, there's no telling how many more will be constructed across the African continent.

Plans are already in place to build digital villages in South Africa, Ethiopia and Gabon before the year is out.  Samsung is cooperating with local governments and international organisations, including world health agencies, universities, relief organisations and UNESCO, to make it happen.

Additional Potential


Samsung's announcement is certainly good news for underdeveloped countries in Africa.  The project has the potential to make a real difference in the lives of those who use the villages.  Success, however, would also yield potential benefits for the IT sector.


If Samsung technology can do what it promises, it could pave the way for harnessing solar power for IT in the future.  One hundred percent solar powered data centres and colocation facilities have been a dream for a long time.  Maybe Samsung has what it takes to make that dream a reality.

Monday, 28 October 2013

UK Approves New Somerset Nuclear Power Facility

In a move destined to be greeted by both cheers and jeers, the UK government recently approved construction of a new nuclear power facility in Somerset, England.  The Hinkley Point C power plant will be built by a consortium consisting of France's EDF Energy and a number of Chinese corporations and investors.  Once the plant is up and running, it is expected to provide cheap and clean energy for 60 years.

Those in favour of the project claim that, despite government subsidies, the average UK customer will pay up to £77 less per year for energy by 2030.  They say that without additional nuclear power facilities, the cost of energy produced by traditional fossil fuels will rise too far, too fast for the average UK consumer.

Critics of the project do not necessarily have a problem with nuclear power itself, but they do object to the government subsidies.  The subsidies come by way of the government guaranteeing a price of roughly twice the current wholesale value for every megawatt hour of energy produced at the Hinkley Point plant.  According to the BBC, the strike price reached by the two sides is £92.50 per megawatt hour.

The government says that price will fall to £89.50 if the consortium follows through on plans to build a second plant in Suffolk.  It says that once both plants are operational, costs to the consumer will gradually decrease over time.  By contrast, continuing to focus on fossil fuels and more expensive sustainable power generation will only cause retail prices to keep rising.
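To make the subsidy mechanics concrete, here is a minimal sketch of how a strike-price top-up works.  Only the £92.50 and £89.50 strike prices come from the reporting above; the wholesale price and annual output figures are illustrative assumptions.

```python
# Rough sketch of a strike-price ("contract for difference") top-up.
# Only the 92.50 and 89.50 strike prices come from the article; the
# wholesale price and output figures below are illustrative assumptions.

STRIKE_PRICE_ONE_PLANT = 92.50   # GBP/MWh, agreed for Hinkley Point C alone
STRIKE_PRICE_TWO_PLANTS = 89.50  # GBP/MWh, if a second plant is also built

def annual_topup(strike_gbp_mwh, wholesale_gbp_mwh, output_twh):
    """Subsidy paid on top of the market price for a year's output."""
    topup_per_mwh = max(strike_gbp_mwh - wholesale_gbp_mwh, 0.0)
    return topup_per_mwh * output_twh * 1_000_000  # TWh -> MWh

# Assumed wholesale price of ~GBP 46/MWh (about half the strike price)
# and ~25 TWh/year of output for a plant of this size.
subsidy = annual_topup(STRIKE_PRICE_ONE_PLANT, 46.0, 25.0)
print(f"Illustrative annual top-up: £{subsidy / 1e9:.1f}bn")
```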

Commercial Implications


While much of the talk regarding the new power plant has dealt with retail prices for average consumers, little has been said about the commercial impact of the deal, most notably on the ever-expanding IT sector.  The UK leads Europe in nearly every IT category, including data centre construction, colocation, high-speed Internet and fibre optic infrastructure.  A lot more energy will be needed to continue development at its present pace.

Establishing two brand-new nuclear facilities is key to meeting the energy demands of the IT sector.  Nuclear power has a decided advantage over sustainable energy in that it does not rely on environmental factors to work properly.  And of course, it does not pollute like traditional fossil fuels.  It is the best power option for data centres, colocation facilities and web hosting companies.

While it may be extremely costly to build the nuclear power plants, some consider them vital if the UK is going to meet the power demands of the future.  As much as we all encourage sustainable energy production, it's simply not going to be enough on its own to keep up with increasing energy demands.  The Internet age demands a more robust and powerful solution; a solution offered by nuclear power.


In a world increasingly dependent on cloud computing, virtualisation and on-demand IT services, we cannot really afford to do without nuclear power.  The goal now should be making it as safe and cost-efficient as possible to build new plants.

Thursday, 24 October 2013

Solar Walkway Debuts in Washington

Students and researchers at George Washington University (GWU) in Washington have announced the completion of what is billed as the first solar powered walkway.  A recent exhibition demonstrated a system comprising a solar powered trellis and a walkway made up of 27 slip-resistant panels.  The semi-transparent panels include embedded photovoltaic components that allow them to capture energy from sunlight.

The team responsible for the design and build phases of the project created the panels with a combined peak capacity of 400W.  That energy is used to power a system of LED lights that illuminate the walkway from below.  The system ostensibly eliminates the need for some of the overhead lighting currently used for walkway illumination.
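As a rough, back-of-the-envelope check on what 400W of peak capacity can actually light, consider the sketch below.  The sun-hours, fixture wattage and run time are assumptions for illustration; only the 400W figure comes from the project.

```python
# Back-of-the-envelope yield for the walkway's stated 400 W peak capacity.
# Sun-hours, LED wattage and run time are illustrative assumptions, not
# figures from the project.

PEAK_CAPACITY_W = 400          # stated combined peak output of the 27 panels
EQUIV_SUN_HOURS_PER_DAY = 4.0  # assumed full-sun-equivalent hours
LED_FIXTURE_W = 5.0            # assumed draw of one LED walkway fixture
HOURS_OF_LIGHT = 10            # assumed overnight run time

daily_wh = PEAK_CAPACITY_W * EQUIV_SUN_HOURS_PER_DAY
fixtures_supported = daily_wh / (LED_FIXTURE_W * HOURS_OF_LIGHT)

print(f"~{daily_wh:.0f} Wh/day, enough for ~{fixtures_supported:.0f} "
      f"five-watt LED fixtures running {HOURS_OF_LIGHT} h per night")
```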

According to sources, the GWU project was undertaken in partnership with Onyx Solar.  The two have been working together on the technology since 2011.  GWU officials have expressed their enthusiasm about collaborating with Onyx Solar to develop new photovoltaic technologies.  Onyx officials have reciprocated by hailing GWU's efforts to lead in developing practical uses for sustainable energy innovations.

Despite the ear-to-ear smiles regarding the solar walkway announcement, questions remain as to its practical and commercial viability.  A solar powered walkway may indeed reduce the need for overhead lighting powered by more traditional sources, but can it be done cost effectively?

If the walkway is too expensive to build, it is not likely to enjoy the widespread acceptance its creators are hoping to see.  And in all things technology, the secret to success is creating things with commercial appeal; otherwise it's impossible to get funding.

What It Means for the Technology Sector


We applaud George Washington University and Onyx Solar for their efforts in developing the solar walkway.  In fact, it might even be a thrill to attend one of the promotional events they hold in the future.  Yet, what they've accomplished is not likely to offer any real benefit to the technology sector at present.

The focus of the solar walkway is the photovoltaic principle of converting sunlight into direct electrical current that can be used to power, in this case, a series of LED lights.  However, the relatively low efficiency of photovoltaic panels makes them impractical for use on a large scale.  The future of solar energy perhaps lies more in solar thermal – a concept that uses sunlight to heat a fluid, which can then be used for space heating or electrical generation.

Solar thermal is the type of technology the IT sector needs to look at for meeting the power and cooling needs of the future.  Solar thermal is more productive, more efficient and more cost-effective in the long run.


In the meantime, George Washington University and Onyx Solar will likely try to develop their technology in a way that has broader commercial appeal.  That may mean installing the panels throughout university buildings where there is enough natural light to make their use feasible.  It might also mean installing them outdoors, campus-wide. We'll just have to wait and see...

Monday, 21 October 2013

Internet May Be Heading for the Water

Picture, if you will, an oil rig communicating with a ship heading out to resupply it.  How do the two communicate?  Through acoustic waves sent through ocean waters via specialised equipment.  It is technology that has been around for decades.  That technology is about to get a lot smarter and more useful, thanks to efforts by an American researcher and some of his university students.

The University at Buffalo's (UB) Tommaso Melodia, a professor of electrical engineering, is heading up a research project that will hopefully succeed in developing an undersea version of the Internet.  He and several of his students recently ran a preliminary test of their system in Lake Erie, just south of Buffalo, New York.  Using two underwater sensors and a surface-based laptop, the team successfully sent communications through the water that triggered sonic chirps ricocheting off nearby rocks.

Prof Melodia says his successful tests provide proof of concept for an underwater Internet.  His equipment and system solve many of the problems that have prevented existing acoustic systems from being used on a larger scale.  What's more, they open undersea communications to commercial applications unlike anything now in use.

To date, the biggest hindrance to large-scale underwater communications has been the different infrastructure that exists for each system.  For example, the US-based NOAA uses a system of buoys and computers to keep track of any undersea conditions that would indicate activity such as a tsunami; however, its system is not compatible with others being used around the world.

The new technology that the UB group is working on solves that problem by acting as an electronic translator of sorts.  It can tie together existing systems with what may be coming down the line in the future.  The combination of underwater sensors, surface-based computers and productivity software could pave the way for a true Internet experience in the depths of the sea.
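A minimal sketch of the kind of translation layer described above might look like the following.  Every message format and field name here is invented for illustration; none of it describes the UB team's actual protocol.

```python
# Sketch of a message-translation layer between incompatible underwater
# sensor systems. All formats and field names are invented for
# illustration; they do not describe the UB team's actual protocol.

def parse_buoy_message(raw: str) -> dict:
    """Assumed vendor format: 'BUOY42,DEPTH=1200,PRESSURE=121.6'"""
    station, *fields = raw.split(",")
    reading = {"station": station}
    for field in fields:
        key, value = field.split("=")
        reading[key.lower()] = float(value)
    return reading

def to_common_format(reading: dict) -> dict:
    """Normalise into one shared schema that other systems can consume."""
    return {
        "source": reading["station"],
        "depth_m": reading.get("depth"),
        "pressure_kpa": reading.get("pressure"),
    }

raw = "BUOY42,DEPTH=1200,PRESSURE=121.6"
print(to_common_format(parse_buoy_message(raw)))
```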

Limitless Possibilities


In the short term, researchers are looking at the technology as a means of better predicting and warning of tsunami activity; however, the potential goes well beyond that.  The technology could be used for everything from oilfield exploration management to interdicting smugglers who use custom-made submarines to transport drugs.

In applications where interaction with the general public is concerned, the system could be tied to satellite and land-based Internet channels for access by computers and mobile devices alike.  For example, a system for tsunami detection could send text messages to alert those on the shore of coming waves.
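Here is a hedged sketch of that alerting flow: an underwater pressure reading feeding a threshold check that pushes a shore warning.  The threshold, sensor values and send_sms stub are all hypothetical.

```python
# Hypothetical alerting flow: an underwater pressure sensor feeding a
# threshold check that pushes a shore warning. The threshold, readings
# and send_sms stub are all invented for illustration.

ALERT_THRESHOLD_KPA = 5.0  # assumed pressure anomaly suggesting a large wave

def anomaly_detected(baseline_kpa: float, current_kpa: float) -> bool:
    """Flag an anomaly if pressure deviates beyond the threshold."""
    return abs(current_kpa - baseline_kpa) > ALERT_THRESHOLD_KPA

def send_sms(number: str, message: str) -> None:
    # Placeholder: a real system would hand off to a carrier gateway here.
    print(f"SMS to {number}: {message}")

if anomaly_detected(baseline_kpa=121.6, current_kpa=129.3):
    send_sms("+18005551234", "Tsunami warning: move away from the shoreline.")
```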

The UB group will be presenting their research at a conference in Taiwan next month.  It will be exciting to see the reaction, as well as the direction their research takes in the near future.  They are likely only at the cusp of what could be a data communications revolution.


Our only question is this:  who will be the first company to establish an undersea data centre?  The race might be on a lot sooner than we think...!

Thursday, 17 October 2013

MIT Researchers Develop More Efficient Steam Condensation

All over the world, the incredibly simple process of steam condensation is used to both generate electricity and produce clean water.  Unfortunately, the condensation process can be somewhat inefficient when used on a large scale. MIT researchers may have finally developed a solution that overcomes both inefficiency and feasibility challenges.

According to a variety of sources, the MIT researchers have developed a new coating surface to be applied to condensation equipment.  The coating takes advantage of the hydrophobic effect to increase condensation efficiency, while also providing a durable surface that could potentially last decades.

The hydrophobic effect is what causes water to form beads on a windscreen or on the leaves of certain types of plants.  The molecules of water are repelled by the surface to which they are trying to adhere, causing them to form beads.  This effect makes condensation more efficient because the beads shed quickly from the surface, allowing the water to be re-collected and put back into the steam generation system.
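To give a rough sense of the efficiency gain, the heat a condenser removes scales with its heat transfer coefficient, and dropwise (beading) condensation can carry several times the coefficient of the filmwise condensation it replaces.  The figures below are assumed, textbook-order-of-magnitude values, not numbers from the MIT work.

```python
# Order-of-magnitude comparison of filmwise vs dropwise condensation.
# Coefficients are assumed textbook values, not figures from the MIT study.

AREA_M2 = 10.0      # assumed condenser surface area
DELTA_T_K = 5.0     # assumed temperature difference, steam to surface

h_filmwise = 10_000  # W/(m^2 K), typical filmwise coefficient (assumed)
h_dropwise = 70_000  # W/(m^2 K), dropwise can be several times higher (assumed)

for name, h in [("filmwise", h_filmwise), ("dropwise", h_dropwise)]:
    q_kw = h * AREA_M2 * DELTA_T_K / 1000  # q = h * A * dT
    print(f"{name}: ~{q_kw:.0f} kW of heat transferred")
```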

The problem with harnessing the hydrophobic effect for power generation comes by way of durability. Until the breakthrough at MIT, it was not possible to design and build a condensation surface that could withstand the test of time while still operating at high temperatures.  As temperature increases, hydrophobic surfaces tend to degrade.  The new MIT coating appears to solve that problem.

According to initial tests, the new coating is a polymer material that can withstand continuous use at temperatures of 100°C or higher.  Most applications currently run at about 40°C, yet degradation is still a problem for condensation surfaces.  If further testing proves the MIT coating to be as durable as predicted, it will have a huge impact on electrical generation and clean water production.

What It Means for IT


In an era of cloud computing, virtualisation and on-demand Internet, the IT industry's energy needs are increasing with every passing day.  Success at MIT could translate into more efficient and cost-effective energy production which, in turn, would make it easier to meet the extensive power needs of web hosting companies, data centres and colocation facilities.

If the technology could be adapted for new data centres where on-site power generation is built into the design, such facilities could eventually generate excess power to be fed back into the grid.

Imagine combining a highly efficient solar thermal system with a technologically advanced condensation system to produce all the power a data centre needs. Are we fast approaching the day when individual facilities can generate their own power independent from other sources?

If so, worldwide data communications will most certainly be improved.  We would be capable of doing so much more by virtue of no longer being constrained by traditional power sources.  However, that day is still some way off.  For now, we must continue to make do with the resources currently at our disposal.  That means taking advantage of every opportunity to use the most efficient and sustainable energy sources.


Monday, 14 October 2013

Facebook Turning Data Centre Hardware on Its Head

It is safe to say that few other companies have made as much of an impact on the Internet and IT systems as Facebook.  Originally launched as a college project by entrepreneur Mark Zuckerberg, Facebook has grown to be an international force to be reckoned with on many levels.  It should therefore be no surprise that hardware manufacturers are unhappy with how Facebook is handling its hardware blueprints.

As part of its Open Compute Project, the social media behemoth openly publishes hardware blueprints and data centre designs for all to see and copy.  Facebook engineers already design their own systems while contracting the manufacturing out to Asian companies that can produce the equipment for less than their American counterparts can.  Anyone else accessing the Facebook information could use it to design and build their own systems in the same way.

Facebook has not disclosed why it is allowing other companies to benefit freely from the efforts of its engineers; however, we suspect the Open Compute Project follows the same line of thinking as open source software.  It is a philosophy that says technology information should be available for all to see, learn from and modify as needed.

Making this data centre information freely available can only hurt the major manufacturers, who sell hundreds of billions of euros' worth of hardware every single year.  If design blueprints are freely available from a company with a proven track record like Facebook, others might be tempted to go with low-cost hardware vendors while designing their own systems.

According to some industry analysts, this could signal the start of a paradigm shift that will emphasise DIY system design and low-cost hardware suppliers.  It could end up being the biggest shift the networking and IT services sector has ever seen.  Imagine big players like Dell and HP taking a back seat to new manufacturers primarily because their high-cost systems are no longer necessary.

More than Just Hardware


At first glance, it would seem the main players in hardware manufacturing could compete simply by lowering their prices.  However, it's not as easy as that.  What Facebook has done goes well beyond just the price it pays for hardware.  The company has dramatically increased efficiency for even more savings.

For example, the company's newest data centre recently opened in Sweden.  It was designed to be as energy efficient as possible, using state-of-the-art technology and innovative power generation and consumption models.  The facility has reduced the ratio of total facility power to computing power – the metric known as power usage effectiveness (PUE) – from a typical 3 to 1 down to approximately 1.04 to 1.  What Facebook has created is a data centre roughly three times more efficient than a conventional facility.
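For anyone wanting to reproduce the arithmetic, here is a quick illustration of PUE, the ratio described above.  Only the 3-to-1 and 1.04-to-1 ratios come from the article; the kilowatt figures are invented to make the maths visible.

```python
# Quick illustration of power usage effectiveness (PUE). Only the 3.0 and
# 1.04 ratios come from the article; the kilowatt figures are invented.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """PUE = total facility power / power delivered to IT equipment."""
    return total_facility_kw / it_load_kw

# At PUE 3.0, a facility burns 2 kW of overhead (cooling, distribution
# losses) per kW of IT load; at PUE 1.04 the overhead drops to 40 W.
for total_kw, it_kw in [(3000.0, 1000.0), (1040.0, 1000.0)]:
    overhead = total_kw - it_kw
    print(f"PUE {pue(total_kw, it_kw):.2f}: "
          f"{overhead:.0f} kW overhead per {it_kw:.0f} kW of IT load")
```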


Other companies are looking at Facebook's accomplishment and wondering whether they can do it themselves.  They are feverishly studying the Open Compute Project to try to figure it out.  As more of them take the plunge, things are really going to change... you can bet your bottom dollar on that!

Thursday, 10 October 2013

New NSA Data Centre Rendered Impotent by Power Surges

The troubled United States National Security Agency (NSA) has been the subject of an incredible amount of international scrutiny due to surveillance practices that may eventually prove to be a violation of international privacy laws.  And now, just when it seemed things couldn't get any worse for the agency, its brand-new data centre has been rendered impotent by a series of costly power surges.

According to the Wall Street Journal and BBC News, the 1 million square foot facility in Bluffdale, Utah has experienced 10 different electrical surges over the last 13 months.  The surges have allegedly damaged up to US$100,000 worth of computer equipment and delayed the opening of the centre by at least a year.

The $1.4 billion project includes the data centre, administrative and support space, and additional facilities including chiller plants, cooling towers, water treatment facilities and generator facilities.  Apparently, American contractors can design and build every type of facility known to man except that which requires a reliable and stable source of electricity.

News reports are not clear about what has caused the electrical surges.  Nonetheless, the NSA spent six months investigating the issue without returning any concrete results.  The agency is not even saying whether the problem has been fixed.

On its own website, the NSA has presented the issue in a way that seems rather curious.  It downplays both the delay of the data centre's opening and the power surges themselves as no big deal.  Furthermore, the website openly discusses how the agency ‘quietly and secretly’ changed its plans for opening the centre in 2012.  The entire presentation is not what you would expect from an agency like the NSA.  It all seems just a bit surreal.

Cause for Speculation


Reading the reports from various news outlets raises more questions than it answers.  All of this gives reasonable cause for speculation regarding the delayed opening of the new NSA data centre.  Does it seem reasonable that 10 power surges over 13 months could render such a technologically advanced facility impotent?  Does it seem reasonable that a US agency with the financial resources and expertise to build the necessary infrastructure is somehow unable to solve something as common as a power surge?

Although there is no evidence forthcoming, some international security experts wonder if the alleged power surges are just a way for the NSA to cover itself in the midst of fresh scrutiny over its operations.  Perhaps the delayed opening of the data centre is really the result of some in Washington rethinking the NSA's legitimacy.

In light of whistle-blower Edward Snowden and his leaked documents, questions over the delayed opening seem legitimate.  The NSA has already been caught red-handed exceeding its authority on a number of fronts; perhaps the agency and the current Administration are trying to avoid further embarrassment.  Perhaps the new data centre will open when all of the current anxiety finally dies down.  We will have to wait and see...

Tuesday, 8 October 2013

Hard Drive Theft Leads to ICO Warning

The recent theft of a hard drive from a small business owner has resulted in a significant fine and a warning from the Information Commissioner's Office (ICO).  The warning was a stern reminder to business owners of their legal obligation to protect any personal data stored on portable devices.

The theft of the hard drive occurred on a London street early in August of this year.  According to official reports, the business owner was carrying a hard drive in a case along with documents and cash.  While he was stopped at a traffic light, the case was stolen from his car.  Although the hard drive in question was password-protected, the data on it was not encrypted, so whoever stole it had easy access to all sorts of personal information.

The most troubling aspect of the theft is the fact that the business owner runs a Wembley-based loan company.  The information on the hard drive included names, addresses and other information needed to process and administer loans.  For his carelessness, the owner was fined £5,000.  It could have been as high as £70,000 had he had the means to pay it.

In his official remarks, ICO head of enforcement Stephen Eckersley was very clear in stating that his agency has repeatedly warned organisations of their legal obligation to protect customer data. In this particular case he said that if "the hard drive had been encrypted the business owner would not have left all of their customers open to the threat of identity theft and would not be facing a £5,000 penalty following a serious breach of the Data Protection Act."

No Longer an Option


In the IT world, we know how important encryption is to commercial network communications.  It is something we deal with on a daily basis.  Without proper security and management of customer information, it is too easy for hackers to break into computer systems and wreak havoc with the data they find.

What's more, we do not leave security entirely in the hands of mindless software applications.  It requires comprehensive training as well: training in the proper deployment of security software and in filling the holes that such software is not capable of filling on its own.  It takes a combination of technology and human action to ensure maximum security.

It appears that this particular case is one where the business owner had assumed the typical ‘these sorts of things never happen to me’ mindset.  There is no other way to explain the transport of a portable hard drive with unencrypted data on it.  As demonstrated by the theft, precious data is not even safe when it's sitting next to you in a car.


Encryption is vitally important because it is usually the last line of defence against data thieves.  A password-protected hard drive is only effective against amateurs or petty thieves with no knowledge of computer systems; however, it does nothing against the professional whose livelihood depends on stealing data.  Encryption makes data theft significantly more difficult to pull off.
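For readers wondering what encrypting the data actually involves, below is a minimal sketch using the symmetric Fernet scheme from Python's widely used cryptography package.  The record contents are invented, and key storage and management are deliberately out of scope here.

```python
# Minimal sketch of encrypting customer data at rest before it travels
# on a portable drive. Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store this separately from the drive itself
fernet = Fernet(key)

# Illustrative record only; real data would come from the loan system.
record = b"name=J. Smith; address=1 High St, Wembley; loan_ref=12345"

token = fernet.encrypt(record)          # ciphertext, safe to carry around
assert fernet.decrypt(token) == record  # recoverable only with the key
```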

Thursday, 3 October 2013

Typhoon Usagi: No Problem for Hong Kong Data Centres

Even as weather forecasters and government officials were warning of a T-10 typhoon last month, data centre operators in Hong Kong were not panicking. They have been through these things before and they know the infrastructure and systems they've put in place can withstand just about anything the weather throws at them.

This latest storm was no exception.  It proved to be largely a non-event for Hong Kong, despite the potential for wind gusts in excess of 130 mph.  Areas east of Hong Kong were hardest hit by the typhoon.

It turns out that Hong Kong is one of the safest places to build a data centre, according to real estate firm Cushman & Wakefield.  Despite a risk index score of 16, the standards and regulations that exist for Hong Kong data centres ensure they can withstand the most severe storms.

Here are the top five things Hong Kong data centres have going for them:

  • Redundancy – During the initial stages of design, data centre architects make a point of designing and building multiple systems of redundancy.  This redundancy includes everything from multiple, independent power plants to double-glazed windowpanes with anti-blast film.

  • Flood Control – The safest Hong Kong data centres employ a number of measures for flood control. They build on high ground, they raise the floors on which data centre hardware sits, and they build ‘white areas’ to act as a barrier between exterior walls and interior spaces. The entire space of the data centre is designed around flood control.

  • Planning – One thing Hong Kong does very well is plan for natural disasters.  As a standard rule of thumb, builders in that area of the world design and create structures able to withstand a worst-case, once-every-hundred-years scenario. By planning for the absolute worst, they are more than capable of handling the normal.

  • Preparation – In Hong Kong there is generally more than enough time to prepare before the onset of a typhoon.  When people are hired for data centre jobs, for example, it is with the understanding that they may be pulling some extra duties in the event of a storm.  Many data centres include living facilities and catering services in order to facilitate extra staff being brought in to work throughout the storm.

  • Training – Data centre employees also go through intense training in order to be ready for any storm scenario. Their training includes everything from protecting hardware to piling sandbags against doors and windows.  Data centre workers are like a well-trained army ready to answer any storm threat.


In the end, Typhoon Usagi proved to be nothing to worry about in Hong Kong.  And rest assured that any potential storms in the future are not likely to wipe out the region's data centres either.  Buildings, hardware and personnel are all fortified against severe weather and other natural disasters.  That is why Hong Kong is considered one of the safest places in the world to put a new facility.