Monday, 15 December 2014

Norwegian University Heating Classrooms via Data Centre

Capturing data centre heat for other purposes is not a new idea; however, the practical aspects of making it work efficiently have prevented the concept from being adopted on a large scale.  That may change in the future, thanks to the example being set by a Norwegian university now heating classrooms via its data centre.

Norway's Arctic University recently implemented a server cooling system that allows it to use the waste heat generated by one of its data centres to heat classrooms.  The Tromsø campus depends on generated heat year round, owing to its location at almost 70° north.  Using the waste heat from the data centre keeps the campus warm while reducing heating costs and the university's carbon footprint.

The new system is liquid-cooled, utilising two loops and a heat exchanger.  One loop carries heat away from the servers while the other transports that heat across campus to provide space heating in classrooms.  The heat exchanger transfers the heat from one loop to the other.
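As a rough sense of scale (our illustrative numbers, not figures published by the university), the heat a liquid loop can carry is governed by $Q = \dot{m}\,c_p\,\Delta T$, where $Q$ is the heat load, $\dot{m}$ the coolant mass flow rate, $c_p$ the coolant's specific heat and $\Delta T$ the temperature rise across the servers.  For 100kW of IT load and a 10K rise in a water loop:

\[
\dot{m} = \frac{Q}{c_p\,\Delta T} = \frac{100\,000\ \mathrm{W}}{4186\ \mathrm{J\,kg^{-1}K^{-1}} \times 10\ \mathrm{K}} \approx 2.4\ \mathrm{kg/s}
\]

Roughly two and a half litres of water per second can move heat that would take several cubic metres of air per second to shift, which is the essential argument for liquid cooling.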

The most important aspect of the cooling system is that the liquid is moved around hot processors in sealed copper tubing that remains in place even when server trays are added or removed.  This provides continuous cooling, allowing data centre staff to change server configurations at will without shutting down the entire system.

According to university officials, the liquid cooling system was chosen over air cooling because liquid is far more effective at transporting heat.  Some of the university's data centre facilities currently use a combination of air-cooled and liquid-cooled systems, but it intends to eventually convert everything to liquid.  It hopes the completed project will provide all of the space heating needed across the entire campus.

Efficient Cooling for Supercomputing

At first glance, it may seem that what is happening at Arctic University is no big deal.  Step back, though, and consider that the university is engaged in routine activities that are, by most standards, supercomputing.  The load on its servers produces an intense amount of heat that, left unharnessed, goes completely to waste.  By being willing to design and build systems to capture and use that heat, the university is setting an example that others can follow.

There is no denying that today's IT services, online applications and on-demand Internet are pushing everyone closer to supercomputing as the routine standard.  Once that becomes reality, the power and cooling demands of the average data centre will rise accordingly.  So now is the time to get busy working on ways to harness data centre heat so that it can be used for other purposes.

Whether those other purposes include municipal heating or not, the amount of heat generated by supercomputing processes is too valuable to let it go to waste.  Harnessing it will go a long way toward achieving future energy goals.



Wednesday, 10 December 2014

Ofcom: Broadband Service Not What Government Wants It to Be

A new report from Ofcom clearly shows that broadband service in Great Britain is not exactly where the Government wants it to be.  Despite the UK being a clear leader in Europe in providing affordable broadband services to residents, there are still problems within the system that have to be overcome, the regulator's report says.  For example, there is still too big a gap between the fastest and slowest Internet speeds available to consumers.

According to the report, a speed of 10Mbps is the standard requirement for residential homes.  Unfortunately, up to 15% of UK households do not have access to speeds that high.  An additional 3% do not even have access to speeds of 2Mbps.  Furthermore, the regulator says that 18% of British households do not have any Internet access at all.

The Ofcom report goes on to say that things are improving, as evidenced by the fact that the average residential download speed is now 23Mbps.  In addition, approximately 75% of UK households have access to superfast broadband, with speeds of at least 30Mbps, though only 21% take advantage of it.  The Government hopes to have superfast broadband readily available to 95% of the public by the end of 2017.

Ofcom also points out that commercial and residential services in rural areas are still significantly lacking.  This is where the large gap in download speeds comes into play.  While some of the luckiest customers in urban areas can receive speeds of up to 350Mbps, some rural customers are slogging along at 0.1Mbps.  The problem, Ofcom says, is the expense of running fibre networks out to rural areas.

Having said that, the Government says solutions are being worked on to fill the infrastructure holes.  It says new technologies may make it possible to increase speeds in rural areas without such a large investment.  Let's hope it’s right – especially if it truly hopes to reach its 95% superfast broadband goal.

What It All Means

It is great that Ofcom has analysed all the data and presented this report, but what does it mean for consumers and for companies involved in Internet-based businesses?  It likely means that we are nearing the end of the landline in Great Britain.  Landlines have not been necessary for telephone communications for years; the only thing that has kept them around is the need for dial-up Internet access among those who want it.  Dial-up Internet, though, is a dinosaur that is now almost completely extinct.

Along those same lines, expanding data communications networks now make it possible for consumers to do all of their work online via a smartphone.  There is no incentive any more for people or businesses to continue paying for a landline when they can receive calls and access the Internet without it.

As we watch the landline fade off into obscurity, all eyes will be on the broadband industry and Government expansion goals.  By this time next year, we should know how close the Government is to providing superfast broadband to all…

Source:  BBC Technology News – http://www.bbc.com/news/technology-30375854


Thursday, 4 December 2014

The freedom of IT movement

Firms are increasingly relying on IT to deliver business advantages, and the number of streams of data that have to be stored, managed and analysed is expanding.  This puts CIOs in a tough position.

Forces such as cloud computing, data centre integration and security all have to be addressed, yet many IT leaders are spending vast amounts of money and time just maintaining their existing infrastructure.  This leaves little resource for innovation, in spite of increasing pressure from business leaders.  CIOs need to break out of this cycle and find new ways to get creative with technology, while making sure the lights stay on.

Outsourcing and creating a hybrid IT infrastructure is fast becoming the ‘go to’ solution to this problem.  A CenturyLink-commissioned survey of 550 global IT leaders revealed that outsourcing day-to-day routines results in savings of up to 11 percent of IT budgets.  In addition, outsourcing produces a higher rate of revenue growth for companies.

Companies that outsource expect to raise their investments in outsourcing by 19 percent within the next two years, according to the study.

A viable option for companies looking to take advantage of the cost benefits and expertise of wholesale outsourcing, without completely relinquishing control, is colocation.  This model allows companies to reap the benefits of a large-scale, knowledge-driven data centre operation without the resource investments and costs of managing the infrastructure themselves.  Companies house their servers or devices in a third-party data centre, assured of appropriate bandwidth, security, power and cooling.

Flexibility is the key benefit of outsourcing to a third-party data centre.  Companies can scale their IT infrastructure up and down depending on business need, with minimal effort on their part, leveraging their provider’s geographic reach, economies of scale and technological reliability.

Another important benefit that companies can reap (in addition to facilitating innovation) is in the area of disaster recovery. With so much of modern companies’ business tied up in IT infrastructure, protecting assets from a potential disaster has become of paramount concern. Data loss prevention is a core focus of colocation providers, and sites are designed specifically to protect against data loss. Strategies to enhance back-up and support a business in the event of a total data failure are at the forefront of many colocation providers’ services.

In its paper, "Converging the Datacentre Infrastructure: Why, How, So What", IDC reinforces the theory that company performance is tied to aligning internal IT resources towards innovation, without compromising on the expertise needed for day-to-day maintenance and management tasks.  The analyst firm reported that by outsourcing one-third of infrastructure and related routine administrative tasks, CIOs are able to double the time spent on implementing innovative products and offerings.

Ultimately, to remain competitive and have an edge over the rest, CIOs have to be creative with the resources they have and do more with less.  Extreme pressure from data floods and ever-changing business demands has reinforced the importance of forward-thinking, resilient infrastructure.  Outsourcing, in the form of colocation, is becoming the strategy of choice simply because it results in more efficient and profitable company performance, while giving companies the best of both worlds – service expertise and the freedom to innovate.

Guest blog by Mike Bennett, VP Global Data Centre Acquisition and Expansion at CenturyLink



Tuesday, 2 December 2014

Bitcasa’s unlimited storage “a wildly money-losing proposition”

When Bitcasa opened for business in 2011, it bet on a business model intended to eventually spell the end of the hard drive in favour of unlimited storage on a Bitcasa cloud server.  The company had every intention of being a serious competitor to Amazon and others offering inexpensive storage capacity for cloud computing.  However, its own business model could end up being the California company's very undoing.

In October 2014, Bitcasa announced an end to its popular Infinite Drive platform of unlimited storage for $999 annually.  It gave customers just three weeks to migrate their data to one of its new fee-based options or take their business elsewhere.  Those who did not act were informed they risked losing all of their data.  This did not sit well with some customers, and one filed a lawsuit.

Although Bitcasa won the court battle, the requirements the court placed upon it may yet force the company into bankruptcy, according to industry speculation.  Bitcasa simply cannot afford to subsidise its largest data users without a sound business model that attracts more paying customers.  Gigaom says Bitcasa’s largest client was costing it $3,000-$4,000 per month by using more than 80TB of storage.
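Gigaom's figure is easy to sanity-check.  Here is a minimal sketch, assuming a raw cloud storage cost of roughly $0.04 per GB per month (our assumption for illustration; Bitcasa has not published its unit costs):

```python
# Back-of-the-envelope check on the Bitcasa numbers. The unit cost is an
# assumption for illustration, not a figure published by Bitcasa.
STORAGE_COST_PER_GB_MONTH = 0.04   # assumed raw storage cost in USD
HEAVY_USER_TB = 80                 # Gigaom: largest client stored 80+ TB
ANNUAL_PLAN_PRICE = 999            # Infinite Drive price in USD per year

monthly_cost = HEAVY_USER_TB * 1000 * STORAGE_COST_PER_GB_MONTH
monthly_revenue = ANNUAL_PLAN_PRICE / 12

print(f"Estimated cost to host 80 TB: ${monthly_cost:,.0f}/month")    # ~$3,200
print(f"Revenue from that customer:   ${monthly_revenue:,.0f}/month")  # ~$83
```

On those assumptions, the heaviest user cost Bitcasa roughly forty times what they paid – squarely within Gigaom's $3,000-$4,000 range.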

“It’s not fun to stare at your earliest and largest users in the eye and say ‘we just can’t do it anymore,’” said Bitcasa CEO Brian Taptich.  “It’s a terrible feeling.  You wish you could subsidise those [customers] forever.”

Bitcasa ran into a serious problem once it decided things needed to change.  Taptich said the company had no way of knowing what customers were storing and, worse yet, how much of the data it was hosting had been orphaned by clients.  The only way to clean up the environment and implement better management practices was to force customers to migrate themselves to a fee-based plan.

The Next Step

The next step for Bitcasa is to get its financial house in order so the company can be saved.  It has had very patient and supportive investors thus far, but one must wonder whether those investors will stick around in light of the change in plans.  If they do, Bitcasa could emerge a stronger company for it.  If not, its clients would have faced the need to migrate anyway, so better to do it now.

Looking at the bigger picture, the data centre industry is quickly approaching the day when storage capacity may become a much more serious issue.  Big Data has resulted in companies saving every bit and byte in the hope that it will someday be usable for some sort of analysis.  In a sense, we have become data hoarders.

Business models will have to change and adapt as the total amount of stored data grows ever larger.  The only question is how and when that transformation will take place.  Perhaps Bitcasa's troubles are the start of a complete data storage revolution and something good may come from it.



Thursday, 27 November 2014

Google Signs Long-Term Deal for Wind Power

When Google flips the switch on its brand-new data centre in Delfzijl, Netherlands, it will be powered entirely by renewable energy generated by local power providers.  Some of that power will come from a new wind farm now being constructed by Dutch energy company Eneco.  Google and Eneco have just signed a 10-year agreement that will see the search engine giant purchase all of the energy produced by the wind farm through to 2026.

Google is building the new data centre near the port of Eemshaven, on the Netherlands' north coast.  It chose the location because it is the site of a major fibre-optic cable providing data services to the region.  The area is also home to a number of significant energy providers capable of generating the power Google will need for the facility… and it will certainly need a lot.

The data centre will be large enough to cover an area greater than 40 football pitches.  Within its walls, tens of thousands of servers will provide hosting and colocation for companies all over Europe.  A data centre of this size requires a tremendous amount of power to run the servers and keep the entire building cool.  Google will use all of the 62 MW generated by the Eneco wind farm… and then some.  For its part, Eneco says construction of the wind farm will provide approximately 80 local jobs for a year and a half.

Signing the deal with Eneco is just the latest in Google's ongoing efforts to use more renewable energy.  In addition to the Eneco agreement, the company has also signed power purchase agreements (PPAs) with two other companies in Europe.  What's more, it has now signed a total of eight power agreements around the globe enabling it to purchase renewable energy.  Google is clearly committed to adopting a green strategy for its new data centre operations.

A Boost to the Industry

By signing renewable energy agreements with companies such as Eneco, Google is doing more than just helping local environments and keeping its own energy bills in check.  The company is also providing a much-needed boost to the renewable energy industry.  Keep in mind that renewables are an expensive proposition, requiring the promise of payback before companies are willing to invest large sums of money.

Every time a company such as Google signs a PPA, it enables renewable energy companies to design and build the infrastructure necessary to make renewable energy profitable and efficient.  In this case, Eneco has the incentive to put time and money into the new wind farm because it is guaranteed that all of the power will be purchased.  If the wind farm proves successful in meeting Google's needs, and we have no doubt that it will, Eneco will be encouraged to build more wind farms for other customers.

We are getting closer to the day when the data centre industry will be relying almost exclusively on renewable energy sources.  That day cannot come soon enough...


Friday, 21 November 2014

Report: Many CIOs are Failing to Understand their Ever-Evolving Role

It wasn't too long ago that the main responsibility of the chief information officer (CIO) was to ensure that the company IT department kept the interoffice e-mail system and the website running.  Today, things have changed.  The modern CIO is an integral part of company management, with a more prominent role than ever before, yet many CIOs do not fully grasp their new role, according to a brand-new report from the Society for Information Management (SIM).

The report lays out the role of IT and the CIO in the modern business environment.  It then goes on to discuss how the new roles are supposed to fit into the larger business environment, and the fact that in many companies those roles are not being implemented as they should be.

For example, information technology now involves more than just keeping company computers running.  It is all about transmitting data, analysing data and linking together every organisational department through effective communications.  This new paradigm suggests that the CIO needs to worry as much about company management as infrastructure support.

The report presents data from a study involving just over 1,000 responses to a survey presented to senior IT leaders.  Among the participants, 451 identified themselves as a CIO – either by title or workplace role. According to the data, CEOs are unhappy with the fact that their CIOs do not seem to understand their new roles within the company structure.  CIOs are still keeping computer systems functioning but they are not providing the data, analysis and other IT tools needed to help companies be at their best.

Possible Solutions

With IT services and technologies consuming ever-larger portions of the company budget, more attention needs to be paid to the issue of CIOs not meeting executive management expectations.  There are three solutions that need to be considered, either separately or in combination.

Firstly, companies need to adjust their cultures so that the IT department is no longer considered a separate, stand-alone entity that exists by itself in a back corner of the building.  IT needs to be considered just as integral to the overall success of the company as the sales force, labour and office staff.

Secondly, the CIO position must be elevated to executive level management at companies in which this has not yet happened.  As with the chief financial officer or chief operations officer, the CIO should be reporting directly to the CEO as a member of the executive management staff.

Thirdly, the CIO needs to be included in the discussions of any business decisions involving the other members of the executive management.  As one of the company's senior officers, the CIO cannot be expected to contribute to his or her full potential without being included in executive-level decision-making.

There is little debate that the roles of both IT and the CIO have evolved over the years.  That evolution is now occurring at a faster rate, adding urgency to getting it right.



Friday, 14 November 2014

Telecom Fraud Reveals Oft Overlooked Security Flaw

Cyber security these days focuses almost entirely on electronic data breaches by way of network hacks, malware and the like.  And rightly so.  However, the recent fraud conviction of a telecoms director suggests that we might be ignoring one of the most fundamental sources of fraud – to our own detriment.  What is it we are ignoring?  The old-fashioned con artist.

Matthew Devlin, a 25-year-old telecom director from Halifax, recently appeared before a magistrates’ court after he was caught impersonating a security official in order to gain sensitive customer information.  Devlin apparently contacted Everything Everywhere (EE), among other telecoms companies, in an effort to obtain user names and passwords for customer accounts.  He succeeded in obtaining the information he was after, relating to more than 1,000 customers.

Devlin intended to use the information to determine when mobile customers were due for an upgrade so that he could contact them and pitch his own company's products and services.  The magistrates’ court fined him £500 and ordered him to pay a £50 victim surcharge and more than £430 in court costs.

More Severe Penalties

Upon reading the penalties imposed on Mr Devlin, it is hard to imagine he will be deterred from trying the scheme again.  After all, what is a £1,000 bill if he can successfully sell tens of thousands of pounds in new products and services?  Not much, according to Information Commissioner Christopher Graham, who was quoted as saying:

“Fines like this are no deterrent.   Our personal details are worth serious money to rogue operators.  If we don't want people to steal our personal details or buy and sell them as they like, then we need to show them how serious we are taking this.  And that means the prospect of prison for the most serious cases.”

The thing we seem to be forgetting is that fighting con artists is completely different from fighting cybercrime at the local data centre or commercial IT department.  By their nature, con games involve the human element which, unfortunately, makes them harder to thwart.  The only way to combat them effectively is with a combination of efficient training and harsh penalties that make such activities a losing proposition.

In most parts of Western Europe, we tend to take an approach toward crime that only deals with the issues around the edges.  Simply put, we are more prone to deal with the symptoms of crime than its actual cause.  So while we continue to develop sophisticated digital technologies to protect networks and sensitive communications, we allow people such as Mr Devlin to brazenly impersonate security personnel and steal personal data.  And when they are caught, we impose penalties that amount to nothing more than a slap on the wrist.

Christopher Graham is right. If we are to prevent this sort of fraud in the future, the penalties for such crimes need to be tougher.  They need to be harsh enough that criminals will be forced to think two or three times before perpetrating such crimes.


Thursday, 13 November 2014

2bm design and build a data centre for ARM in US

Data centre design and build specialists 2bm have recently received some great news: one of their most recent and challenging contracts has been nominated for three separate awards at the Datacenter Dynamics North American Awards 2014.

1.    Special Assignment Team of the Year
2.    Innovation in the Medium Data Centre
3.    The Green Data Centre

The project was carried out for ARM, who were looking for an experienced and qualified team to provide a customised data centre solution in Austin, Texas that aligned with their ethos of innovation, efficiency and sustainability (as per EUHPC).

Although 2bm had never worked with ARM in the US before, the two companies had worked together on a facility in Cambridge, UK.  That project was therefore intended to serve as a high-level model for Austin.

Project Summary:
·        ARM required a 2.0 MW High Performance Cluster (HPC) data centre
·        950 kW operational Jan 2014 + 950 kW on-demand March 2014
·        N+1, Tier III, PUE 1.30 or below
·        Conforms to TIA-942, AIA, EU Code of Conduct for Data Centres, NFPA, ASHRAE, & Local Codes

Senior Project Manager Gordon Smith represented 2bm, working closely alongside Digital Realty. This time the task was to deliver multiple, innovative energy saving features within a single facility.

The system was designed to efficiently handle varying IT loads up to 22kW per cabinet, with the flexibility to dynamically control cooling and adjust to low loads per cabinet.

Innovative features of the project included high-density, water-cooled racks using a high supply water temperature (75°F).  The temperature of each cabinet is tightly controlled (81°F), and water-to-RDHx (rear door heat exchanger) loops, supplied via a Venturi negative-pressure system, provide full leak prevention in the data halls.

In addition, energy consumption was fully optimised, with zero water consumption.

The project represented a series of ‘firsts’ for 2bm and a number of the partnering companies.  For example, this was the first time 2bm had worked alongside Digital Realty, and the first time 2bm and ARM had worked together on a project in the US.

“We had an immediate need for a high efficiency, high density data centre. But, more importantly, we have a long-term need for a partner who can potentially help us with our go forward data centre strategy.” -- John Goodenough, Vice President of Design Technology and Automation at ARM Holdings

“The facility is an exceptional example of an energy efficient data centre with every aspect meeting and in some cases surpassing the requirements of the EU code of conduct for data centres. We believe that the design of the ARM NAHPC data centre fully embraces and surpasses the sustainability credentials required for commercial buildings with the city of Austin.” -- CEEDA Recognition

The full list of finalists in all judging categories can be found here and the winners will be announced early next month.

2bm is proud of what has been accomplished and excited about the potential to do this again globally.

To read more about the project - including design highlights - click here.

Guest blog by Ashleigh Soppet, Digital Marketing Manager, 2bm




Tuesday, 11 November 2014

Free Cooling. Is It Really Free?

The term ‘free cooling’ has been used more and more in recent years in relation to data centre cooling. However, on the basis that there is no such thing as a free lunch, we suspect there’s no such thing as FREE cooling either.  In this blog, Alan Beresford, Technical and Managing Director at EcoCooling explains the true costs of so-called free cooling.

If we took a rack of servers and put it in a field in a cool climate like the UK's, we could just about claim totally free cooling.  In truth, however, there is the cost and power usage of the two or more blower fans in every server.  So even on a cool day, in a cool field, free cooling isn’t actually free!

Let’s move to a real data centre scenario.  Provided the external fresh air is below 18-20C, and provided we can force enough of it through the data hall, we have nearly-free cooling going on.

However, we now need big fans to blow around 10 cubic metres of fresh air per second through the data hall for every 100kW of IT load (about 30 to 50 racks).  So, in addition to the server blower fans, we need power for these big air movers.

In practice we also need filtration, which increases air resistance and adds to the fan power requirement.  We also need evaporative cooling (where the fresh air is cooled by the evaporation of water) to deal with the few days per year when the outside temperature exceeds 20C.

So the power budget is now up to 3-4kW per 100kW of IT load – a PUE of 1.03 to 1.04. Whilst this is still not ‘free’, it’s massively cheaper than the conventional refrigeration-based cooling systems that have been deployed for the last twenty years or more.
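For readers unfamiliar with the metric, PUE (power usage effectiveness) is simply total facility power divided by IT power, so the fresh-air figure above works out as:

\[
\mathrm{PUE} = \frac{P_{\mathrm{IT}} + P_{\mathrm{cooling}}}{P_{\mathrm{IT}}} = \frac{100 + 3.5}{100} \approx 1.035
\]

By the same formula, a legacy system whose cooling draws as much power as the IT load itself (see below) comes out at a PUE of 2.0.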

As yet, probably less than two per cent of data centres are cooled with fresh-air and evaporative cooling.  And whilst a lot more could be, it’s not appropriate for every data centre. But we’ll cover that later.

Refrigeration dominates:
Refrigeration-based cooling systems come in a number of formats; the main examples are:

DX CRACs – where a DX (direct expansion) compressor and heat exchanger sit inside a CRAC (computer room air conditioning) unit within the data centre hall.  Pipework containing refrigerant connects the CRAC to a fan-assisted condenser unit outside.  The refrigeration unit inside the CRAC extracts heat from the hot data centre air and transports the hot refrigerant to the condenser, where the heat is expelled into the atmosphere.

Chilled-water systems – where a refrigeration unit generally sits outside the data centre.  This uses the standard compressor, evaporator and condenser-plus-fans model of refrigeration, but requires an additional heat exchanger to chill a water circuit that transports low-temperature water either to a data-hall CRAH (computer room air handling) unit or to in-rack solutions such as rear door coolers (where another heat exchanger with fans extracts the heat from the data hall air).

A legacy chilled water refrigeration system can use up to 100 per cent of the IT load – that’s 100kW of cooling power per 100kW of IT load!

Modern refrigeration systems have benefited significantly from variable-speed fans and consume somewhat less.

Refrigeration with free cooling:
A lot of manufacturers have realised, rightly, that in temperate countries there are a large number of days each year when the outside air is theoretically cool enough to cool the data centre without the refrigeration system being used – and hence save significant power and energy cost.

The trouble with this idea, however, is two-fold.  Firstly, you still need to power internal and external fans and pumps.

Secondly, in ‘free cooling’ mode a system designed for refrigeration is fairly inefficient.  As a result, ‘free cooling days’ are not those with temperatures up to 18-20C; the inlet temperature to the air cooler unit would in theory need to be below 14C.

In practice, however, external chiller units are generally installed in ‘chiller-farms’ either on the roof or on the ground. There can be significant leakage of hot exhaust-air back into the chiller inlet. This means that the inlet is almost never below 14C. So, in some installations, despite the theory, you’ll practically get zero ‘free-cooling’ days.

Higher temperature, hidden cost:
Theoretically, you can get more ‘free cooling’ days from a system if you increase the server inlet temperature.  Under recently relaxed guidelines from ASHRAE (which sets data centre cooling standards), the input temperature to the servers can be elevated from 18C to 27C.

With, say, a 10C heat differential from front to back of the servers, this means the exhaust air will be at around 37C.

However, according to ASHRAE’s own published figures, server component reliability is quite badly reduced – typically 30% more heat-related failures at 27C than at 20C inlet air temperature.  Not great in a mission-critical facility.

It’s a little-known fact that servers themselves use more energy at higher supply air temperatures.  Using high inlet temperatures to maximise cooling plant efficiency can increase server energy use by 3%.  The PUE may look good, but the actual operating cost may go up.
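A minimal sketch of that trap, using purely illustrative numbers rather than measurements from any particular site:

```python
# Illustrative only: shows how raising server inlet temperature can improve
# PUE while total facility energy actually increases. All numbers are
# assumptions for the sake of the example, not measured data.

def pue(it_kw: float, cooling_kw: float) -> float:
    """Power usage effectiveness = total facility power / IT power."""
    return (it_kw + cooling_kw) / it_kw

# Baseline: 20C inlet air.
it_cool, cooling_cool = 100.0, 5.0

# Elevated: 27C inlet air. Server fans draw ~3% more power; the cooling
# plant works less hard.
it_hot, cooling_hot = 100.0 * 1.03, 3.0

print(f"20C inlet: PUE {pue(it_cool, cooling_cool):.3f}, "
      f"total {it_cool + cooling_cool:.0f} kW")  # PUE 1.050, total 105 kW
print(f"27C inlet: PUE {pue(it_hot, cooling_hot):.3f}, "
      f"total {it_hot + cooling_hot:.0f} kW")    # PUE 1.029, total 106 kW
```

The hotter configuration reports the better PUE yet draws more total power, because the extra server fan energy counts as ‘IT load’ in the ratio.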

And then there are the implications for the engineers and all the power and data cabling, patching and network switches that are housed at the back of the rack - which were never designed to work with ambient temperatures near 40C.

Indirect air cooling:
Some data centre operators are beginning to understand that in many situations, though not all, it’s OK to blow filtered fresh-air through the data hall and servers.

Indirect air systems are a compromise, using an air-to-air heat exchanger to keep the data hall air separate from the external fresh air.

If, say, your data centre is close to a chemical works, or an inner city full of exhaust fumes, that compromise can make sense.  The downside is that, with two air circuits, you need two sets of fans, and the convoluted airflow path increases the air resistance in both circuits; once you add up all the fan power, practical power use is more like 10-15kW per 100kW of IT load – a PUE of 1.10 to 1.15.

Most indirect air systems, even those that use evaporative cooling, also need refrigeration for some days of the year, adding the cost of a refrigeration system to the expensive heat exchangers.

Nearest to free:
I’d love to be able to tell you that direct fresh-air cooling is the panacea for nearly-free cooling.  But, sadly, it’s only for some people – you can’t deploy direct fresh-air cooling at every site, nor in every climate.

EcoCooling now has evidence, from 200 installations and from research studies by Cambridge University, showing that internal data hall air can meet clean-room standards and ASHRAE humidity requirements without any need for dehumidification.

And all of those 200 data centres have been able to operate for 365 days/year without any need for refrigeration back-up.

But even at PUE of 1.05 to 1.10, it’s still not quite free!

Guest blog by Alan Beresford, Technical and Managing Director at EcoCooling

For more information please contact sales@ecocooling.org or +44 (0) 1284 810586


Wednesday, 5 November 2014

Cloud Sprawl: Are There Too Many Clouds in the Data Sky?

There is a new term emerging in the world of cloud computing: ‘cloud sprawl’.  It describes conditions in which organisations have multiple cloud environments in place simultaneously, each including multiple instances of virtualisation.  The term borrows from the municipal planning concept of urban sprawl; it denotes growth that is quickly getting out of control.

The early days of cloud computing were marred by excessive capacity and not enough personnel and systems to manage it all properly.  Despite fairly rapid adoption in North America, Europe was not as quick to catch on because of the system's perceived weaknesses.  Things are now much improved thanks to better management; however, some experts fear cloud sprawl could tip things back in the other direction.

For example, a company working with its own enterprise server may no longer have just one cloud.  In fact, most do not.  Most have multiple cloud environments serving different groups of people; for example, a private cloud for company employees and vendors and a completely separate cloud for the general public.  Driving these multi-cloud environments is a new love of distributed computing systems.

Another potential problem is that of new cloud administrators being given a piece of new technology and running wild with it, only to find that things get out of hand very quickly.  Those concerned with cloud sprawl say now is the time to get control of cloud environments before they become completely unmanageable.  It is something that needs to be dealt with at data centres and corporate IT facilities alike.

The Cloud Is Here To Stay

It would appear as though the cloud is here to stay.  There was some speculation a few years ago, but the broad adoption of cloud computing has pretty much cemented its place in the world of Internet technology. Furthermore, Internet use is only going to expand as we move into the future.  It is not likely the global community can reach its goal of worldwide Internet access without continuing to utilise the cloud for everything from web-based applications to IT services.  It is what it is.

Having said that, as web administrators begin thinking of ways to attack cloud sprawl, an equal amount of attention needs to be paid to on-demand Internet.  It is the insatiable thirst for streaming data and real-time applications that is driving the need for ever-increasing speeds.  Any methodology put in place to control cloud sprawl has to be measured against the ability to provide for the world's on-demand needs.

It is an exciting time to be part of the world of data centres and cloud computing.  As with the entrepreneurs of the early industrial age, those of us involved in developing the future of the Internet face a daunting world of exciting challenges.  Only time will tell where we end up…


Thursday, 30 October 2014

Pilots Association Call for Strict Drone Regulation

The British Airline Pilots’ Association (BALPA) recently called on the Government to introduce strict regulations protecting commercial airlines and passengers before allowing large-scale drones to be flown in UK airspace.  BALPA says its members are already concerned after a number of run-ins with smaller drones that are not yet regulated as strictly as pilots would like.

In one such incident in May 2014, a drone came within 25 metres of a commercial flight landing at Southend Airport.  Though 25 metres may not sound dangerously close to the average airline passenger, it is much too close for a pilot trying to control a large passenger jet on a landing approach.  The drone in this case was close enough to cause great concern.

BALPA contends that the smaller drones now in use pose enough concern for commercial pilots; larger drones eventually intended to carry cargo could pose even more danger in the skies.  The association says the rules that currently regulate small drones would be ineffective and inappropriate for larger vehicles.

BALPA is not against the use of drones in UK airspace.  In fact, the association's general secretary says the UK should welcome unmanned aircraft in order to take advantage of the opportunities they offer; however, he says drones should be as safe as manned aircraft at all times.  He went on to say that UK residents deserve to be informed before any sort of unmanned aircraft is flown over their neighbourhoods.

Regulations Are Coming

It is important to hear from organisations such as BALPA regarding unmanned aircraft.  The fact is that it is only a matter of time before larger commercial drones are operating in our skies, handling everything from cargo delivery to data communications to wireless networking.  And with that inevitability comes the reality that regulation is on its way.  By making its voice heard now, BALPA is ensuring that it has a place at the table when the discussion on regulation commences.

With multiple stakeholders networking and sharing ideas, regional regulations can be put in place that will let us make the best use of drones without endangering the public.  That's what this is all about.  By being proactive with regulations, the UK can create an environment that allows us to be a world leader in yet another emerging technology.  To that end, it is imperative that policy makers get to work on creating regulations now, before those larger drones are ready to take to the skies.






Monday, 27 October 2014

New Seattle Data Centre Proposes to Recycle Waste Heat

At the corner of Sixth Ave and Bell St in downtown Seattle, Washington (USA) sits a parking lot.  That parking lot is within a reasonable distance of a newly built Amazon data centre as well as numerous high-rise office buildings and ongoing development projects.  If a company known as Clise Properties has its way, the lot will become a 12-story data centre with a state-of-the-art system that recycles waste heat and sends it to nearby office buildings.

Clise Properties and Graphite Design Group propose to design and build a new data centre with a completion date of early 2017.  They have already submitted initial plans to the local planning authority for review.  The key to getting the approval they need is their waste heat recycling system.  It is a system that Clise has already successfully implemented for Amazon as part of the Westin building development.

The new data centre will devote the first two floors to UPS and power and cooling needs.  The remaining eight floors will be data centre space at about 11,000 ft.² per floor.  The space is likely to be used by everything from individual hosting companies to enterprise clients leasing dedicated equipment.

As you might expect, a data centre of this size will generate tremendous amounts of heat.  Temperatures in the region of 38°C are not unusual for the air coming off cooling fans.  Rather than allowing this heat to be wasted through venting, however, Clise plans to use an extensive duct system to send it to nearby office buildings that are part of the same development complex.  This will enable the other buildings to keep tenants warm while reducing energy bills.

What Clise is doing in Seattle is not unique to them; other projects designed to recycle waste heat have been undertaken by Telecity (France), Telehouse (UK) and IBM (Switzerland) to name just a few.  Recycling waste heat is an extremely simple and efficient way to reduce energy costs by harnessing something that would otherwise be vented into the air.

Onward and Upward


It seems unlikely that the city of Seattle would reject the plans submitted by Clise and Graphite barring some exceptional circumstance no one is aware of yet.  The plan to build a state-of-the-art data centre in one of America's most tech savvy cities has many obvious advantages.  Being able to recycle waste heat for the purposes of space heating is an added bonus that should effectively secure development approval.

It would be nice to see this sort of data centre facility become the de facto standard all over the world and that may very well happen.  As the world looks for ways to be more energy efficient and cost-effective it just does not make sense to continue wasting the heat generated by server farms.  It is essentially free heat just waiting to be harnessed for other purposes.  To not take advantage of it is akin to throwing a lot of money away.



Tuesday, 21 October 2014

Complex Issues Raising Concerns about Energy Security

Energy security in the 1970s was anything but a settled issue.  Rolling blackouts and other interruptions were common in an era when energy production was dependent almost entirely on coal.  Although strides have been made in the decades since, there are new concerns over energy security that could cause significant problems in coming years.  The good news, according to The Guardian, is that we are in no danger of seeing the lights go out in the immediate future; however, things could change beginning with the winter of 2015 / 2016.

At the heart of the issue are a number of complex problems surrounding how we currently produce and use energy.  On one hand, the UK is attempting to get away from burning coal in order to address climate change yet, on the other hand, coal is extremely cheap right now.  It is financially more advantageous to produce electricity with coal than natural gas, leading to existing gas-fired plants being shut down and delays in the construction of new plants.

In the arena of renewable power, things are not as far along as many in the UK would have hoped.  According to The Guardian, just 6% of Britain's total energy was produced by renewable sources last year.  That will have to change drastically if we are to continue reducing our dependence on coal and natural gas.

The current dichotomy between renewables and meeting demand is made even more complex by the question of investment.  The Government has limited funds to invest in commercial research and development, customer subsidies and so on.  At the same time, the big five power companies are not likely to put large amounts of money into what could end up being an energy gamble without Government support behind them.  So who decides which projects are funded and which are left by the wayside?

Decisions Must Be Made


The UK leads the world in two key areas that are, at the current time, opposed to one another: addressing climate change issues and pushing the world into the future of high-speed network communications.  Yes, the UK is on the cutting edge of renewable energy.  Nonetheless, we are also the first to design and build new data centres and high-speed networks to serve the average consumer.  Unfortunately, these facilities and networks consume a tremendous amount of energy that needs to come from somewhere.

Energy consumption in the UK has fallen by about 10% since 2005 / 2006; however, our energy margin has remained steady at 6%.  That margin could be as low as 4% late next year if the wrong mix of conditions is present.  Avoiding that means making decisions now that will increase the margin substantially.

We can continue to design and build state-of-the-art data centres and renewable power generation projects; nevertheless, at some point, we have to decide which priority is more important.  Without some miracle power source we have not yet discovered, it seems we cannot pursue both and still have the energy security we want.



Monday, 13 October 2014

US Company Builds World's Largest Thermoelectric Generator

What would you say if you were an oil rig manager and you received a call from a new start-up claiming it could save you as much as 3% of your fuel by harnessing the heat produced by your diesel generators?  If you spent millions of pounds every year on fuel, you would likely be intrigued by the offer.  It turns out that this scenario is not fiction: an American company has built the world's largest thermoelectric generator, a device that can drastically cut engine emissions and reduce fuel consumption.

The company, known as Alphabet Energy, has managed to design and build their device based on a highly efficient thermoelectric material discovered by researchers at the University of Michigan (USA).  The material converts heat into usable electricity through a process known as the Seebeck effect.

The Seebeck effect is a naturally occurring phenomenon in which two dissimilar conductors produce a voltage when there is a temperature difference between them.  Thermoelectric materials taking advantage of this phenomenon have been around for decades; however, until the University of Michigan discovery, there was no material that could be used cost-effectively for commercial purposes.  The materials have simply been too expensive to produce.
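In its textbook form the relationship is simple: the open-circuit voltage across a junction of two dissimilar conductors is proportional to the temperature difference between its ends, with the material-dependent Seebeck coefficient $S$ (typically microvolts per kelvin) as the constant of proportionality:

\[
V = S\,(T_{\mathrm{hot}} - T_{\mathrm{cold}})
\]

A commercially useful material is one that combines a high effective $S$ with low production cost – the combination that had been missing until the University of Michigan work.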

Alphabet Energy CEO Matt Scullin says his company's new device can be connected to the exhaust of a 1,000 kW generator and, through the Seebeck effect, reduce fuel consumption by some 2.5% simply by capturing the waste heat and using it to generate electricity.  That electricity would be enough to save more than 52,000 litres of fuel.

Scullin went on to say that the device could be scaled down for use with smaller engines or scaled up to accommodate larger generators.  He says the first applications for the device will most likely be found in the oil, gas and mining industries, which could use the thermoelectric generator in remote areas where diesel-powered generators provide the electricity for normal operations.

Other Potential Applications


We would be lying if we said we were not intrigued by the idea of a scalable thermoelectric generator.  Could it be that this technology may have other applications in the future?  For example, MIT Technology Review says that power plants waste as much as 80% of their heat energy through exhaust.  If that heat energy can be harnessed with a scalable thermoelectric generator, why could we not do the same thing for a data centre?

There are already some projects under way aiming to use the heat generated by data centres for the purposes of community heating or driving turbines that help cool the data centre in question.  Nevertheless, what if we could also use some of that heat to generate electricity that could either be sent back to the grid or utilised by the facility?  It is an intriguing question.

Time will tell if Alphabet Energy's new device enjoys commercial success.  If it does, we can see plenty of potential for other applications in the near future.



Friday, 10 October 2014

Australian Government Now Officially Cloud First

Adoption of the cloud within Australia's government agencies has not been as successful as hoped, which has led federal officials to establish a new 'cloud first' policy that goes into effect immediately.  The policy was recently set forth in a document drafted to outline the responsibilities of each agency in adopting and procuring cloud services.

According to news sources, the document reads, in part, “under the Government’s Cloud Policy, agencies now must adopt cloud where it is fit for purpose, provides adequate protection of data, and delivers value for money.”

The new policy essentially dictates that all federal agencies will use cloud services wherever these will reduce costs, increase productivity and protect sensitive data.  The document also outlines benchmarks and procedures that agencies can use to determine whether cloud services are applicable to them or not.  If so, they are expected to solicit bids from a number of contractors previously confirmed by a government panel as being able to provide adequate service.

Government officials hope to save up to 30% per year by adopting cloud services where practical.  Whether that includes IT services and data management functions is not clear.  What is known is that adoption of the cloud will reduce the Australian Government's costs for infrastructure, hardware and software.

Australia's move does not come as a surprise to those who pay attention to cloud computing.  The Government previously commissioned a study to determine how much money could be saved by using the cloud to eliminate duplication and fragmentation while still increasing productivity.  Earlier this year, they also revised a policy that made seeking offshore cloud services difficult.  The new policy language now makes it easier than ever.

Cloud procurement in Australia has been slow since the start of the decade.  With total expenditures of only AUS $4.7 million over the last four years, it is clear that government agencies are reluctant to make the switch.  Now they will have no choice.  As long as cloud services meet qualifications, all government agencies must choose them as the first and best option.

A Better Government


Proponents of the plan say the new cloud-first policy will be of great benefit, making for a government that is leaner, more efficient and spends taxpayer money more wisely.  The biggest concern is coming up with a list of qualified service providers able to meet contracts without artificially inflating prices; however, that is what the Department of Finance's special panel is for.

Those opposed to the plan voiced concerns over data security and price competition.  They are not convinced the Government will save money in the long run, nor do they necessarily believe security in the cloud will be adequate.

As for cloud providers, it is too soon to tell whether the competition for Australia's government agencies will be hot and heavy.  We suspect it will be slow going at first, but more providers will get into the market as more agencies make the transition.  That is usually how it goes.



Tuesday, 7 October 2014

Unused Phone Boxes Getting New Lease of Life

There is only one thing in London that has achieved the same iconic status as the red double-decker bus: the red phone box.  Unfortunately, mobile phones have made London's thousands of red phone boxes obsolete.  So much so that city officials are begging for ideas for re-purposing them.

How about using them as charging stations for the very mobile phones that have rendered them useless?  That's a good idea, according to entrepreneurs Harold Craston and Kirsty Kenny.  Craston and Kenny recently unveiled five new green boxes that can be used to recharge a mobile phone at zero cost.

The new green boxes are fitted with multiple charging stations to handle the most popular phones.  Their power comes from 86cm solar panels fitted on the roofs of the boxes.  The panels can create enough electricity to keep the charging stations working all day, providing a 20% charge in about 10 minutes.
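Those figures pass a rough sanity check.  Here is a sketch with assumed values (Solarbox has not published detailed specifications):

```python
# Rough plausibility check of the "20% charge in about 10 minutes" claim.
# Battery capacity is an assumption for illustration, not a Solarbox figure.
BATTERY_WH = 10.0        # assumed smartphone battery, roughly 10 Wh
CHARGE_FRACTION = 0.20   # 20% top-up, as reported
CHARGE_HOURS = 10 / 60   # 10 minutes

power_per_phone_w = BATTERY_WH * CHARGE_FRACTION / CHARGE_HOURS
print(f"Power delivered per phone: ~{power_per_phone_w:.0f} W")  # ~12 W
```

Delivering on the order of 12W to a phone or two at a time is well within reach of a panel of the size described in decent daylight, so the claim is credible.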

Craston and Kenny are funding their enterprise through adverts.  Each box is fitted with a video screen showing fun and interesting adverts from companies that have signed up.  Among the current advertisers are well-known companies such as Tinder and Uber.  Craston and Kenny say that 30% of the advertising space is set aside for community projects.

In order to deter theft, the screens are reinforced and the phone boxes are locked at night.  They will be open during regular business hours and during the daylight hours on weekends.  Craston and Kenny plan to open an additional five boxes sometime next year.

Award Winning Idea


Craston and Kenny are to be commended for their ingenious idea.  Not only does it meet a growing need among Londoners for mobile phone charging, it also re-purposes old phone boxes and does so in an environmentally responsible way.  The official business, known as Solarbox, won this year's LSE Emerging Entrepreneur of the Year award.  It also took second place in the Mayor of London's Low Carbon Entrepreneur of the Year Award.

Thus far, the green boxes attract roughly six users per hour.  With the ability to charge up to 100 phones per day, Solarbox is a great example of a unique idea that has realised commercial success through the hard work and entrepreneurial spirit of two innovative thinkers.

These are the types of ideas we need in the UK to maintain our position as a leader in technology and green initiatives.  It is such a simple idea, yet quite a powerful one in the grand scheme of things.  Just think about the impact the green phone boxes will have once Solarbox has hundreds of locations open throughout London.  Every mobile phone recharged by solar power is one less phone plugged into a wall socket.  We may reach the point where one of the most exciting events of the day is going out to the street to charge the phone.  Imagine that!

Source:  BBC Technology News – http://www.bbc.com/news/technology-29455992