Monday, 15 December 2014

Norwegian University Heating Classrooms via Data Centre

Capturing data centre heat for other purposes is not a new idea; however, the practical difficulties of making it work efficiently have prevented the concept from being adopted on a large scale.  That may change in the future, thanks to the example being set by a Norwegian university now heating classrooms via its data centre.

Norway's Arctic University recently implemented a server cooling system that allows it to use the waste heat generated by one of its data centres to heat classrooms.  The Tromsø campus depends on generated heat year round, owing to its location at 70° north.  Using the waste heat from the data centre keeps the campus warm while reducing heating costs and the university's carbon footprint.

The new system is liquid cooled, utilising two loops and a heat exchanger.  One loop carries heat away from the servers while the other transports that heat across campus to provide space heating in classrooms.  The heat exchanger transfers heat from one loop to the other.
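The article does not give flow rates or temperatures, but a minimal sketch of the energy balance shows how much heat a loop like this can move.  Every figure below is an illustrative assumption, not data from the Tromsø installation.

```python
# Rough energy balance for the server-side loop of a two-loop system.
# All figures here are illustrative assumptions, not data from the
# Tromso installation.

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K)
WATER_DENSITY = 997.0         # kg/m^3


def recovered_heat_kw(flow_l_per_min: float,
                      hot_temp_c: float,
                      cool_temp_c: float) -> float:
    """Heat the loop carries to the exchanger: Q = m_dot * c_p * delta_T."""
    mass_flow = flow_l_per_min / 60.0 / 1000.0 * WATER_DENSITY  # kg/s
    return mass_flow * WATER_SPECIFIC_HEAT * (hot_temp_c - cool_temp_c) / 1000.0


# Example: 200 litres/min leaving the racks at 45 C and returning at 30 C
# after handing its heat to the campus heating loop.
print(f"~{recovered_heat_kw(200, 45, 30):.0f} kW available for space heating")
```

Even at these modest assumed figures, the loop delivers roughly 200 kW of usable heat – energy that would otherwise be vented to the Arctic air.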

The most important aspect of the cooling system is that liquid is moved around hot processors in sealed copper tubing that remains in place even when server trays are added or removed.  This provides continuous cooling, allowing data centre staff to change server configurations at will without having to shut down the entire system.

According to university officials, the liquid cooling system was chosen over air cooling because liquid is far more effective at transporting heat.  Some of the university's data centre facilities currently use a combination of air-cooled and liquid-cooled systems, but it intends to eventually convert everything to liquid.  It hopes the completed project will provide all of the space heating needed across the entire university campus.
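The underlying point is easy to check with standard textbook property values (my figures, not the university's): per unit volume, water absorbs several thousand times more heat than air for the same temperature rise.

```python
# Volumetric heat capacity of water vs air, using approximate textbook
# values at room temperature (not figures from the university).
water = 997.0 * 4186.0   # density (kg/m^3) * specific heat (J/(kg*K))
air = 1.2 * 1005.0       # density (kg/m^3) * specific heat (J/(kg*K))

print(f"Water carries ~{water / air:,.0f}x more heat per unit volume "
      "for the same temperature rise.")
# -> roughly 3,500x, which is why liquid loops move data centre heat
#    far more effectively than air ever could.
```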

Efficient Cooling for Supercomputing

At first glance, what is happening at Arctic University may seem like no big deal.  Step back, however, and consider that the university is engaged in routine activities that are, by most standards, considered supercomputing.  The load on its servers produces an intense amount of heat that, if left unharnessed, goes entirely to waste.  By being willing to design and build systems to capture and use that heat, the university is setting an example that others can follow.

There is no denying that today's IT services, online applications and on-demand Internet are pushing everyone closer to supercomputing as the routine standard.  Once that becomes reality, the power and cooling demands of the average data centre will rise accordingly.  So now is the time to get busy working on ways to harness data centre heat so that it can be used for other purposes.

Whether those other purposes include municipal heating or not, the amount of heat generated by supercomputing processes is too valuable to let go to waste.  Harnessing it will go a long way toward achieving future energy goals.



Wednesday, 10 December 2014

Ofcom: Broadband Service Not What Government Wants It to Be

A new report from Ofcom clearly shows that broadband service in Great Britain is not exactly where the Government wants it to be.  Although the UK is a clear leader in Europe for providing affordable broadband services to residents, there are still problems within the system that have to be overcome, the regulator's report says.  For example, there is still too big a gap between the fastest and slowest Internet speeds available to consumers.

According to the report, a speed of 10Mbps is the standard requirement for a typical household.  Unfortunately, up to 15% of UK households do not have access to speeds that high, and a further 3% do not even have access to speeds of 2Mbps.  The regulator also says that 18% of British households have no Internet access at all.

The Ofcom report goes on to say that things are improving, as evidenced by the fact that the average download speed for residential consumers is now 23Mbps.  In addition, approximately 75% of UK households have access to superfast broadband, defined as speeds of 30Mbps or more, although only 21% take advantage of it.  The Government hopes to have superfast broadband readily available to 95% of the public by the end of 2017.

Ofcom also points out that commercial and residential services in rural areas are still significantly lacking.  This is where the large gap in download speeds comes into play.  While some of the luckiest customers in urban areas can receive speeds of up to 350Mbps, there are rural customers slogging along at 0.1Mbps.  The problem, Ofcom says, is the expense of running fibre networks out to rural areas.

Having said that, the Government says solutions are being developed to fill the infrastructure holes.  It believes new technologies may make it possible to increase speeds in rural areas without such a large investment.  Let's hope it's right – especially if it truly hopes to reach its 95% superfast broadband goal.

What It All Means

It is great that Ofcom analysed all the data and presented this report; however, what does it mean for consumers and for companies involved in Internet-based businesses?  It is likely to mean that we are nearing the end of the landline in Great Britain.  Landlines have not been necessary for telephone communications for years; the only thing that has kept them around is the need for dial-up Internet access among those who want it.  Yet dial-up Internet is a dinosaur that is now almost completely extinct.

Along those same lines, expanding data communications networks now make it possible for consumers to do all of their work online via a smartphone.  There is no incentive any more for people or businesses to continue paying for a landline when they can receive calls and access the Internet without it.

As we watch the landline fade off into obscurity, all eyes will be on the broadband industry and Government expansion goals.  By this time next year, we should know how close the Government is to providing superfast broadband to all…

Source:  BBC Technology News – http://www.bbc.com/news/technology-30375854


Thursday, 4 December 2014

The freedom of IT movement

Firms are increasingly relying on IT to deliver business advantages, and the number of data streams that have to be stored, managed and analysed is expanding.  This puts CIOs in a tough position.

Forces such as cloud computing, data centre integration and security all have to be addressed, yet many IT leaders are spending vast amounts of money and time just maintaining their existing infrastructure.  This leaves little resource for innovation, in spite of increasing pressure from business leaders.  CIOs need to break out of this cycle and find new ways to get creative with technology, while making sure the lights stay on.

Outsourcing and creating a hybrid IT infrastructure is fast becoming the ‘go-to’ solution to this problem.  A CenturyLink-commissioned survey of 550 global IT leaders revealed that outsourcing day-to-day routines results in savings of up to 11 percent in IT budgets.  In addition, outsourcing produces a higher rate of revenue growth for companies.

Companies that outsource expect to increase their investments in outsourcing by 19 percent within the next two years, according to the study.

For companies looking to take advantage of the cost benefits and expertise of wholesale outsourcing without completely relinquishing control, colocation is a viable option.  This model allows companies to reap the benefits of a large-scale, knowledge-driven data centre operation without the resource investments and costs of managing the infrastructure themselves.  Companies house their servers or devices in a third-party data centre, assured of the appropriate bandwidth, security, power and cooling.

Flexibility is the key benefit of outsourcing to a third-party data centre.  Companies have the option to scale their IT infrastructure up and down depending on business need, with minimal effort on their part, leveraging their provider’s geographic reach, economies of scale and technological reliability.

Another important benefit that companies can reap (in addition to facilitating innovation) is in the area of disaster recovery. With so much of modern companies’ business tied up in IT infrastructure, protecting assets from a potential disaster has become of paramount concern. Data loss prevention is a core focus of colocation providers, and sites are designed specifically to protect against data loss. Strategies to enhance back-up and support a business in the event of a total data failure are at the forefront of many colocation providers’ services.

In its paper "Converging the Datacentre Infrastructure: Why, How, So What", IDC reinforces the theory that company performance is tied to aligning internal IT resources towards innovation, without compromising on the expertise needed to manage day-to-day maintenance and management tasks.  The analyst firm reported that by outsourcing one-third of infrastructure and related routine administrative tasks, CIOs are able to double the time spent on implementing innovative products and offerings.

Ultimately, to remain competitive and keep an edge over the rest, CIOs have to be creative with the resources they have and do more with less.  Extreme pressure from data floods and ever-changing business demands has reinforced the importance of forward-thinking, resilient infrastructure.  Outsourcing, in the form of colocation, is becoming the strategy of choice simply because it results in more efficient and profitable company performance, while giving companies the best of both worlds – service expertise and the freedom to innovate.

Guest blog by Mike Bennett, VP Global Data Centre Acquisition and Expansion at CenturyLink



Tuesday, 2 December 2014

Bitcasa’s unlimited storage “a wildly money-losing proposition”

When Bitcasa opened for business in 2011, it pursued a business model that aimed to eventually replace the hard drive with unlimited storage on a Bitcasa cloud server.  The company fully intended to be a serious competitor to Amazon and others offering inexpensive storage capacity for cloud computing.  However, that same business model could end up being the California company's undoing.

In October 2014, Bitcasa announced an end to its popular Infinite Drive platform of unlimited storage for $999 annually.  It gave customers just three weeks to migrate their data to one of its new fee-based options or take their business elsewhere.  Those who did not act were told they risked losing all of their data.  This did not sit well with some customers, and one client filed a lawsuit as a result.

Although Bitcasa won the court battle, the requirements placed upon it by the court will likely end up forcing the company into bankruptcy, according to industry speculation.  Bitcasa simply cannot afford to subsidise its largest data users without a sound business model to attract more paying customers.  Gigaom says Bitcasa’s largest client was costing it $3,000–$4,000 per month by using more than 80TB of storage.
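The arithmetic behind the "wildly money-losing" headline is simple to work out from the figures quoted above – $999 a year in revenue against $3,000–$4,000 a month in hosting costs for that heaviest user:

```python
# Back-of-the-envelope economics for Bitcasa's heaviest Infinite Drive user,
# based on the figures reported by Gigaom.
annual_revenue = 999                              # USD/year for unlimited storage
monthly_cost_low, monthly_cost_high = 3000, 4000  # USD/month to host 80TB+

loss_low = monthly_cost_low * 12 - annual_revenue    # 35,001 USD/year
loss_high = monthly_cost_high * 12 - annual_revenue  # 47,001 USD/year

print(f"Annual loss on one heavy user: ${loss_low:,} to ${loss_high:,}")
# Costs run roughly 36-48x the subscription price before any other overheads.
```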

“It’s not fun to stare at your earliest and largest users in the eye and say ‘we just can’t do it anymore,’” said Bitcasa CEO Brian Taptich.  “It’s a terrible feeling.  You wish you could subsidise those [customers] forever.”

Bitcasa ran into a serious problem once it decided things needed to change.  Taptich said that the company had no way of knowing what customers were storing and, worse yet, how much of the data it was hosting had been orphaned by clients.  The only way to clean up the environment and implement better management practices was to force customers to migrate to a fee-based plan.

The Next Step

The next step for Bitcasa is to get its financial house in order so the company can be saved.  It has had very patient and supportive investors thus far, but one must wonder whether those investors will stick around in light of the change in plans.  If they do, Bitcasa could emerge a stronger company for it.  If not, its clients would have needed to migrate eventually anyway, so better to do it now.

Looking at the bigger picture, the data centre industry is quickly approaching the day when storage capacity may become a much more serious issue.  Big Data has resulted in companies saving every bit and byte in the hope that it will someday be usable for some sort of analysis.  In a sense, we have become data hoarders.

Business models will have to change and adapt as the total amount of stored data grows ever larger.  The only question is how and when that transformation will take place.  Perhaps Bitcasa's troubles are the start of a complete data storage revolution and something good may come from it.