Thursday, 29 May 2014

Joyent Blames Operator Error for Data Centre Crash

Joyent, the California company specialising in cloud computing and virtualisation, suffered a crash at its US-East-1 data centre earlier this week.  The crash had the company scrambling to get all of its servers and VMs up and running as quickly as possible.  In the aftermath, CTO Bryan Cantrill stated that an operator error “took down the data centre.”

Cantrill did not go on to explain what that meant, except to say that it was an honest admin mistake rather than an actual hardware failure.  Whatever the administrator did caused every server at the data centre to simultaneously reboot.  The number of servers and amount of data involved meant that getting everything up and running again was a slow process that took about an hour to complete.

A system-wide data centre reboot is serious in terms of customer satisfaction, but it is not as bad as simultaneously losing multiple data centres or inadvertently deleting customer data.  Fortunately, Joyent customers will be reimbursed for downtime under their existing service level agreements.

In the meantime, Cantrill says the incident is a great opportunity for the company to learn some important lessons about how its systems work.  The company will use what it learns to improve data centre training and to develop new safeguards against similar incidents in the future.  Cantrill says no data centre jobs are at risk because of the error; the company is not out to punish the administrator, but to learn and get better.

When the data centre first went down, company officials were quick to respond with immediate updates, and they continued posting updates as the recovery process worked its way through each individual node.  Joyent has promised a full post-mortem to determine how a single administrator error could have resulted in a complete data centre shutdown.

The Storm within the Cloud

The nature of cloud computing does not necessarily make a data centre more vulnerable to these sorts of shutdowns than a traditional, non-cloud environment.  The difference lies in how data is stored and accessed.  In a traditional environment, a single customer's data is all contained in one place, so everything starts functioning again as soon as the rebooted server initialises.  Things are different in the cloud environment.

Cloud-based data and applications are split up and distributed across multiple servers simultaneously.  That means every node has to be up and running before a customer's system is fully functional.  This is not a problem on a small scale, but a wide-scale reboot like Joyent's takes much longer to recover from because there are substantially more nodes to deal with.
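To illustrate why recovery drags on, here is a minimal sketch (purely hypothetical node names and counts, not Joyent's actual tooling) of availability being gated on every node reporting ready:

```python
# Hypothetical illustration: a customer's service is only "up" once every
# node that holds a piece of its data reports ready.  One straggler delays
# the whole thing, which is why wide reboots recover slowly.

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    booted: bool

def service_available(nodes: list[Node]) -> bool:
    """The service is functional only if all of its nodes are booted."""
    return all(node.booted for node in nodes)

# Example: 19 of 20 nodes are back, so the customer still sees an outage.
nodes = [Node(f"node-{i}", booted=True) for i in range(19)]
nodes.append(Node("node-19", booted=False))
print(service_available(nodes))  # False until the last node reboots
```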

In the end, Joyent will learn from this and get better.  Like Google, Microsoft and Amazon before it, the company will figure it out and put new controls in place. However, it is only a matter of time before the next 'big one' hits…

Source:  http://www.theregister.co.uk/2014/05/28/joyent_cloud_down/

Tuesday, 27 May 2014

Welcome to the Fog: a New Way of Intranet Computing


For the last five or six years, worldwide networking has been all about the cloud.  The cloud has been marketed as the ultimate networking solution for businesses, government agencies, educational institutions and even individual users.  However, all of this cloud activity has been hampered by the very nature of the technology itself: denser Internet traffic, combined with service providers unable to offer fast enough speeds, has limited cloud adoption.

Welcome to the fog.

Something known as 'fog computing' is the latest form of networking, based on bringing the cloud concept closer to home.  In other words, where a company may currently use cloud sites anywhere in the world to host its IT services, it could enjoy higher speeds and fewer disruptions by bringing those services closer to its actual, physical location.  Though this may sound like a throwback to the old days of mainframe computing, there is something decidedly different about the fog concept.

Rather than going back to mainframe hosting in a central location, the fog concept calls for using the Internet of Things to provide 'mini hosts' within a given environment.  Simply put, every electronic device capable of plugging into a network is also capable of hosting data and applications.  Fog developers envision everything from desktop boxes to high-capacity printers acting as access points for dozens of networked computers.
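As a rough sketch of the idea (purely illustrative; the device names, latencies and hosted items are invented, and this is not any real fog framework), a client might prefer a nearby device that already hosts what it needs and fall back to the cloud only when nothing local does:

```python
# Illustrative only: pick the lowest-latency local "fog" device that hosts
# the requested item; fall back to the remote cloud if nothing nearby has it.

fog_nodes = [
    {"name": "office-router",   "latency_ms": 2, "hosts": {"sales.db", "print-queue"}},
    {"name": "hallway-printer", "latency_ms": 4, "hosts": {"print-queue"}},
    {"name": "desktop-box-12",  "latency_ms": 3, "hosts": {"sales.db", "crm-app"}},
]

def pick_host(item: str) -> str:
    local = [n for n in fog_nodes if item in n["hosts"]]
    if local:
        # Serve from the fastest device on the intranet.
        return min(local, key=lambda n: n["latency_ms"])["name"]
    return "cloud"  # nothing nearby has it, so go out over the Internet

print(pick_host("crm-app"))   # desktop-box-12
print(pick_host("payroll"))   # cloud
```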

The exciting thing about the fog concept is that almost every company and government entity already possesses the infrastructure to make it work.  Take the average high-speed commercial router, for example.  It is not uncommon for a company's 1 Gbps router to be capable of working far faster than Internet servers and providers will allow.  That router's potential is being wasted because the cloud connection is simply not fast enough.

Connect that same router to an Intranet environment involving numerous devices and you have an entirely new paradigm. File transfers are faster, applications are more robust and everything is exponentially more productive.  It all comes down to cutting both the distance data has to travel and the number of users on a given network.
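A quick back-of-the-envelope comparison makes the point; the 10 GB file size and the 50 Mbps Internet uplink below are our illustrative assumptions, not figures from any vendor:

```python
# Back-of-the-envelope transfer times, ignoring protocol overhead.
file_size_bits = 10 * 8 * 10**9   # a 10 GB file expressed in bits
lan_speed = 1 * 10**9             # 1 Gbps gigabit router on the intranet
wan_speed = 50 * 10**6            # assumed 50 Mbps Internet uplink

print(f"LAN: {file_size_bits / lan_speed:.0f} seconds")       # ~80 seconds
print(f"WAN: {file_size_bits / wan_speed / 60:.0f} minutes")  # ~27 minutes
```

Under those assumptions, the same transfer that takes well over twenty minutes across the Internet finishes in about a minute and a half on the local network.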

Bringing the Internet Back Home


Development of fog computing would 'bring the Internet back home', so to speak.  The idea is similar to walking next door to hand your neighbour a package rather than sending it to the Post Office so they can turn around and bring it right back to your neighbourhood.  The fog concept eliminates the external network in applications where it is not necessary.  It is actually a very smart idea.

Those in favour of the fog-computing concept believe it will benefit the entire Internet community, not just those who use it.  We do not disagree.  The more unnecessary traffic we can take off the Internet, the faster and more efficient it will be for everyone.  Here's hoping fog developers can make it work as envisioned; they might be able to eliminate some of the haze of the modern cloud.



Monday, 19 May 2014

Raritan’s Dominion KX III KVM-Over-IP vs Its KX II Predecessor

AIT Partnership Group, a leading Raritan reseller in Europe and a trusted advisor on Data Center Management, Secure Network and Broadcast & Control Room Solutions, featured Raritan’s new Dominion KX III KVM-over-IP switch on its Data Centre Blog in April.

The post describes the many advancements of the next-generation KX III, highlighting its blazing video performance in a side-by-side comparison with its predecessor.  Capable of remote IP video performance of up to 30 frames per second at 1920x1080 HD resolution, the KX III is perfect for both traditional IT applications and dynamic, audio- and video-based applications.

Try the Dominion KX III from your computer, right now! Take a test drive.

Thursday, 15 May 2014

Battle Brewing in UK over Solar Funding

A battle is brewing between the government and the Solar Trade Association (STA) over the future of large-scale solar projects in the UK.  At the core of the dispute is a recent government decision to reduce some of the funding for large-scale projects in favour of diverting the money to medium-scale and rooftop projects.  If the consultation published earlier this week becomes reality, the changes will take effect from 1 April next year.

The proposal from the Department of Energy and Climate Change (DECC) calls for restricting access to Renewables Obligation support for any new solar PV project of more than 5 MW.  The DECC believes this step is necessary if it is to support medium-scale and rooftop projects without adding significant cost to consumer utility bills.  The STA vehemently disagrees.

STA officials are on record as saying the government is using financial management as an excuse to “define the energy mix” in the UK.  They believe the price of solar power has fallen far enough that the government is spending much less on the development of new projects than it had originally anticipated.  In the STA's view, there is no reason to begin limiting funding now.

In order to limit that funding while still supporting smaller-scale solar development, the DECC proposal would also separate the current feed-in tariff (FIT) for installations over 50 kW into two categories:  standalone and non-standalone projects.  This would enable the government to make more FIT funds available for rooftop solar installations.  The hope is that the increase in funding will encourage both commercial and residential property owners to consider designing and building rooftop solar PV systems to supplement grid consumption.

Picking Winners or Responsible Management?

On its face, the dispute between the DECC and the STA seems to have merit on both sides.  The government does have a responsibility to make sure all funding for the solar industry is used responsibly and in a way that provides enough stability to help the industry continue moving forward.  On the other hand, the STA is legitimately concerned that a lack of future funding for large-scale solar projects could dampen industry growth that has, thus far, been very encouraging throughout Britain.

So what's the answer?  That really depends on your view of the type of relationship government and business should have.  What is clear is that there are no easy answers in the current dispute.  One way or another, the different players are going to have to work things out in a way that benefits everyone involved – including UK consumers, who ultimately pay for the power produced through renewable sources.

Whether it is wind, solar, wave or biomass power generation, the ultimate goal is to provide a service that consumers can afford.  That might require the government to rethink the way it spends the money it has allocated to renewable energy development. For better or worse, the government is not a bottomless pit of money.



Tuesday, 13 May 2014

Toyota Adapts Hydrogen Fuel Cell for Stationary Power

Toyota wowed the automotive world last year when it introduced the first production-level hydrogen fuel cell electric vehicle at the Tokyo Motor Show.  Since the unveiling, the Japanese automaker has been preparing Western markets to start taking shipments of the revolutionary vehicle next year.  In the meantime, Toyota is not sitting around waiting to see whether the car will be a commercial success: the company has taken its hydrogen fuel cell technology to the next level by adapting it for stationary use.

Out in Los Angeles, Toyota operates a six-building campus hosting a number of important sales operations for the North American market.  The Southern California climate means the facility experiences extreme power demand during the peak summer months.  To compensate, the company built a 1.1 MW hydrogen fuel cell and installed it on the campus.  The result is a power cell that can produce up to half the power needed by the six buildings on any given day.  Toyota says that is enough energy to power 765 American homes.
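A quick sanity check of that figure is easy to run; the roughly 30 kWh per day of typical US household consumption used for comparison is our assumption, not a figure from Toyota:

```python
# Rough check of the "765 homes" claim.  The ~30 kWh/day typical US
# household figure used for comparison is our assumption, not Toyota's.
cell_output_kw = 1100                    # the 1.1 MW fuel cell
daily_output_kwh = cell_output_kw * 24   # ~26,400 kWh if run all day
per_home_kwh = daily_output_kwh / 765
print(round(per_home_kwh, 1))            # ~34.5 kWh per home per day
```

At roughly 34.5 kWh per home per day, a little above typical US household consumption, the claim looks plausible.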

The strength of the system is how easily it can be switched on and off to compensate for heavy loads, and it is an easy next step to automate it so that no manual control is required.  When the load reaches a certain level, the power cell automatically kicks in to balance things out; when consumption dips back below that level, the system shuts off.
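The behaviour described is essentially threshold-based control.  Here is a minimal sketch of how such automation might work; the 800 kW switch-on threshold is an invented example, not a figure from Toyota:

```python
# Illustrative threshold control: the fuel cell tops up supply whenever
# campus demand exceeds a chosen peak-load threshold.
PEAK_THRESHOLD_KW = 800        # invented example threshold
FUEL_CELL_CAPACITY_KW = 1100   # the 1.1 MW cell described in the article

def fuel_cell_output(demand_kw: float) -> float:
    """Return how much the fuel cell should supply for a given demand."""
    if demand_kw <= PEAK_THRESHOLD_KW:
        return 0.0  # below peak load: the cell stays off
    # Above the threshold, cover the excess up to the cell's capacity.
    return min(demand_kw - PEAK_THRESHOLD_KW, FUEL_CELL_CAPACITY_KW)

for demand in (600, 900, 2100):
    print(demand, "kW demand ->", fuel_cell_output(demand), "kW from the cell")
```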

Toyota has proven, at least in concept, that their hydrogen fuel cell has wider applications than just electric cars.  The company is now many strides ahead of the competition in developing reliable and powerful sources of electricity that do not use fossil fuels.  It will be exciting to see where this technology goes in the near future.

Greener by the Day

One of the criticisms levelled against renewable energy sources is that none of them can match the raw power of fossil fuels.  It is a valid criticism; no wind farm yet built can compete with a traditional coal- or oil-fired power plant.  However, what Toyota has shown us is that multiple technologies can be combined to overcome the need for fossil fuels.  Indeed, we are becoming greener by the day.

Let's assume Toyota could combine solar thermal, wind, biomass and hydrogen fuel-cell technology to meet the power needs of a brand-new campus.  Making it work would mean the campus buildings could be powered entirely without a grid connection.  It could be that the fuel-cell technology is the missing piece that makes this all possible.

With the proper combination of sustainable energy sources, it is entirely possible that we are only a few years from powering an advanced data centre without the need for any external sources.  On a larger scale, we could adapt the technologies to power the homes and businesses of entire cities.  The possibilities are enough to let the imagination run wild.


Friday, 9 May 2014

Paper Batteries: The Next Generation of Mobile Power?

Imagine replacing the battery in your mobile phone with a small strip no thicker than the average piece of paper.  Over the years, we have heard many ideas about how to create powerful batteries with ultra-small footprints; too many ideas, in fact.  Now it appears that we are on the verge of seeing one of these unique ideas come to fruition.

EDN Network's Steve Taranovich recently reported on an American company developing an ultra-thin battery composed of electrodes, current collectors and an electrolyte – all housed in a thin strip of paper.  The company, known as the Paper Battery Company, has already designed and built a prototype that works as advertised.  The prototype is a long way from commercial production at this point, but the company has proven the concept is workable.

For any battery to work properly, it needs three things: a storage medium, an electrolyte to carry ions through the system, and electrodes (anode and cathode) to complete the electrical circuit.  Limited materials and technology meant that the earliest batteries were extremely large relative to their intended uses.  As technology has advanced, all of these elements have become smaller and smaller.

In the case of the paper battery, the developers managed to use a specific type of paper that also acts as the electrolyte, which is significant in terms of reducing the overall size.  The electrodes are made of carbon nanotubes or graphene and are kept separated from one another by the paper.  Current collection is accomplished with something as simple as aluminium foil.

The advantages of this type of battery are numerous.  First of all, a paper battery would be extremely simple to manufacture in large rolls that could be cut to size for specific uses.  This leads to the second obvious benefit: paper batteries would be extremely cost-effective.  Finally, they would be environmentally friendly in terms of both production and disposal, since paper batteries would not contain the heavy metals found in most modern batteries.

Not New, But Still Exciting


Taranovich explained in his piece that what the Paper Battery Company has accomplished is not necessarily new, although it is exciting.  Similar projects have been undertaken at Purdue University in the US and at the New University of Lisbon in Portugal.  In both cases, the designs were slightly different takes on the same principle:  using specially produced paper as both the battery housing and the electrolyte.

We will be keeping our eyes out for a detailed publication explaining what the future might hold for paper batteries.  In the meantime, just thinking about the possibilities is pretty exciting.  Imagine what a paper battery could do for mobile communications by reducing the amount of space a mobile device needs for power storage.  It could make for larger processors, larger video cards, or even devices that are smaller overall.

Indeed, it's great to be part of the world of modern technology.  We cannot wait for the future to get here.



Thursday, 1 May 2014

Microsoft – US Court Begin Sorting out Search and Seizure Authority

A US magistrate judge named James C Francis issued a ruling last December (2013) compelling Microsoft to turn over stored e-mail data requested under a subpoena from the United States government.  The subpoena was part of ongoing government surveillance carried out by the US National Security Agency (NSA) under the supposed statutory authority of the Stored Communications Act (SCA), which became law in 1986.

Microsoft disputed the legal authority of the subpoena, filing a motion to challenge it in court.  The company recently lost that challenge before the same judge who issued the original order.  According to Francis, the US government does have the authority to subpoena stored computer data regardless of where it is physically held. In this case, the data is on a server in Ireland.

Despite Judge Francis' ruling, Microsoft has no intention of turning over the requested data.  Its legal team has maintained all along that it expected such an outcome.  It also plans to take its case all the way through a federal appeals court and, if necessary, up to the US Supreme Court.  The company believes that, just as the US government does not have the authority to search a private home located in another country, it also has no right to search the contents of a data centre server located in another country.

The government's position rests on the fact that Microsoft is a US-based company with servers located within US borders.  Furthermore, some of the information regarding the e-mail account in question is also stored on those US servers.  The government believes these circumstances make the e-mail account a US-based account subject to US search and seizure laws.

Regardless of your position, rest assured this will not be the last decision on the issue.  The age of cloud computing has made real what many privacy experts were warning about years ago: national borders blurred by cloud data scattered across sites around the world.

Troubling Trends


Edward Snowden's whistle-blowing efforts last year blew the lid off the previously secret operations of the US NSA.  For that much, the world probably owes him at least some measure of respect and gratitude.  However, we have now been made aware of some troubling trends in how world governments treat what is supposed to be private information.  If Microsoft ultimately fails in its motion to block the federal subpoena, what will be next?

Will the US government begin subpoenaing data centre reports detailing every bit and byte transferred to and from its servers on a daily basis?  Will it start combing through private communications between family members, private financial records including sensitive data like bank account details, or any other electronic record it believes is vital to the security cause?  It is not beyond imagination, that's for sure.


For the sake of Internet security and freedom, let's hope US courts step up and put an end to the over-reaching arm of government.