Thursday, 11 August 2016
The second-largest airline in the US is still struggling to regain normal operations after a data centre failure that grounded hundreds of flights and stranded thousands of passengers worldwide. At around 2.30am EDT on Monday 8 August, Delta staff in Atlanta found themselves unable to access computer networks for reasons unknown. Operations around the country and, eventually, the world soon suffered the same fate.
The US-based company, which is part of the SkyTeam alliance that also includes Air France-KLM, has not offered any concrete answers about what caused the problem. In the days following the outage, it has struggled to get its computer systems back online and all the data synced across its worldwide network. The airline says it is doing everything it can to return service to normal.
Initial reports suggest that Delta technicians were running a routine test of backup power procedures when a piece of equipment was inadvertently tripped. That failure apparently cut the airline's computers off from both Georgia Power's supply and Delta's own reserve generators. With no power, the system shut down.
However, another rumour has emerged suggesting a fire might have taken out the airline's main data centre in Atlanta. Some sources say that as technicians were attempting to switch computer networks to a backup generator, a fire broke out, destroying two generators in the process. In either case, Delta's computer networks went down due to a data centre failure related to a lack of power.
As of Wednesday 10 August, things were still not back to normal. A few thousand of Delta's flights were back on schedule, but airport information boards were not necessarily accurate, and arrival and departure information on the company's website could not be entirely trusted either. Delta Air Lines continues to investigate what went wrong.
Delta Air Lines is sure to take a PR beating as a result of its data centre failure. And although new strategies will be put in place to prevent future outages, the company's networks were, as far as we know, already operating up to standard. Its data centre had backup power in place for redundancy, just as would be expected, but a perfect storm occurred in just the right way to cause a big problem.
The lesson to be learned here is that no network is invulnerable. No matter how much technology we put in place, no matter how much redundancy we include, computer networks will always be at risk of failure. It is something we have to learn to live with. That does not help the thousands of Delta passengers stranded around the world, but it is the reality in which we live. Computer networks are not perfect.
Hopefully, Delta will be more forthcoming in due course about what caused the failure. Its willingness to share information would help others avoid similar problems.
Tuesday, 19 July 2016
We have smartphones, smart cars, and smart homes filled with dozens of smart devices. So, are you ready for “smart cities”? They may once have been fanciful notions reserved for futurists and dreamers, but smart cities are now here. They are beginning to emerge thanks to billions of devices across the globe able to communicate via the internet. And yes, data centres are playing a big part.
The data centre of the future is likely to be the bedrock of the smart city for obvious reasons. But, before we get to discussing what that might look like, let us first consider where we are right now. ITProPortal's Laurence James recently wrote a very timely blog post in which he cited data suggesting that upwards of 1.6 billion devices will be connected to smart city infrastructure before 2016 is out. He mentions things such as smart transport, traffic management systems via connected cars and even the local rubbish bin that is capable of sending a message that it needs to be emptied.
James used the 2012 Olympics in London as an example of how smart cities are already working. Officials at TfL had to put a system in place to manage traffic that could support up to 18 million journeys per day. The system they settled on used data analytics to predict traffic patterns so that trains, buses and other options could move through London as efficiently as possible.
Data Centres at the Heart of Smart
At the heart of smart is the data centre. But here's the thing: in order to make smart cities a reality, we are going to need a lot more local data centres that are capable of processing tremendous volumes of data extremely quickly. Relying on regional data centres will simply not be enough.
This presents a problem, especially in an era when we are trying to reduce our carbon footprint and consume less energy. As we already know, data centres are hungry consumers of power. We need to find a way to reduce power consumption if we are going to build enough data centres to support smart cities without completely obliterating our energy goals. The solution appears to be the solid-state drive (SSD).
In his post, James explains that experts predict mechanical hard drives will be capable of supporting 40 TB of data by 2020. As tremendous as that number is, it is insufficient. The good news is that SSDs should be able to support 128 TB at 10% of the power and 6% of the volume required by mechanical hard drives. In other words, SSDs can handle more data at faster speeds, with less power and a smaller footprint.
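To put those figures in perspective, here is a rough back-of-the-envelope sketch. The 40 TB, 128 TB, 10% power and 6% volume ratios come from the figures cited above; the 10 PB target is purely an illustrative assumption.

```python
# Rough comparison: how many drives a hypothetical 10 PB local data centre
# would need, using the capacity, power and volume ratios cited in the post.
# The 10 PB target is an illustrative assumption, not a real facility's figure.

PETABYTE_TB = 1024                    # 1 PB = 1024 TB
target_tb = 10 * PETABYTE_TB          # hypothetical 10 PB requirement

hdd_capacity_tb = 40                  # projected mechanical HDD capacity by 2020
ssd_capacity_tb = 128                 # projected SSD capacity

hdd_drives = -(-target_tb // hdd_capacity_tb)   # ceiling division
ssd_drives = -(-target_tb // ssd_capacity_tb)

# Per the cited figures, each SSD needs ~10% of the power and ~6% of the
# volume of a mechanical drive, and the array needs far fewer drives too.
relative_power = (ssd_drives / hdd_drives) * 0.10
relative_volume = (ssd_drives / hdd_drives) * 0.06

print(f"HDDs needed: {hdd_drives}, SSDs needed: {ssd_drives}")
print(f"SSD array power vs HDD array: {relative_power:.1%}")
print(f"SSD array volume vs HDD array: {relative_volume:.1%}")
```

Because the per-drive savings compound with the smaller drive count, the whole SSD array ends up at only a few percent of the equivalent HDD array's power and volume, which is exactly why flash looks so attractive for dense local facilities.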
Smart cities are here now. In the future, they will be driven by local data centres that rely on SSDs to handle the massive data flow. Who knew the technology behind the flash drive in your pocket would be so integral to powering the future?
Wednesday, 13 July 2016
Most of us are fully aware that the UK is a world leader in clean energy, particularly in the area of solar. So it should be no surprise that a new analysis from the Solar Trade Association (STA) reveals that producers have hit the latest milestone in solar energy production, generating nearly 24% of total energy demand during the afternoon of 5 June 2016.
According to the STA, the UK is now home to almost 12 GW of solar power capacity that, at peak generation, can produce up to 25% of the nation's total energy needs. The STA is firmly behind solar as the best way to provide clean energy and reduce dependence on fossil fuels. Chief executive Paul Barwell was quoted by E&T magazine as saying, "This is what the country and the world needs to decarbonise the energy sector at the lowest price to the consumer."
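A quick sanity check on those figures: if roughly 12 GW of solar capacity can meet up to 25% of national demand at peak generation, the implied total demand at that moment falls out of simple division (both inputs are the STA figures quoted above).

```python
# Sanity check on the STA figures quoted above.
solar_capacity_gw = 12      # approximate UK solar capacity per the STA
peak_share = 0.25           # up to 25% of national demand at peak generation

# If 12 GW covers 25% of demand, total demand at that moment is implied to be:
implied_demand_gw = solar_capacity_gw / peak_share
print(f"Implied national demand at peak solar output: ~{implied_demand_gw:.0f} GW")
```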
Solar Farms and Rooftop Installations
The popularity of solar power in the UK is evident in the rapid uptake of both solar farms and rooftop installations. According to E&T magazine, one rooftop installation in Telford consists of 14,000 solar panels on top of a commercial building operated by Lyreco. The magazine goes on to say that all of the clean energy sources currently in use in the UK combine to provide more than 25% of the UK's total power generation.
Across the UK, more and more homes are being fitted with solar panels for two purposes: photovoltaic (PV) systems to generate electricity, and solar thermal systems for hot water and space heating. Commercial and industrial enterprises are also embracing solar for space heat, process heat and hot water.
The STA says that all the solar industry needs at this point is one more "push from the government" to reach its goal of being subsidy-free sometime early in the next decade. The government seems to be on board, for now.
Solar for Data Centre Requirements
We are thrilled that solar and other clean energy sources are doing so well, and to see UK solar capacity reach this most recent milestone is certainly encouraging. It leads us to wonder whether we will ever see a viable solar application for powering data centres. Finding some sort of renewable solution is critical given that data centres are among the most prolific power consumers in the world. Getting data centres off fossil fuels would have a tremendous impact on meeting clean energy goals.
Solar is not adequate for data centre needs in its present form. But we can envision a day when highly efficient solar thermal systems with sufficient storage capacity could meet a data centre's power requirements around the clock. A development like that would be exciting indeed, and one all of us in the data centre industry would be delighted to see.
Source: E&T – http://eandt.theiet.org/news/2016/jul/solar-power-uk-high.cfm
Thursday, 30 June 2016
The Brexit votes had barely been tallied and made official when opponents of the outcome established an online petition calling for a second vote. That much was expected in the days and weeks leading up to the vote, given that polling showed things to be extremely close. What was not expected was an almost ridiculous lack of security that has allowed the petition to be tainted by auto bots.
According to the BBC, the House of Commons petitions committee has said it has already removed 77,000 invalid signatures coming from people allegedly living in Antarctica, North Korea, the Sandwich Islands and even the Vatican. Although officials say that most of the remaining signatures now appear to be from people living in the UK, there is no way to know how many of those signatures were added legitimately as opposed to being placed on the petition through auto bots.
An Appalling Lack of Security
The re-vote petition is already the most active petition ever placed on the Parliamentary website. The BBC says it currently has 3.6 million signatures. However, one computer security expert told the BBC that any site like the House of Commons petition site needs to have security measures in place to defeat intrusions. We clearly agree.
What's most appalling about the lack of security in this case is the fact that stopping auto bots is relatively simple. It's not as if we are talking about encrypted malware or tough-to-detect rootkits that go to the heart of computer networking systems. Auto bots are nothing more than computer scripts that log onto a website and submit or retrieve data without any human intervention. They can be stopped with something as simple as a captcha script.
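To illustrate just how little it takes to blunt a naive auto bot, here is a minimal sketch of a server-side submission rate limiter. This is a hypothetical example for illustration only, not the petition site's actual code; a real deployment would pair it with a CAPTCHA, since a bot spread across many IP addresses would sail past per-IP limits alone.

```python
# Minimal sketch of a per-IP rate limiter a petition site could use to blunt
# naive auto bots. Hypothetical illustration, not the actual site's code.
import time
from collections import defaultdict

class SubmissionRateLimiter:
    """Allow at most `max_submissions` per client IP in a sliding time window."""

    def __init__(self, max_submissions=3, window_seconds=3600):
        self.max_submissions = max_submissions
        self.window_seconds = window_seconds
        self._log = defaultdict(list)  # ip -> timestamps of recent submissions

    def allow(self, client_ip, now=None):
        now = time.time() if now is None else now
        window_start = now - self.window_seconds
        # Drop timestamps that have aged out of the sliding window.
        self._log[client_ip] = [t for t in self._log[client_ip] if t > window_start]
        if len(self._log[client_ip]) >= self.max_submissions:
            return False  # looks like a script hammering the endpoint: reject
        self._log[client_ip].append(now)
        return True

# Five rapid submissions from the same address: only the first three get through.
limiter = SubmissionRateLimiter(max_submissions=3, window_seconds=3600)
results = [limiter.allow("198.51.100.7", now=1000 + i) for i in range(5)]
print(results)  # [True, True, True, False, False]
```

A check this crude would already have stopped a single machine adding 33,000 signatures, which is precisely the point: the defences that were missing are not exotic.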
Because whoever designed the petition site was so careless, there is no way of knowing how many of the signatures on the petition calling for a second EU vote are legitimate. But it goes beyond just this petition. How many other petitions have been affected by the site's lack of security?
The BBC references users of the 4chan message board as being among the primary attackers of the re-vote petition. According to its report, one of the message board's members claims to have signed the petition some 33,000 times simply by running an auto bot.
Things Must Change Now
For the record, the House of Commons petitions committee says it will continue to monitor the situation for any additional evidence of auto bot activity. Meanwhile, Prime Minister David Cameron has said there will be no second vote, regardless of the petition and its signatures.
That's all well and good, but something must be done to improve the security of the petition site now. If we cannot trust something as simple as an online petition to be secure, we are left to wonder how many other government websites are equally vulnerable. Shame on the House of Commons and their web developer for such a stunning lack of security.
Source: BBC – http://www.bbc.com/news/technology-36640459
Tuesday, 14 June 2016
We all know Apple as a maker of computers, smartphones, tablets and wearables. Now it appears that the California company is getting into the renewable energy business thanks to a deal signed with a US landfill to utilise methane gas. This could be a precursor to other similar projects around the country…
According to various news sources, Apple has reached an agreement with Catawba County in North Carolina, one of the southern states along the US East Coast. Catawba County will lease 3.7 acres to Apple for 16 years. At the conclusion of the lease, Apple will have an opportunity to vacate the premises or sign for an additional five years.
Apple has not detailed what it plans to do with the renewable energy that it creates at the Blackburn Resource Recovery Facility. It could be used to generate green electricity or be sold as-is to customers who need gas fuel.
How It Works
Landfills in the US typically deal with the methane produced via waste decomposition by simply venting it into the air. But a growing number of operators are now installing energy plants to trap the methane gas, process it and then use it for other purposes. This is exactly what Apple will be doing.
Catawba County plans to harness 40% of the methane produced by the landfill and sell it to Quadrogen Power Systems for treatment and processing. They will then pass the processed gas along to Apple. The remaining 60% of the methane will be used by the county to supply some of its energy requirements.
Speculation abounds that Apple will use the methane gas to produce electricity for a data centre it also operates in the county, but that remains to be seen. Such a use would make complete sense given Apple's commitment to eventually powering as many of its facilities as possible with renewable sources, though the amount of energy the company can realistically extract from the county landfill may not count for much in the grand scheme of things.
Another Piece of the Puzzle
How much power Apple actually generates from the new deal is less important than the fact that its plans represent yet another piece of the puzzle. As the world's data centre needs expand, the amount of energy consumed by bigger and more robust facilities will only increase. We have to find ways to power the data centres of the future without relying on fossil fuels. That may mean a combination of renewable sources including sun, wind, water and biomass.
Harnessing methane is a particularly exciting prospect because we are already producing the gas anyway. Just by burying our rubbish and letting it decompose, we are creating a gas that can be harnessed for multiple purposes. Indeed, methane is one of the greenest biomass energy sources available to us. Apple's decision takes the company one step closer to eventually using only renewables.
Monday, 6 June 2016
Given that a large percentage of the power used to run the average data centre is directly related to cooling, builders and designers do their best to locate new facilities in locales with cooler climates and lower humidity. The idea is to save money by reducing the amount of power used for temperature and humidity control. Still, the curious among us want to know if a data centre could still operate at peak performance under conditions twice the current norms.
We are about to find out thanks to a test to get under way shortly in Singapore. News reports say the world's first tropical data centre is now in the planning stages and involves a number of big-name partners including Dell, Hewlett-Packard Enterprise, Intel, ERS, Fujitsu and others. The consortium will set up a controlled test environment in an existing Keppel data centre.
Current standards dictate that data centres not be allowed to reach temperatures in excess of 20°C or relative humidity of more than 50-60%. Those numbers will be almost doubled for the test: the test centre will be allowed to reach 38°C and relative humidity of up to 90%.
Researchers appear to be at least somewhat optimistic that their test will prove data centres do not have to be kept under such tight controls. If they are proven correct, the test will open the door to a much larger geographic area in which data centres could be built without compromising performance.
Temperature, Humidity or Contaminants?
Current standards for temperature and humidity at data centres have not really been questioned over the last 30 to 40 years. As with so many other things in the digital arena, there is even considerable debate as to how the industry arrived at the current standards and whether they are scientific at all. Indeed, a number of studies several years ago suggested that airborne contaminants were more damaging to sensitive data centre equipment than ambient temperature and humidity.
Some researchers have gone as far as to speculate that purifying the air circulating through data centres would do far more to achieve maximum performance than tightly controlling temperature and humidity. Whether that is true or not is a matter for future tests. But if the tropical data centre being established in Singapore does turn out to be successful, it would be worth repeating the test under identical circumstances that would also include air purification controls.
Building Greener Data Centres
The Singapore test is, at the end of the day, all about learning how we can build greener data centres that do not consume nearly as much power. As the digital world grows, more and more of our energy resources will have to be put toward powering the data centres that make modern life possible. If data centres can truly operate at nearly twice the current standards for temperature and humidity, imagine how much money we could save by not having to control the data centre environment so tightly.
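As a rough illustration of the stakes, here is a simple estimate of the annual saving if relaxed set points allowed a facility to cut its cooling overhead. Every number in the sketch (the 1 MW IT load, the before-and-after PUE figures, and the electricity price) is an assumption for illustration, not a result from the Singapore trial.

```python
# Illustrative estimate of the annual saving from relaxed cooling set points,
# expressed via power usage effectiveness (PUE = total power / IT power).
# All figures below are assumptions for the sketch, not results of the trial.

it_load_kw = 1000            # assumed 1 MW of IT load
pue_tight = 1.7              # assumed PUE with tightly controlled cooling
pue_loose = 1.3              # assumed PUE with relaxed set points
hours_per_year = 8760
price_per_kwh = 0.10         # assumed electricity price per kWh

def annual_energy_cost(it_kw, pue):
    """Yearly electricity cost for a facility at the given IT load and PUE."""
    total_kw = it_kw * pue
    return total_kw * hours_per_year * price_per_kwh

saving = annual_energy_cost(it_load_kw, pue_tight) - annual_energy_cost(it_load_kw, pue_loose)
print(f"Estimated annual saving: {saving:,.0f} per year")
```

Even under these modest assumptions the saving runs to hundreds of thousands per facility per year, which is why loosening environmental standards, if it can be done safely, matters so much.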
Source: Channel News Asia – http://www.channelnewsasia.com/news/singapore/singapore-to-trial-world/2827640.html
Monday, 23 May 2016
The risk of an electrical fire in a data centre is an ever-increasing concern for owners and operators, and deciding what type of fire suppression system to install has been an ongoing battle. A water-based system risks destroying all the electronics housed in the facility, resulting in thousands of dollars of damage. CO2-based systems release a gas that can be detrimental to the health of your employees and is considered a greenhouse pollutant.
It is now a myth that data centre owners have to choose between water and gas systems to protect their space. Hybrid fire extinguishing systems, which use a mix of water and nitrogen gas, have arrived in the fire protection and suppression market. This technology uses the best characteristics of both water mist and inert gas to extinguish a fire. The benefits of this type of system include life safety, enclosure integrity, environmental safety, cooling capacity, and no costly clean-up or equipment replacement.
Because the system uses entirely non-toxic agents, personnel are safe even during activation: oxygen in the space is reduced only to levels within safe breathing tolerances. Hybrid systems are designed specifically for information technology spaces, providing the best capabilities of both water mist and inert gas systems while remaining environmentally safe. There is no costly clean-up or equipment replacement after the system is activated, and immediately after a fire the system rapidly recharges and is ready for use the same day - extremely important for facilities such as data centres.
The success of hybrid technology lies in its ability to extinguish fires through heat absorption and oxygen deprivation with minimal water present. The system combines nitrogen and water into a homogenous suspension of nitrogen and sub-10-micron water droplets that penetrates vented enclosures to extinguish a fire without significant water residue. When the mixture enters the enclosure, the water and nitrogen attack the fire simultaneously: the water cools the space and generates steam, while the nitrogen reduces the oxygen content.
By installing a hybrid fire extinguishing system, you no longer have to worry about damaged property, lost money, or the health and safety of your personnel.