Tuesday, 27 September 2016

FCA IT Outage a Bit of Irony

A bit of irony struck this past weekend when the Financial Conduct Authority (FCA) was forced to announce late on Friday that an incident at one of its outsourced data centres had caused a widespread outage affecting a number of the watchdog's IT services. The FCA described the outage as 'major' even as it worked with its vendor to restore the inaccessible services.

The irony stems from comments made earlier in the week by Nausicaa Delfas, the FCA's director of specialist supervision, who berated private sector companies for not having appropriate systems in place to prevent cyber-attacks and network failures. At a cyber security conference last Wednesday, Delfas made it clear that the FCA wants the companies it regulates to do better.

"Most attacks you have read about were caused by basic failings – you can trace the majority back to: poor perimeter defences, un-patched, or end-of-life systems, or just a plain lack of security awareness within an organisation," Delfas said. "So we strongly encourage firms to evolve and instil within them a holistic 'security culture' – covering not just technology, but people and processes too."

Confirmed Hardware Failure

In the FCA's defence, the incident was not the result of any sort of cyber-attack or internal systems shortcoming. It was a direct consequence of a hardware failure, as confirmed by Fujitsu, the vendor responsible for the data centre in question. Nonetheless, the fact that not all systems had been restored several days into the incident demonstrates to the FCA just how difficult it can be to keep networks running when things like this happen.

The FCA has long argued that the companies it regulates should be prepared for any sort of incident that could knock out network access for any length of time. To show just how seriously they take the issue, regulators fined the Royal Bank of Scotland a record £56 million in 2014 after an IT failure left millions of customers without access to their accounts. That has some critics ready to speak out against the regulator.

ACI Worldwide's Paul Thomalla is among those executives calling out the City watchdog. He told the Financial Times that the watchdog has to be held to the same standards it applies to the financial sector. He said that if the FCA expects the institutions it regulates to maintain high standards of security and network reliability, it needs to meet those same standards itself.

Only time will tell how devastating the weekend incident really turns out to be and whether there is any long-term fallout at all. The lesson to be learned is that there is no such thing as a 100% safe and reliable network. Things can go wrong even with the best of intentions and rock-solid contingency plans in place. Our job is to do what we can to mitigate the adverse effects of such incidents and, when they do happen, to get things fixed as quickly as possible.

Thursday, 22 September 2016

The National GCHQ Firewall: Will It Work?

If you haven't heard the news yet, the Government Communications Headquarters (GCHQ) is taking aggressive action against cyber criminals with the establishment of a new division known as the National Cyber Security Centre (NCSC). The centre, which is slated to open sometime in October 2016, will be the first government agency dedicated solely to defending the UK against cyber security threats. One of its first missions will be to build a 'national firewall' that would protect internet users from the most common cyber threats.

Thus far, GCHQ has not detailed how the national firewall will work, but it has said that the NCSC will not itself be responsible for filtering out suspect sites and emails. Instead, the primary mission of the firewall is to provide a national domain name system that internet providers and others can use to block access to known malicious computers by IP address.
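To make the idea concrete, here is a minimal sketch, in Python, of how a resolver might consult a shared blocklist before answering a query. GCHQ has published no technical details, so the blocklist, domain names and sinkhole address below are purely illustrative.

```python
# Illustrative sketch only: a resolver checks each queried domain against a
# shared blocklist and returns a harmless "sinkhole" address for matches.
# GCHQ has not published how its system works; everything here is assumed.

BLOCKLIST = {"malware-downloads.example", "phishing-login.example"}
SINKHOLE_IP = "192.0.2.1"  # RFC 5737 documentation address, used as a stand-in


def resolve(domain: str, upstream_lookup) -> str:
    """Return the sinkhole address for blocklisted domains, otherwise defer
    to the normal upstream DNS lookup."""
    if domain.lower().rstrip(".") in BLOCKLIST:
        return SINKHOLE_IP
    return upstream_lookup(domain)


if __name__ == "__main__":
    fake_upstream = lambda d: "203.0.113.7"  # stand-in for a real DNS query
    print(resolve("phishing-login.example", fake_upstream))  # -> 192.0.2.1
    print(resolve("bbc.co.uk", fake_upstream))               # -> 203.0.113.7
```

ISPs that subscribe to such a service would simply point their resolvers at the shared list rather than each maintaining their own.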

The question on everybody's mind should be, will it work?

As explained by the Telegraph on its website, there are quite a few ISPs with IP blocking policies already in place. They have enjoyed some limited success in preventing malware attacks, phishing attacks and the like. They have also prevented British internet users from accessing sites with content that violates copyright protections.

Some Success Already

The Telegraph says the government has also enjoyed some measure of success with a tool capable of identifying and intercepting malicious emails that appear to come from government agencies. The tool works by identifying emails purporting to come from government sources and then checking their origin IP addresses against an existing database of known government addresses. Any email with an IP address that does not match is automatically blocked.
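The general pattern is simple enough to sketch. The snippet below is only an illustration of that idea, not GCHQ's actual implementation; the network ranges and addresses are made-up documentation values.

```python
import ipaddress

# Hypothetical register of networks that legitimate government mail servers
# send from (RFC 5737 documentation ranges used as placeholders).
KNOWN_GOV_NETWORKS = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]


def claims_gov_sender(from_address: str) -> bool:
    """True if the message purports to come from a government domain."""
    return from_address.lower().endswith(".gov.uk")


def should_block(from_address: str, origin_ip: str) -> bool:
    """Block messages that claim a government sender but originate from an
    IP address outside the known government ranges."""
    if not claims_gov_sender(from_address):
        return False
    ip = ipaddress.ip_address(origin_ip)
    return not any(ip in net for net in KNOWN_GOV_NETWORKS)


print(should_block("refunds@hmrc.gov.uk", "198.51.100.25"))  # False: delivered
print(should_block("refunds@hmrc.gov.uk", "192.0.2.99"))     # True: blocked
```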

GCHQ has developed the tool to the point where it has been testing its effectiveness against a spoofed government tax refund site that was sending out as many as 58,000 emails per day. According to NCSC chief executive Ciaran Martin, that site is no longer sending those emails.

The fact that the government has seen modest success in large-scale email blocking seems to suggest that their plans for a national firewall could work. But there are still plenty of hurdles to overcome. Ultimately, the success or failure of the system is going to rely on how well government and private entities work together.

Every Tool Can Help

Knowing what we know about cyber security and network threats, we can say with a fair degree of confidence that a national firewall will not be a perfect solution all by itself. No single cyber security tool can protect us against every single threat. But every tool that does what it is designed to do adds to a much larger arsenal that is better able to defend against cyberattacks with every passing day.

We look forward to seeing what GCHQ comes up with for a national firewall. Hopefully, its efforts will allow private organisations to take some much-needed strides in addressing cyber threats.

Tuesday, 13 September 2016

ING Data Centre Crash Caused by Loud Noise

ING Bank found itself apologising to customers this week after a data centre failure in Bucharest, Romania, left them without access to most online services over the weekend. The good news in an otherwise disturbing situation is that the timing limited the damage: because the outage hit at the weekend, it caused mostly inconvenience. Had it happened during the week, the results could have been much worse.

Numerous news reports say that ING Romania was running a standard fire suppression test at the Bucharest facility on 10th September. The facility's fire suppression system uses an inert gas that is designed to be harmless to equipment. In this case, the gas itself did not cause the problem. The catastrophic shut-down of the facility was a result of a loud noise emitted when the high-pressure gas was released.

One news source says that the gas was under a pressure that was too high for the system. When it was released, it emitted a loud booming noise that sent a shock wave throughout the facility. That shock wave created vibrations strong enough to damage hard drives and servers within the data centre.

Service Down for 10 Hours

Damage to the equipment was severe enough that the centre was down for about 10 hours. During that time, customers were unable to conduct online transactions, communicate with the bank online or conduct transactions at ATMs around Bucharest. Some transactions already in progress when the outage occurred were simply lost. The bank's website was also down for a time.

Bank officials say they brought in an extra 70 staff members to help recover the system and restore data. Although the incident was described as ‘exceptional’ and ‘unprecedented’, ING Bank maintains that the service interruptions were merely a matter of inconvenience. The bank has not said whether all systems are up and running yet; however, at the time of writing, it does not appear that any critical data was lost or compromised.

Unfortunate but Important

ING Bank's misfortunes aside, the fire suppression test and subsequent shut-down are important events for the data centre community. Why? Because it has long been assumed that loud noises creating substantial shock waves could damage data centre equipment, but no one knew for sure because it had never happened before. Now that it has, we have a working example we can use to address what we now know is a real possibility.

In the months ahead, we can expect testing and research designed to figure out what happened in Bucharest over the weekend. The more we learn about the incident, the better able we will be to protect data centres from similar events in the future. This is good for the data centre community despite the fact that the outage inconvenienced ING Romania customers.

Making the best use of the information collected on the outage will, of course, depend on ING Bank being forthcoming with its findings. Hopefully it will be, for the good of the entire data centre industry.

Thursday, 11 August 2016

Delta Airlines Data Centre Fails – The Reason Why Is Still a Mystery

The second-largest airline in the US is still struggling to regain normal operations after a data centre failure that grounded hundreds of flights and stranded thousands of passengers worldwide. At around 2.30am EDT on Monday, 8 August, Delta staff in Atlanta found themselves unable to access computer networks for reasons then unknown. Operations around the country and, eventually, the world soon suffered the same fate.

The US-based company, which is part of the SkyTeam alliance that also includes Air France-KLM, has not offered any concrete answers about what caused the problem. In the days following the outage, it has struggled to get its computer systems back online and its data synced across the worldwide network. The airline says it is doing everything it can to return service to normal.

A Power Switch Problem

Initial reports suggest that Delta technicians were running a routine test of backup power procedures when a piece of equipment was inadvertently tripped. That failure ostensibly locked the airline's computers out of both the Georgia Power supply and Delta's own reserve backup generators. With no power, the system shut down.

However, another rumour has emerged suggesting a fire might have taken out the airline's main data centre in Atlanta. Some sources say that as technicians were attempting to switch computer networks to a backup generator, a fire broke out, destroying two generators in the process. In either case, Delta's computer networks went down due to a data centre failure related to a lack of power.

As of Wednesday, August 10th 2016, things were still not back to normal. A few thousand of Delta's flights were back on schedule, but airport information boards were not necessarily correct. Information on the company's website pertaining to arrivals and departures could also not be entirely trusted. Delta Airlines continues to investigate what went wrong.

Computer Networks Vulnerable Everywhere

Delta Airlines is sure to take a PR beating as a result of its data centre failure. And although new strategies will be put in place to prevent future outages, the company's networks were, as far as we know, already operating up to standard. Its data centre had backup power in place for redundancy, just as would be expected, yet a perfect storm of events still combined to cause a big problem.

The lesson to be learned here is that no network is invulnerable. No matter how much technology we put in place, no matter how much redundancy we include, computer networks will always be at risk of failure. It is something we have to learn to live with. That does not help the thousands of Delta passengers stranded around the world, but it is the reality in which we live. Computer networks are not perfect.

Hopefully, Delta will be more forthcoming in the future as to what caused the failure. Its willingness to share information would help others avoid similar problems.

Tuesday, 19 July 2016

Smart Cities and the SSD-Driven Data Centre

We have smartphones, smart cars, and smart homes filled with dozens of smart devices. So, are you now ready for “smart cities”? Once a fanciful notion reserved for futurists and dreamers, smart cities are now here. They are beginning to emerge thanks to billions of devices across the globe able to communicate via the internet. And yes, data centres are playing a big part.

The data centre of the future is likely to be the bedrock of the smart city for obvious reasons. But before discussing what that might look like, let us first consider where we are right now. ITProPortal's Laurence James recently wrote a timely blog post in which he cited data suggesting that upwards of 1.6 billion devices will be connected to smart city infrastructure before 2016 is out. He mentions things such as smart transport, traffic management systems built around connected cars, and even the local rubbish bin capable of sending a message that it needs to be emptied.

James used the 2012 Olympics in London as an example of how smart cities are already working. Officials at Transport for London (TfL) had to put in place a system capable of managing traffic across up to 18 million journeys per day. The system they settled on used data analytics to predict traffic patterns so that trains, buses and other transport options could move through London as efficiently as possible.
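The article does not say which techniques TfL used, but the underlying idea of forecasting demand from historical journey counts can be sketched very simply. The figures and the moving-average approach below are purely illustrative; real transport models are far more sophisticated.

```python
# Toy illustration of forecasting demand from historical counts.
# The journey figures are invented; real systems use far richer models.

hourly_journeys = [410_000, 455_000, 620_000, 830_000, 790_000, 615_000]


def moving_average_forecast(history, window=3):
    """Forecast the next value as the mean of the last `window` observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)


forecast = moving_average_forecast(hourly_journeys)
print(f"Forecast journeys for the next hour: {forecast:,.0f}")
```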

Data Centres at the Heart of Smart

At the heart of smart is the data centre. But here's the thing: in order to make smart cities a reality, we are going to need a lot more local data centres that are capable of processing tremendous volumes of data extremely quickly. Relying on regional data centres will simply not be enough.

This presents a problem, especially in an era when we are trying to reduce our carbon footprint and consume less energy. As we already know, data centres are hungry consumers of power. We need to find a way to reduce power consumption if we are going to build enough data centres to support smart cities without completely obliterating our energy goals. The solution appears to be the solid-state drive (SSD), better known as the 'flash' drive.

In his post, James explains that experts predict mechanical hard drives will be capable of supporting 40 TB of data by 2020. As tremendous as that number is, it is insufficient. The good news is that SSDs should be able to support 128 TB at 10% of the power and 6% of the volume required by mechanical hard drives. In other words, SSDs can handle more data at faster speeds, with lower power consumption and a smaller physical footprint.
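To put those percentages in perspective, here is a rough back-of-the-envelope comparison. The 40 TB, 128 TB, 10% and 6% figures come from the numbers cited above; the target capacity and the per-drive power and volume baselines for the hard drive are assumptions made purely for illustration.

```python
import math

# Rough comparison using the figures cited above: a future 40 TB hard drive
# versus a 128 TB SSD drawing 10% of the power in 6% of the volume.
# The baseline HDD power and volume values are assumptions for illustration.

TARGET_CAPACITY_TB = 10_000                              # hypothetical requirement

HDD_CAPACITY_TB, HDD_WATTS, HDD_LITRES = 40, 8.0, 0.4    # assumed per-drive values
SSD_CAPACITY_TB = 128
SSD_WATTS = HDD_WATTS * 0.10                             # 10% of the power
SSD_LITRES = HDD_LITRES * 0.06                           # 6% of the volume


def fleet(capacity_tb, per_drive_tb, watts, litres):
    """Drives needed for the target capacity, plus total power and volume."""
    drives = math.ceil(capacity_tb / per_drive_tb)
    return drives, drives * watts, drives * litres


for name, spec in [("HDD", (HDD_CAPACITY_TB, HDD_WATTS, HDD_LITRES)),
                   ("SSD", (SSD_CAPACITY_TB, SSD_WATTS, SSD_LITRES))]:
    drives, watts, litres = fleet(TARGET_CAPACITY_TB, *spec)
    print(f"{name}: {drives} drives, {watts:,.0f} W, {litres:,.1f} L")
```

On these assumed numbers, the SSD fleet needs roughly a third as many drives and a small fraction of the power and rack space, which is the point James is making.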

Smart cities are here now. In the future, they will be driven by local data centres that rely on SSDs to handle the massive data flow. Who knew the technology behind the flash drive in your pocket would be so integral to powering the future?

Wednesday, 13 July 2016

UK Solar Power Reaches New Milestone

Most of us are aware that the UK is a world leader in clean energy, particularly solar, so it should come as no surprise that a new analysis from the Solar Trade Association (STA) reveals producers have recently hit the latest milestone in solar energy production: generating nearly 24% of total energy demand during the afternoon hours of 5 June 2016.

According to the STA, the UK is now home to almost 12 GW of solar power capacity that, at peak generation, can produce up to 25% of the nation's total energy needs. The STA is firmly behind solar as the best way to provide clean energy and reduce dependence on fossil fuels. Chief executive Paul Barwell was quoted by E&T magazine as saying, "This is what the country and the world needs to decarbonise the energy sector at the lowest price to the consumer."

Solar Farms and Rooftop Installations

The popularity of solar power in the UK is evident in the rapid uptake of both solar farms and rooftop installations. According to E&T magazine, one particular rooftop installation in Telford consists of 14,000 solar panels on top of a commercial building operated by Lyreco. The magazine goes on to say that all of the clean energy sources currently in use in the UK combine to provide more than 25% of the UK's total power generation.

Across the UK, more and more homes are being fitted with solar panels for two purposes: PV systems to generate electricity and solar thermal systems to provide hot water and space heating. Commercial and industrial enterprises are also embracing solar for space heating, process heat and hot water.

The STA says that all the solar industry needs at this point is one more "push from the government" to reach its goal of being subsidy-free sometime early in the next decade. The government seems like it is on board, for now.

Solar for Data Centre Requirements

We are thrilled that solar and other clean energy sources are doing so well, and seeing UK solar capacity reach this most recent milestone is certainly encouraging. It leads us to wonder whether we will ever see a viable solar application for powering data centres. Finding some sort of renewable solution is critical, given that data centres are among the most prolific power consumers in the world. Getting data centres off fossil fuels would have a tremendous impact on meeting clean energy goals.

Solar isn't adequate for data centre needs in its present form. But we can envision a day when highly efficient solar thermal systems with sufficient storage capacity could meet the power requirements of a data centre, allowing it to operate 24/7. A development like that would certainly be exciting, and one that all of us in the data centre industry would be thrilled to see.

Thursday, 30 June 2016

Lack of Security Taints EU Re-Vote Petition

The Brexit vote had barely been tallied and made official when opponents of the outcome established an online petition calling for a second vote. That much was expected in the days and weeks leading up to the vote, given that polling showed things to be extremely close. What was not expected was an almost ridiculous lack of security that has allowed the petition to be tainted by auto bots.

According to the BBC, the House of Commons petitions committee has said it has already removed 77,000 invalid signatures coming from people allegedly living in Antarctica, North Korea, the Sandwich Islands and even the Vatican. Although officials say that most of the remaining signatures now appear to be from people living in the UK, there is no way to know how many of those signatures were added legitimately as opposed to being placed on the petition through auto bots.

An Appalling Lack of Security

The re-vote petition is already the most active petition ever placed on the Parliamentary website. The BBC says it currently has 3.6 million signatures. However, one computer security expert told the BBC that any site like the House of Commons petition site needs to have security measures in place to defeat intrusions. We clearly agree.

What's most appalling about the lack of security in this case is that stopping auto bots is relatively simple. It's not as if we are talking about encrypted malware or tough-to-detect rootkits that go to the heart of computer networking systems. Auto bots are nothing more than computer scripts that log onto a website and submit or retrieve data without any human intervention. They can usually be deterred with something as simple as a CAPTCHA.
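As a rough sketch of what such a defence might look like, the snippet below verifies a CAPTCHA token server-side (using Google reCAPTCHA's public verification endpoint as an example) and applies a simple per-IP rate limit before accepting a signature. The secret key, threshold and in-memory store are placeholders; a production site would use its own CAPTCHA provider and persistent rate limiting.

```python
import time
import requests

RECAPTCHA_VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"
RECAPTCHA_SECRET = "your-secret-key"       # placeholder
MAX_SIGNATURES_PER_HOUR = 5                # illustrative threshold

_recent_requests = {}                      # in-memory store; use a database in practice


def captcha_passed(token: str, client_ip: str) -> bool:
    """Ask the CAPTCHA provider to confirm the token was solved by a human."""
    resp = requests.post(RECAPTCHA_VERIFY_URL, data={
        "secret": RECAPTCHA_SECRET,
        "response": token,
        "remoteip": client_ip,
    })
    return resp.json().get("success", False)


def within_rate_limit(client_ip: str) -> bool:
    """Reject bursts of submissions from a single address."""
    now = time.time()
    recent = [t for t in _recent_requests.get(client_ip, []) if now - t < 3600]
    _recent_requests[client_ip] = recent + [now]
    return len(recent) < MAX_SIGNATURES_PER_HOUR


def accept_signature(token: str, client_ip: str) -> bool:
    return within_rate_limit(client_ip) and captcha_passed(token, client_ip)
```

Neither measure is foolproof, but either one would have made signing a petition 33,000 times from a script considerably harder.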

Because whoever designed the petition site was so careless, there is no way of knowing how many of the signatures on the petition calling for a second EU vote are legitimate. But it goes beyond just this petition. How many other petitions have been affected by the site's lack of security?

The BBC references users of the 4chan message board as being among the primary attackers of the re-vote petition. According to its report, one of the board's members claims to have signed the petition some 33,000 times simply by running an auto bot.

Things Must Change Now

For the record, the House of Commons petitions committee says it will continue to monitor the situation for any additional evidence of auto bot activity. Meanwhile, Prime Minister David Cameron has said there would be no second vote, regardless of the petition and its signatures.

That's all well and good, but something must be done to improve the security of the petition site now. If we cannot trust something as simple as an online petition to be secure, we are left to wonder how many other government websites are equally vulnerable. Shame on the House of Commons and its web developers for such a stunning lack of security.