Tuesday, 27 September 2016

FCA IT Outage a Bit of Irony

A bit of irony struck this past weekend when the Financial Conduct Authority (FCA) was forced to announce late last Friday that an incident at one of its outsourced data centres had caused a widespread outage affecting a number of the watchdog's IT services. The FCA described the outage as 'major' even as it worked with its vendor to restore the inaccessible services.

The irony of the outage is related to comments made earlier in the week by FCA specialist supervision team director Nausicaa Delfas, who berated private sector companies for not having appropriate systems in place to prevent cyber-attacks and network failures. At a cyber security conference last Wednesday, Delfas made it clear that the FCA wants the companies it regulates to do better.

"Most attacks you have read about were caused by basic failings – you can trace the majority back to: poor perimeter defences, un-patched, or end-of-life systems, or just a plain lack of security awareness within an organisation," Delfas said. "So we strongly encourage firms to evolve and instil within them a holistic 'security culture' – covering not just technology, but people and processes too."

Confirmed Hardware Failure

In the FCA's defence, the incident was not the result of a cyber-attack or any internal systems shortcoming. It was a direct consequence of a hardware failure, as confirmed by Fujitsu, the vendor responsible for the data centre in question. Nonetheless, the fact that not all systems had been restored several days into the incident demonstrates to the FCA just how difficult it can be to keep networks running when things like this happen.

The FCA has long argued that the companies it regulates should be prepared for any sort of incident that could knock out network access for any length of time. To show just how serious they are, regulators fined the Royal Bank of Scotland a record £56 million in 2014 over an IT failure that left millions of customers without access to their accounts. That has some critics of the agency ready to speak out against the regulator.

ACI Worldwide's Paul Thomalla is among the executives calling out the City watchdog. He told the Financial Times that the watchdog must be held to the same standards it applies to the financial sector: if the FCA expects the institutions it regulates to maintain high standards of security and network reliability, it needs to meet those same standards itself.

Only time will tell how damaging the weekend incident really turns out to be and whether there is any long-term fallout at all. The lesson to be learned is that there is no such thing as a 100% safe and reliable network. Things can go wrong even with the best of intentions and rock-solid contingency plans in place. Our job is to do the best we can to mitigate the adverse effects of those incidents and, when they do happen, to get things fixed as quickly as possible.


Thursday, 22 September 2016

The National GCHQ Firewall: Will It Work?

If you haven't heard the news yet, the Government Communications Headquarters (GCHQ) is taking aggressive action against cyber criminals with the establishment of a new division known as the National Cyber Security Centre (NCSC). The centre, which is slated to open in October 2016, will be the first government agency dedicated solely to defending the UK against cyber security threats. One of its first missions will be to build a 'national firewall' that would protect internet users from the most common cyber threats.

Thus far, GCHQ has not detailed how the national firewall will work, but it has said that the NCSC will not itself be responsible for filtering out suspect sites and emails. Instead, the firewall's primary mission is to provide a national domain name system (DNS) that internet providers and others can use to block access to malicious computers by IP address.
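
GCHQ has published no technical design, but the basic idea of a resolver that refuses to answer for known-bad destinations can be sketched in a few lines. Everything below (the domains, addresses and blocklists) is invented purely for illustration, not a description of the actual system:

```python
from typing import Optional

# Hypothetical blocklists. The addresses use the reserved documentation
# ranges (RFC 5737), not real hosts.
BLOCKED_DOMAINS = {"malware.example", "phishing.example"}
BLOCKED_IPS = {"203.0.113.7"}

# A toy zone table standing in for a real upstream resolver.
UPSTREAM = {
    "bank.example": "198.51.100.10",
    "malware.example": "203.0.113.7",
}

def filtered_resolve(domain: str) -> Optional[str]:
    """Return the IP for `domain`, or None (mimicking a refused lookup)
    when the domain or its resolved address appears on a blocklist."""
    if domain in BLOCKED_DOMAINS:
        return None
    ip = UPSTREAM.get(domain)
    if ip is None or ip in BLOCKED_IPS:
        return None
    return ip

print(filtered_resolve("bank.example"))     # resolves normally
print(filtered_resolve("malware.example"))  # blocked
```

The point of putting the filter in the DNS rather than in each user's software is that any ISP pointing its customers at such a resolver protects them all at once, with no action required on their part.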

The question on everybody's mind should be, will it work?

As explained by the Telegraph on its website, there are quite a few ISPs with IP blocking policies already in place. They have enjoyed some limited success in preventing malware attacks, phishing attacks and the like. They have also prevented British internet users from accessing sites with content that violates copyright protections.

Some Success Already

The Telegraph says the government has also enjoyed some measure of success with a tool capable of identifying and intercepting malicious emails that appear to come from government agencies. The tool identifies any email purporting to come from a government source and checks its origin IP address against an existing database of known government addresses. Any email whose IP address does not match is automatically blocked.
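
The check described above is simple to sketch. The sender database, domain names and addresses below are all made up for illustration; the real tool's internals have not been disclosed:

```python
# Hypothetical database mapping claimed sender domains to the IP
# addresses known to legitimately send for them (RFC 5737 placeholders).
KNOWN_GOV_IPS = {
    "gov.example": {"192.0.2.10", "192.0.2.11"},
}

def should_block(claimed_domain: str, origin_ip: str) -> bool:
    """Block any email claiming a government domain whose sending IP
    is not in the known-address database for that domain."""
    known = KNOWN_GOV_IPS.get(claimed_domain)
    if known is None:
        return False  # not claiming to be a government sender
    return origin_ip not in known

print(should_block("gov.example", "192.0.2.10"))   # False: legitimate sender
print(should_block("gov.example", "203.0.113.5"))  # True: spoofed sender
```

In spirit this resembles the sender-verification checks already widely used in email (SPF and DMARC), applied centrally on behalf of government domains rather than by each receiving mail server.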

GCHQ has developed the tool to the point where it has been testing its effectiveness against a spoofed tax-refund site that was sending out as many as 58,000 emails per day. According to NCSC chief executive Ciaran Martin, that site is no longer sending those emails.

The fact that the government has seen modest success in large-scale email blocking seems to suggest that their plans for a national firewall could work. But there are still plenty of hurdles to overcome. Ultimately, the success or failure of the system is going to rely on how well government and private entities work together.

Every Tool Can Help

Knowing what we know about cyber security and network threats, we can say with a fair degree of confidence that a national firewall will not be a perfect solution all by itself. No single cyber security tool can protect us against every threat. But every tool that does what it is designed to do adds to a much larger arsenal, one that grows better able to defend against cyber-attacks with every passing day.

We look forward to seeing what GCHQ comes up with for a national firewall. Hopefully, its efforts will allow private organisations to take some much-needed strides in addressing cyber threats.

Tuesday, 13 September 2016

ING Data Centre Crash Caused by Loud Noise

ING Bank found itself apologising to customers this week after a data centre failure in Bucharest, Romania left them without most online services over the weekend. The good news in an otherwise disturbing situation is that the outage occurred on a weekend, so it led mostly to inconvenience. Had it happened during the week, the results could have been far worse.

Numerous news reports say that ING Romania was running a standard fire suppression test at the Bucharest facility on 10th September. The facility's fire suppression system uses an inert gas that is designed to be harmless to equipment. In this case, the gas itself did not cause the problem. The catastrophic shut-down of the facility was a result of a loud noise emitted when the high-pressure gas was released.

One news source says that the gas was under a pressure that was too high for the system. When it was released, it emitted a loud booming noise that sent a shock wave throughout the facility. That shock wave created vibrations strong enough to damage hard drives and servers within the data centre.

Service Down for 10 Hours

Damage to the equipment was severe enough that the centre was down for about 10 hours. During that time, customers were unable to conduct online transactions, communicate with the bank online or conduct transactions at ATMs around Bucharest. Some transactions already in progress when the outage occurred were simply lost. The bank's website was also down for a time.

Bank officials say they brought in an extra 70 staff members to help recover the system and restore data. Although ING Bank described the incident as 'exceptional' and 'unprecedented', it maintains that the service interruptions were merely a matter of inconvenience. The bank has not said whether all systems are fully up and running yet, but at the time of writing it does not appear that any critical data was lost or compromised.

Unfortunate but Important

ING Bank's misfortunes aside, the fire suppression test and subsequent shut-down are important events for the data centre community. Why? Because it has long been assumed that loud noises creating substantial shock waves could damage data centre equipment, but real-world confirmation at this scale has been scarce. Now that it has happened, we have a working example we can use to address what we know is a genuine possibility.

In the months ahead, we can expect testing and research designed to figure out what happened in Bucharest over the weekend. The more we learn about the incident, the better able we will be to protect data centres from similar events in the future. This is good for the data centre community despite the fact that the outage inconvenienced ING Romania customers.

Making the best use of the information collected on the outage will, of course, depend on ING Bank being forthcoming with its findings. Hopefully it will be, for the good of the entire data centre industry.