Wednesday, 20 September 2017

Sound: The Next Frontier for High Speed Computer Processing?

When most people think about sound in relation to high-speed networking, they think about the quality of sound embedded in their high-definition videos or streaming from their favourite music services. What if we told you that sound could be the next frontier in high-speed computer processing? Well, it looks like that might be the case in the not-too-distant future...

Breakthrough research out of the University of Sydney (Australia) has led to the development of a process capable of converting optical data into storable sound waves. The process suggests that future iterations of network technology could store and send data as light waves to receivers that would then convert those light waves into sound waves that computer chips could interpret as meaningful data.

The idea of using light waves to store and transmit data is nothing new. Scientists have known about the potential for years. The problem has always been converting optical data into a format computer chips could work with, in a way that was both fast and efficient.

University of Sydney researchers likened the problem of converting optical data into something more usable to the difference between thunder and lightning. The speed of light in air is roughly 875,000 times the speed of sound, so even if computer chips could read data in the optical domain, they could not keep up with the speed at which that data travels. Hence the need to slow the data down during the conversion process.
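The scale of that mismatch is easy to check with a quick back-of-the-envelope calculation, using round textbook figures for the speed of light and the speed of sound in air (the exact values vary slightly with temperature and pressure):

```python
# Approximate propagation speeds in air
speed_of_light_air = 299_700_000  # metres per second (light, approximate)
speed_of_sound_air = 343          # metres per second (sound at ~20 degrees C)

# How many times faster light travels than sound
ratio = speed_of_light_air / speed_of_sound_air
print(f"Light travels roughly {ratio:,.0f} times faster than sound in air")
```

With these round figures the ratio works out to a little under 900,000, which is why data arriving at light speed has to be slowed down before a chip can work with it.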

Phononics Is the Answer

The field of storing and transmitting data in the audio domain is known as phononics. Both private enterprise and public institutions have been researching phononics for quite some time in the hope of developing some sort of technological advantage over the current process of converting optical data into electrons. The Australian researchers may finally have come up with the answer via phononics.

Current technologies that transmit optical data before converting it into electrons that computer chips can read and store still produce impressive results compared to our capabilities of a decade ago. The process has an inherent weakness, though: it generates an enormous amount of waste heat. That waste heat limits the practical use of optical-electronic applications. Phononics may have solved the problem.

The process developed by the Australian researchers sidesteps the waste-heat problem by converting optical data to sound waves. More importantly, computer chips can read, analyse and use acoustic data more quickly than electronic data. Researchers say that the process has proved successful enough to open the door to more powerful signal processing and future quantum computing applications.

Those of us in the data centre industry eagerly await further development from Australia. If it turns out their process possesses both mass-market appeal and financial viability, it will completely transform our understanding and application of high-speed computer processing. There will be huge implications in everything from global financial transactions to scientific research to global media access.

Thursday, 14 September 2017

TPS Violation Costs Company £85,000

It is against the law to call consumers whose phone numbers are registered with the Telephone Preference Service (TPS) without explicit consent from such consumers. Being found in violation could cost tens of thousands of pounds, as one Dartford-based telephone company recently found out.

According to an 11th September release from the Information Commissioner's Office (ICO), True Telecom Ltd made nuisance phone calls to consumers for more than two years despite many of those they called being on the TPS list. More astoundingly, the company continued making the calls even after being warned by the ICO to cease. The calls were placed between April 2015 and April 2017, and during that time, more than 200 complaints were registered with the ICO.

The company attempted to mislead consumers by concealing the number they were calling from and giving people the impression they were calling from the organisation known as BT Openreach. According to the ICO, True Telecom was unable to provide investigators with evidence of consent.

The result of True Telecom's actions was an £85,000 fine and an official ICO enforcement notice informing True Telecom that it must stop making illegal phone calls immediately. Continuing the calls could result in court action down the road.

ICO Head of Enforcement Steve Eckersley said in his official statement that the rules pertaining to nuisance calls are clear and that his agency intends to enforce them. He went on to say:

"These calls are at best annoying and at worst downright distressing, and companies who pester the public in this way must understand they won't get away with it. The ICO will take action."

Respecting the Privacy of Consumers

Few would dispute that we live in a time when the information age has given way to the age of big data, or that we need rules and regulations in place to protect the privacy of consumers. The rules surrounding nuisance phone calls were put in place for exactly that reason: to prevent companies from pestering consumers with repeated, unwanted phone calls.

It is all well and good that True Telecom has been investigated and fined for its illegal activity. But the ICO's report raises the question of how the company obtained the contact information it used to make the calls. If that information was supplied directly by consumers in the course of doing business with True Telecom, that is one thing. But if the information was sold to the company by another organisation, we are talking about an entirely different matter.

Could it be that it's time to start enacting rules that prevent companies from selling personal information? If we are truly trying to protect the privacy and security of consumers, why on earth do we allow personal information to be packaged and sold? It makes no sense. As long as we do nothing about this loophole, consumers will continue to be victimised by companies who care nothing about their privacy and security.

Tuesday, 5 September 2017

Accident Reveals Sensitive Information on Council Website

A consumer innocently browsing the internet accidentally stumbled across sensitive personal information left unsecured on a council website. This immediately raised concerns about how such data could be left out in the open, at the same time reminding organisations that no one is immune to breaches of data security. The revelation has also led to a substantial fine.

In a 31st August news release from the Information Commissioner's Office (ICO), it was revealed that Nottinghamshire County Council made protected data – including personal addresses, postcodes, and care requirements of the elderly and disabled – publicly available on an insecure site. The data was uncovered when a member of the public stumbled across it without the need to use a username and password to access the information.

ICO head of enforcement Steve Eckersley wrote in the news release:

"This was a serious and prolonged breach of the law. For no good reason, the council overlooked the need to put robust measures in place to protect people's personal information, despite having the financial and staffing resources available."

Eckersley went on to state that the actions of those responsible were both ‘unacceptable and inexcusable’ given how sensitive the data is. The data pertained primarily to individuals receiving services under the council's Homecare Allocation System (HCAS), first launched in 2011. The most egregious aspect of the mistake is that the information had been left unprotected for five years by the time it was discovered in June 2016.

Nottinghamshire County Council has been fined £70,000 by the ICO for its carelessness. It is not yet known whether the 3,000 people whose data was left unprotected suffered any negative consequences as a result.

Proof of the Need for Due Diligence

As an organisation involved in the data centre industry, it is apparent to us that Nottinghamshire County Council was extremely careless in its handling of the HCAS data. It also seems rather strange that the mistake went unnoticed for so long, given how much attention the ICO is supposed to be giving to matters of this sort. If anything, the story is further proof of the need for due diligence among those that store data as well as government agencies tasked with protecting the general public.

Whenever any online system is created for the purposes of collecting, storing and utilising personal data, a tremendous amount of responsibility comes with that data. There is never an excuse to allow any sort of sensitive data to be freely available to the general public without need for protected access.

The ICO news release says that Nottinghamshire County Council has ‘offered no mitigation’ to this point. Let's hope that this changes sooner rather than later. The public deserves to know how the Council responded to the original revelation and what procedures are now in place to make sure such exposure never occurs again. If we cannot trust those entrusted to protect us, our data security problems are much bigger than we realise.

Friday, 1 September 2017

Houston Data Centres: When Disaster Strikes

As everyone is no doubt aware by now, America's fourth-largest city, Houston, in Texas, was hit with a major Category 4 hurricane late last week. Though the storm was quickly downgraded after making landfall, it has still caused unprecedented damage through relentless rainfall that is likely to top 4 feet in some areas before it's done. US officials are now saying Houston and south-central Texas have never experienced an event like this.

Being in the data centre business, we are curious as to how the four major data centres located in Houston are faring today. There is both good and bad news.

The good news is that all four data centres are still operational and not reporting any damage. This is not unexpected, given that data centre designers build protections against natural disasters into their plans. The bad news is access: streets across Houston are closed, and many of those around the data centres are flooded. The power has stayed on at all four facilities thus far, but the flooding could make that difficult to sustain.

Running on Emergency Generators

Should the power fail, there is no way to know when it would be restored until after the rain finally stops. It could be days, but it is more likely to take weeks. Data centre builders plan for such events by equipping facilities with generators, and all four Houston data centres are prepared to run theirs if necessary. The question is, will they actually have to?

Should they need the generators, the facilities would be depending on fuel deliveries to keep them going. Data centres are second in line for fuel deliveries behind hospitals, police stations and other first-responder facilities, but with the roads flooded, how long before the fuel trucks could actually make it through?

Preparing Customers for Data Recovery

No one could have predicted that Houston would get 4 feet of rainfall from Harvey. In fact, the storm exceeded all expectations normally associated with hurricanes coming off the Gulf of Mexico. Unfortunately, all the preparations in the world cannot account for everything. Knowing what they now know about the devastation, data centre officials are beginning to contact customers in the hope of helping them make data recovery plans should the unthinkable happen.

The lesson in all of this is that nature will do what it wants to do. Data centre designers and operators go to great lengths to protect their facilities against the most severe weather, but sometimes it's just not enough. Hopefully Houston's four data centres will not suffer any interruption of service. Whether they do or not, there will be plenty of lessons to be learned in the aftermath of Hurricane Harvey and its historic flooding.

Wednesday, 16 August 2017

From Superfast to Ultrafast – Speedier Broadband on the Way

On the heels of BT offering to invest in the infrastructure needed to bring high-speed internet to those Britons who do not yet have it, researchers have announced the possibility of current technology becoming obsolete within a short amount of time. We aren't talking high-speed internet any more. We're not even talking superfast. Instead, we are now looking at ultrafast speeds measured in gigabits rather than megabits.

Ultrafast wi-fi has been on the radar for quite some time now. Until recently, though, making it happen has remained somewhat of a mystery. That mystery may have been solved by switching from traditional microwaves to terahertz waves. Researchers at Brown University School of Engineering in Providence, Rhode Island (USA) have demonstrated they can "transmit separate data streams on terahertz waves at very high speeds and with very low error rates," according to a report on the Telegraph website.

"This is the first time anybody has characterised a terahertz multiplex system using actual data," the researchers said in an official statement, "and our results show that our approach could be viable in future terahertz wireless networks."

What It Means to You

If you don't know the difference between a microwave and a terahertz wave, you are not alone. Here's what it means to you in simple terms: ultrafast internet access that could be upwards of 100 times faster than the best high-speed service now available. We are looking at speeds of 50 gigabits per second as opposed to 500 megabits per second, the highest speed available with state-of-the-art microwave technology.
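The 100-times figure follows directly from those two headline rates, treating both as bit rates, which is how the Brown results were reported:

```python
# Headline data rates, expressed in bits per second
terahertz_rate_bps = 50e9   # 50 gigabits per second (demonstrated terahertz rate)
microwave_rate_bps = 500e6  # 500 megabits per second (top microwave-based rate)

# Relative speed-up of terahertz over microwave transmission
speedup = terahertz_rate_bps / microwave_rate_bps
print(f"Terahertz links would be {speedup:.0f} times faster")  # prints 100
```

In other words, the "upwards of 100 times faster" claim is simple arithmetic once the two rates are put in the same units.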

If science is successful in developing terahertz applications, the implications of the new technology would be incredible. First and foremost, terahertz networks would bring to an end the very real danger of outstripping microwave capacity with current high-speed applications.

Secondly, we would be able to develop platforms capable of much higher data densities. Terahertz waves operate at higher frequencies than microwaves, and higher frequencies mean more data packed into the same stream.

Thirdly, a successful venture into terahertz technology would mean high definition streaming on-the-go for everything from live television to superfast data crunching for banks, businesses and other large consumers of data. That alone would do wonders for worldwide financial markets.

Proving It Works

Proof-of-concept experiments at Brown University involved two HD television broadcasts that were encoded on two different terahertz frequencies and then sent out across a wi-fi network together. Researchers obtained error-free results at 10 gigabits per second. Errors at 50 gigabits per second were only slight, and well within the range of standard error-correction systems.

From high-speed to superfast to ultrafast, the speeds at which we can send data through the air will only go up over the next several years. Imagine a wi-fi connection 100 times faster than the one you currently use. It is possible through terahertz waves, at least in principle. Now it is up to scientists to make terahertz technology viable for the mass market. It appears as though they are very much on course.

Tuesday, 8 August 2017

Statement of Intent: New UK Consumer Data Protection Rules to be Enforced

A recently issued statement of intent from the Department for Digital, Culture, Media & Sport aims to change the way those who collect data online use it for the foreseeable future. The statement outlines plans to initiate legislation that will further protect consumers against the misuse of their personal information, with or without their consent.

Among the new protections is greater control over data by individual consumers. As an example, consumers will be able to maximise the 'right to be forgotten' by requesting social media sites erase their information.

All the new data protection rules will be encapsulated in legislation to be known as the Data Protection Bill. The government says the bill will instil confidence in people that they have absolute control over their data. According to the statement of intent, research currently shows that up to 80% of the public lacks that confidence right now.

Shifting the Responsibility

Once the new legislation becomes law, it will shift the burden of responsibility from the consumer to the organisation that collects data. It will require organisations to obtain explicit consent for processing certain kinds of personal data. At the same time, it will eliminate the default tick boxes organisations currently use to obtain consent. It is the government's contention that such tick boxes are largely ignored.

The legislation will also:

  • Make it easier for consumers to withdraw consent
  • Give consumers the ability to request their personal data be erased
  • Give parents and guardians control over data collected from minors
  • Expand the definition of 'personal data' to include cookie information, IP addresses, and DNA information
  • Give consumers more power to force organisations to reveal what kinds of data they have
  • Make it easier for consumers to move data between organisations
  • Update and strengthen existing data protection laws to bring them in line with the state of the digital economy

The government is extremely serious about shifting responsibility from consumers to data collectors. They have created significant penalties for violators, and the Information Commissioner's Office (ICO) will be granted greater authority to enforce the rules. The statement of intent makes it clear that organisations will be held accountable for the data they gather.

"Our measures are designed to support businesses in their use of data, and give consumers the confidence that their data is protected and those who misuse it will be held to account," said Minister of State for Digital Matt Hancock.

Hancock went on to explain that the legislation will give consumers more control over their data while also preparing the UK for Brexit. On that latter point, the legislation brings UK consumer data protection laws in line with the EU's General Data Protection Regulation.

All that remains to be seen now is whether the final Data Protection Bill lives up to the promises of the statement of intent. If it does, the UK will have one of the strongest consumer data protection laws in the world.

Thursday, 3 August 2017

BT Counters Universal Service Obligation Mandate with Proactive Offer

If you had to choose between the government's proposed universal service obligation mandate and BT proactively building the necessary infrastructure to offer universal broadband, which would you choose? It is an interesting question that MPs and Ofcom now have to wrestle with. Thanks to a new offer by BT, the government may not have to implement the universal service obligation after all.

BT has proposed investing some £600 million in new infrastructure that will provide high-speed broadband to up to 99% of UK households by 2020. Universal access would be available by 2022.

BT defines high-speed broadband as a connection that gets at least 10Mbps. They say approval of their plan would mean most UK households getting broadband through fibre and fixed wireless technologies. Those that could not be reached through traditional infrastructure could be offered satellite broadband.

As an alternative, the government has already proposed the universal service obligation. Should they decide to implement it, every UK household without broadband would be able to request it beginning in 2020. BT would have no choice but to install the necessary infrastructure to meet that request.

Why Make the Offer?

It is a fascinating exercise to try to figure out why BT would make such an expensive offer. Agreeing to spend £600 million is an awfully pricey proposition when you consider that BT already provides high-speed broadband to 95% of UK households. To understand BT's move, it's important to understand how they will come up with the money.

Under the universal service obligation mandate, new broadband customers who get service after making a request could not be charged more than customers already receiving the same service. That would mean BT having to swallow the cost of building infrastructure on a case-by-case basis. Their proposal pays for the infrastructure in a different way.

According to the BBC, all of BT's current customers would pay for the construction through increased monthly bills. Those who eventually get access would then be paying the higher bills as well. From a business standpoint, it just makes better sense for BT to voluntarily create universal broadband access rather than being forced into it.

A Few Concerns

BT's proposal has thus far met with mostly positive reactions. However, there are a couple of concerns. First is the possibility that 10Mbps service would be obsolete before the company finished building the infrastructure. That would mean all of those newly connected households would once again be behind in a game of perpetual catch-up.

The second worry is obviously the higher cost to broadband customers. While the first concern can be addressed through well-planned technology upgrades, there's nothing anyone can do about the second. Someone has to pay to build and maintain the infrastructure. If customers don't pay for it directly, they will have to pay for it through tax contributions to cover government subsidies.

We shall see what the government decides to do with BT's proposal. Either way, universal broadband – even in the hardest to reach places – is the ultimate goal.

Wednesday, 26 July 2017

ICO: Don't Illegally Share Personal Information

Having access to the personal information of clients or customers is a privilege that allows businesses to stay in business. It is also a privilege that must be protected. According to the Information Commissioner's Office (ICO), there are businesses and individuals guilty of not protecting personal information. The ICO is now warning those who have access to personal information to be more careful.

The recent ICO warning comes on the heels of the successful prosecution of a recruitment manager who illegally disclosed personal information to a third-party recipient without the knowledge and consent of victims. The man, 39-year-old Stuart Franklin from the West Midlands, provided the information to a hiring agency while he was in the employ of HomeServe Membership Ltd.

An official report from the ICO says that Franklin sent copies of 26 CVs to a recruiting company during his time with HomeServe. Those electronic documents contained sensitive personal information on the individual applicants. Franklin had no legitimate business reason to do so, and he never sought the permission of the owners of that information.

After the successful prosecution based on S55 of the Data Protection Act, Franklin was ordered to pay a total of £994 covering his fine, court costs, and a victim surcharge. As for the ICO, Head of Enforcement Steve Eckersley produced a statement which said, in part:

"We're asking people to stop and think about the consequences before taking or sharing information illegally. Most people know it's wrong but they don't seem to realise it's a criminal offence and they could face prosecution."

The Human Factor: A Big One

Most of what we hear about in terms of data centre and network security is directly related to hacking by outside sources. That is a big problem indeed. But equally problematic is the human factor. As the Franklin case demonstrates, you do not need sophisticated hackers with equally sophisticated hacking tools to create a serious security breach that could ruin lives. Sometimes all it takes is a careless employee who passes along confidential information without giving it a second thought.

Organisations should absolutely make every effort to ensure networks and data are completely secure. Doing so goes beyond hiring competent IT staff and installing the right kind of hardware and software. It is also a matter of educating employees about their responsibilities for safeguarding personal information, then routinely updating training and conducting audits.

If we are to truly secure our data against theft and misappropriation, we all need to do a better job of protecting it with whatever means are available to us. Employees need to be careful about illegally sharing information they are not authorised to share. Individuals have to be more diligent about the information they share and the reasons for doing so.

In the meantime, the ICO is reminding employers and other organisations that passing along personal information belonging to someone else is not legal unless consent has been obtained and there is a legitimate business reason for doing so.

Tuesday, 6 June 2017

Budapest Convention to Change Digital Evidence Sharing Rules

When crimes are committed in Europe, police investigators are sometimes limited in the kinds of digital evidence they can collect and use for prosecutorial purposes. Despite the Budapest Convention on Cybercrime having been open for signature for the last 16 years, a lack of clear rules on how digital evidence can be used continues to be a problem for European police officials. Now the Convention aims to change that.

News reports say that the Convention is getting ready to sign a new deal that will make it a lot easier for police officials to collect, use and share digital evidence with other participating countries, even if that evidence does not reside on a server located within the borders of the investigating country.

Why the Changes Are Necessary

Being involved in the data centre sector, we are painfully aware of national laws that require operators in certain countries to make sure data belonging to domestic customers is stored only on domestic servers. We are constantly reminded about national laws requiring the security of that data. It is just part of the game.

Under the current rules, police officials have to be concerned about how digital evidence is shared across European borders. There are times when a police agency could freely access digital data in another country but fail to do so out of fears that such evidence would not be admissible in court. There are other times when accessing cross-border data is actually against the law.

In order to get around the rules, police agencies in member countries take advantage of what are known as Mutual Legal Assistance Treaties. However, going through the treaty process is painfully slow. It is so slow, in fact, that cases can fall apart while police agencies are waiting for approval to get the necessary evidence.

What the New Rules Do

If new rules are agreed upon without any changes to the current proposals, they will allow police agencies to speed up investigations through faster access to digital data. The rules cover everything from mobile phone use to e-mail to websites and social media. Essentially, any kind of data that can be transmitted online will be subject to better and faster collection by police agencies.

The rules will also put in place policies for reacting to emergency situations. The Budapest Convention is looking to the US for guidance here. That country already has emergency policies in place, policies that enabled France to quickly get information they needed during the Charlie Hebdo attack a couple of years ago.

Based on the known trouble that police agencies go through to collect and use digital evidence, it is quite obvious that some rule changes are needed. There is a danger, though. As America's NSA has proven, rules that are not carefully thought through to guard against the improper use of digital information can lead to all sorts of unintended spying. The Budapest Convention does need to act, but it needs to do so carefully and circumspectly.

Wednesday, 31 May 2017

Microsoft Looking at DNA Data Storage

How would you feel about donating some of your DNA to eventually be utilised as a personal storage space for all your digital data? The idea may seem a bit far-fetched, but Microsoft recently revealed that they are working on a system that, in theory, could make exactly what has just been described pretty routine at some point in the near future.

Microsoft has revealed a research project aimed at using strands of DNA for large-scale data storage. According to a report published by MIT Technology Review, the US-based software company expects to have a workable DNA data storage system in place by the end of this decade.

The system involves using individual strands of nucleic acids to store data as nucleic acid sequences in much the same way a magnetic strip stores data as sequences of positive and negative charges. The benefit of the DNA model is primarily one of capacity. As an example, a Harvard geneticist investigating the possibility of DNA data storage a number of years ago converted and stored his book on the subject using 55,000 DNA strands.

According to reports, a single gram of DNA is capable of holding 215 PB of data. For the record, a petabyte is 1 million GB. That is a tremendous volume of data stored on something incredibly small. At the rate data is exploding these days, we are going to need something that impressive just to keep up with it all.
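To put that density into perspective, here is the arithmetic behind the headline figure, using decimal units as in the report and comparing against ordinary one-terabyte drives for scale:

```python
# Reported DNA storage density, in decimal units
petabytes_per_gram = 215
gigabytes_per_petabyte = 1_000_000  # 1 PB = 1 million GB

# Total capacity of a single gram of DNA, in gigabytes
total_gb = petabytes_per_gram * gigabytes_per_petabyte

# Equivalent number of 1 TB consumer drives (1 TB = 1,000 GB)
equivalent_1tb_drives = total_gb / 1_000
print(f"{total_gb:,} GB per gram, about {equivalent_1tb_drives:,.0f} one-terabyte drives")
```

A single gram of DNA, on these figures, would stand in for hundreds of thousands of consumer hard drives.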

Overcoming Current Limits

Using nucleic acids to store digital data is very promising in that proof of concept has already been established. But like any new technology, it is too cost prohibitive to be mainstream at the current time. There are some inherent limits to DNA data storage that must be overcome before you and I will be donating our own DNA to the cause.

Right now, the biggest challenge seems to be speed. Sending data to the storage system has been as slow as 400 bytes per second. To come up with a workable solution the retail market could embrace, researchers have to reach at least 100 MB per second. And as every year ticks by, that target will increase alongside other technologies.
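The size of that gap is worth spelling out. Assuming decimal megabytes (1 MB = 1,000,000 bytes), the required improvement is a factor of a quarter of a million:

```python
# Reported and target DNA write speeds
current_write_rate = 400             # bytes per second (reported write speed)
target_write_rate = 100 * 1_000_000  # 100 MB/s, assuming decimal megabytes

# Factor by which write speed must improve to be retail-viable
required_speedup = target_write_rate / current_write_rate
print(f"Write speed must improve by a factor of {required_speedup:,.0f}")  # prints 250,000
```

That is why researchers talk about the end of the decade rather than next year: the write path needs to get roughly 250,000 times faster before the system is practical.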

The other big challenge is the price of materials. Researchers currently invest roughly £620,000 in the materials needed to build a DNA data storage and retrieval system. That cost is far too high for a feasible mass-market product. The price will have to come down to several hundred pounds, at the most, if the idea is ever to be marketable.

Human DNA has been storing critical data since the dawn of man. How ironic it would be if we could take something as fundamental to our existence as DNA and use it to store and retrieve digital information that is becoming equally critical to our everyday lives. Microsoft hopes to make it happen within the next few years.

Thursday, 25 May 2017

ICO to Look at Data Analytics in Politics

Big data is everywhere. If you do anything online, whether with a mobile phone or laptop computer, there are entities out there in the digital universe collecting data about you and analysing it for marketing purposes. There are also political entities making use of that data, according to the Information Commissioner's Office (ICO).  The ICO has therefore announced the start of a formal investigation with the intent to learn just how data analytics are used for political purposes.

An informal investigation was originally announced by the ICO earlier this year. According to Commissioner Elizabeth Denham, her office believes that what they have learned since March warrants a formal investigation now. Denham acknowledges that data analytics have a significant impact on individual privacy and, as such, people have a right to know how data is being used to influence votes.

"Having considered the evidence we have already gathered, I have decided to open a formal investigation into the use of data analytics for political purposes," Denham wrote in an official release. "This will involve deepening our current activity to explore practices deployed during the UK's EU Referendum campaign but potentially also in other campaigns."

The commissioner has indicated that her investigation will be ongoing even in the midst of campaigning for the upcoming snap General Election. She also maintains that her decision to launch a formal investigation has nothing to do with that election or its possible outcome.

What It All Means

Without coming out and saying so directly, the Government has taken the position that politics has become more orientated toward marketing in the digital age. Indeed, that is the entire point of big data anyway. Analysts gather as much data on individuals as they possibly can and then find ways to decipher and apply that data in order to be more effective in their outreach.

While big data is alive and well in all sorts of fields, it has only been perfected – at least as much as is possible right now – within the marketing environment. Therefore, it stands to reason that the ICO will be looking at data analytics from that standpoint. They want to know if politicians are marketing their messages to voters based on what they learn from data analytics.

Finding out that they are would not be much of a surprise. Politics has always been about messaging. What may be a surprise is the extent to which data analytics is being used. If it is determined that individuals or political campaigns are misusing data in order to target their messaging, there could be some significant consequences in the future.

At any rate, Denham also took the occasion of her official release to remind all political parties that their current activities in relation to the upcoming election must adhere to all applicable laws. The ICO offers updated guidance on political campaigning that parties can avail themselves of. As an individual, you are also welcome to download that guidance from the ICO website.

Tuesday, 16 May 2017

Fire Takes Out Aussie Data Centre and Disrupts Business

A data centre fire in southern Australia disrupted numerous businesses last week, including account access among customers of UniSuper, a superannuation provider with more than AU$56 billion in assets. Fortunately, no customer information was lost as a result of the failure, and the data centre was back online a day later.

The affected data centre remains undisclosed at this time, but news reports did identify it as a facility somewhere in the Port Melbourne area. Port Melbourne is a suburb of Melbourne in the state of Victoria. News reports also indicate that the data centre is in the same general vicinity as two companies in which UniSuper is heavily invested.

No Information on Cause:

As of the time of writing, the cause of the fire remains unknown, and it could be some time before that information is released. All that is known at this point is that the data centre caught fire and, in the aftermath, UniSuper and several other businesses suffered partial shut-downs. The fact that the centre resumed operation the following day indicates the fire was not as severe as it could have been.

Data centres the world over are equipped with fire suppression systems in order to minimise the damage fire and smoke could cause. These are chemical or water systems that can extinguish fires without damaging computer hardware. Presumably, it was such a system that saved the Australian data centre.

Unfortunately, fire suppression systems themselves do not always work. A number of years ago, a Romanian data centre operated by ING suffered extensive damage from a fire suppression system test. The system made such a loud boom that the sound waves actually damaged hardware!

Fire Is Always a Risk:

Those of us within the data centre community are fully aware that fire is always a risk. The general public, on the other hand, may not realise just how much of a problem fire can be. For starters, think about the tremendous amount of heat that data centres produce on a daily basis.

Data centres have to be kept cool because excess heat can damage sensitive network hardware. But, more importantly, allowing excess heat to build up could spark a catastrophic fire. The larger a data centre is, the greater the potential for fire if cooling solutions are not designed and implemented properly.

We have seen notable data centre fires all over the world in the past. In 2016, Ford experienced a fire at its US corporate headquarters in Dearborn, Michigan. A government data centre in Ottawa (Canada) also went down in 2016 after hardware suffered severe damage due to inexplicable smoke. And, of course, who can forget the 2015 fire in Azerbaijan that decimated the country's internet service.

Thankfully the data centre fire in Australia was not serious enough to cause widespread damage and knock out services for an extended period. Hopefully, facility owners will identify what caused the fire and take corrective action to prevent it from occurring in the future.

Wednesday, 10 May 2017

Barclays Announces New Cyber Crime Initiative

With cyber crime seemingly increasing on a daily basis, one UK high street bank has decided to fight back. Barclays has launched a new nationwide initiative designed to educate consumers, businesses and authorities in how cyber crimes are carried out and what can be done to prevent them. The initiative includes £10 million for an extensive advertising campaign throughout the UK.

According to Barclays, cyber crime in the form of digital fraud is at an all-time high. In fact, digital fraud now makes up at least half of the total crime reported in the UK. Barclays suspects the numbers could be even higher when one considers how often cyber crimes go unreported. The kinds of crimes that Barclays is referring to include things like scams and digital identity theft.

Surprisingly, older people are not the most vulnerable to cyber crimes involving digital fraud. According to Barclays, that distinction belongs to young people between the ages of 25 and 34. Even more surprising is that highly educated young people in the Greater London area are the most vulnerable group in the UK.

What Barclays Will Do:

It's clear that Barclays alone cannot make a dent in cyber crime and digital fraud. Real change will be the result of banks, businesses, authorities, and the public all working together. With that said, Barclays is committed to doing its part by way of their new Digital Safety initiative.

The first part of the initiative calls for giving Barclays customers more control over how their debit cards are used. Customers will be able to set their own daily withdrawal limits and turn remote purchasing capabilities on and off by way of the Barclays app. On the education front, Barclays has a lot planned.

They now offer an online quiz designed to help people understand their own level of risk. The quiz is followed by helpful tips designed to make individuals more secure based on their answers. Barclays is hoping to help as many as 3 million consumers with the quiz.

As previously mentioned, Barclays will invest £10 million in an advertising campaign that will involve billboards, printed adverts, TV, and online efforts. The ad campaign will target the most vulnerable demographics with essential information they need to understand and the precautions they should be taking.

An updated website will include 'fraud awareness takeovers' in order to promote fraud prevention. Barclays believes that it is more important to make people secure than to sell new products, so these new takeovers will replace many of the existing elements that currently market new products to consumers.

Lastly, Barclays will begin offering educational seminars and support clinics for both businesses and retail consumers. The company hopes to reach as many as one million small and medium-sized businesses with targeted educational opportunities designed to help them reduce their fraud risks.

It is clear that Barclays is serious about addressing cyber crime and digital fraud. Kudos to them for stepping up and committing themselves so extensively.


Wednesday, 3 May 2017

New Apple Data Centre Will Help Heat Homes

It is no secret that Apple is looking to be the dominant technology company where green energy is concerned. Their new corporate headquarters in Cupertino, California (USA) is already slated to run on 100% renewable energy and Apple has made great strides in using more environmentally friendly packaging. Now they have their eyes on a brand-new data centre being built in the Jutland region of Denmark, a data centre that will utilise green energy and recycle its excess heat to help keep local homes warm.

The data centre is being partly powered by recycling agricultural waste from local farms. Apple has partnered with Aarhus University to develop a system that converts the waste into methane gas by way of a biochemical 'digester'. The methane gas can then be harnessed and used to power the facility. What the digester leaves behind becomes fertiliser for local farms.

Apple also says that the data centre will put no stress on the local power grid. Instead, it will be powered by 100% renewable energy. As such, Apple is giving back to the community in multiple ways. It is a great partnership that will benefit local residents, businesses, farmers, the University, and even Apple itself.

A Company-Wide Goal:

We should not be surprised by what Apple is doing in Denmark. After all, the company has stated numerous times that they fully intend to eventually operate all their data centres on 100% renewable energy. All their existing data centres already use renewable power to one extent or another and Apple claims as many as 96% of them are already exclusively renewable.

The renewable energy goals are not what is so surprising about the Denmark project. Rather, it is remarkable that Apple will harness the excess heat their data centre produces and return it to the community as municipal heat for homes. Apple could just as easily have turned around and used that heat as another source of power on their own premises. Instead, the local community will benefit from it.

Apple is not alone in harnessing data centre heat for other purposes. There are others who use excess data centre heat to keep their own offices warm and still others who use it to generate the hot water their facilities need. And when you stop to think about it, heat recycling strategies make perfect sense.

Data centres are not only insatiable users of power; they also produce a tremendous amount of heat. There really is no viable reason to allow that heat to escape when it can be reclaimed for so many purposes. The fact that it has taken technology companies so long to get to this point is the only thing that really surprises us about heat recycling.

Apple's new Denmark data centre will be a model of renewable energy and recycling when it finally opens. Apple might be hard-pressed to call themselves the world leader in green technology at this moment in time, but they are certainly among the industry's major players.

Tuesday, 11 April 2017

Keeping Sensitive Data Hidden

Network troubleshooting, performance monitoring, and security are daily tasks in the data centre. Add data privacy and other regulations in the healthcare, government, education, finance and other sectors, and you add another level of complexity to your network monitoring. Network visibility solutions that recognise data patterns can help reduce business risk by inspecting packet payloads, providing insight into specific data patterns, masking data to improve privacy and support compliance with HIPAA1, PCI2 and internal best practices, and recognising patterns that trigger security alerts.

Pattern matching uses regular expressions to define search patterns. These patterns can then be used to find strings of characters in files, databases and network traffic. One of the earliest uses for pattern matching was text editing. A user could use a regular expression to search and replace a particular string throughout an entire document using a single command.

An example of a regular expression is “\b\d{5}\b”. This expression can be used to find any five-digit US zip code, such as 49017. The regular expression can be expanded to search for a nine-digit zip code like 49017-3822. The expanded version of the expression is “\b\d{5}-\d{4}\b”.
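As a quick sketch, both zip-code expressions behave as described when run through Python's standard `re` module (the sample strings are illustrative):

```python
import re

# Five-digit US zip code, bounded by word boundaries.
five_digit = re.compile(r"\b\d{5}\b")
# Expanded nine-digit (ZIP+4) form.
nine_digit = re.compile(r"\b\d{5}-\d{4}\b")

print(five_digit.findall("Ship to 49017, not 1234."))  # ['49017']
print(nine_digit.findall("Billing zip: 49017-3822"))   # ['49017-3822']
```

Note that the five-digit pattern alone would also match the first half of a nine-digit code, since the hyphen counts as a word boundary; the longer pattern should be tried first when both are in use.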

After a desired string of characters is matched by a regular expression, several types of actions can be taken. Depending on the system, these actions can include:

·        Generate an alert message
·        Highlight the data
·        Mask the data by replacing each of its characters with a different character
·        Remove the data altogether

An example use for masking data is complying with privacy regulations like HIPAA, which governs the handling of PHI. These regulations require companies and organizations to protect private information, such as social security numbers, credit card numbers, and health-related information.
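A minimal sketch of the masking action described above, using Python's `re` module and an illustrative US social security number pattern (the record string is invented for the example):

```python
import re

# Illustrative pattern for a US social security number (NNN-NN-NNNN).
ssn_pattern = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def mask(match: re.Match) -> str:
    # Replace every character of the match with 'X', preserving length
    # so the surrounding record layout is not disturbed.
    return "X" * len(match.group(0))

record = "Patient SSN: 123-45-6789, admitted 2017-03-01."
print(ssn_pattern.sub(mask, record))
# prints: Patient SSN: XXXXXXXXXXX, admitted 2017-03-01.
```

Length-preserving masking matters in the network context: a packet whose payload bytes are substituted one-for-one keeps its original size and checksummable structure, whereas outright removal would alter packet lengths.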

Pattern Matching Applications:

Today, pattern matching is used in numerous applications like text editing, compiling computer programs, and protecting private data during network monitoring activities.

Protecting private data while monitoring networks represents one of the growing uses for pattern matching. In order to solve a network problem, a troubleshooter must monitor network traffic and examine its packet headers (Ethernet header, IP header, etc.). However, the payload portion of a packet may include a person’s personal information that needs to be protected.

Pattern matching can be used to mask personal data in the payload portion of each packet prior to the packet being examined. This capability assists organizations with complying with regulations like HIPAA and with protecting PHI.

Another use for pattern matching is filtering. When a match occurs, the action can be to either drop the packet or pass it. This type of application is applicable when a virus or malware is identified in a packet. In some cases, the action may include dropping the entire network session.
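The drop-or-pass decision can be sketched as a simple predicate over the payload bytes (the malware signature here is purely hypothetical):

```python
import re

# Hypothetical byte signature of a known-bad payload.
malware_signature = re.compile(rb"EVILPAYLOAD")

def filter_packet(payload: bytes) -> bool:
    """Return True to pass the packet, False to drop it."""
    return malware_signature.search(payload) is None

print(filter_packet(b"GET /index.html HTTP/1.1"))  # True  (pass)
print(filter_packet(b"....EVILPAYLOAD...."))       # False (drop)
```

A real system would typically escalate from a single matched packet to state about the session, which is what allows the "drop the entire network session" action mentioned above.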

Typical Regular Expressions:

A typical regular expression library could include the ability to search for the following types of data:

·        Credit Card Numbers
·        Phone Numbers
·        Zip Code Numbers
·        Email Addresses
·        Postal Addresses

Typical Pattern Matching Features:

A user should easily be able to perform the following functions with a pattern matching system:

·        Have commonly used regular expressions available in a library.
·        Add additional regular expressions to the regular expression library by copying them from the plethora of expressions found on the Internet.
·        Test whether a regular expression matches a particular string without having to configure a network to send the string through the system.
·        Allow the user to mask data using a user selectable character.

APCON delivers a pattern matching feature as part of its network and security visibility solution. This allows the inspection of the packet payload to look for specific data patterns and masks the matched data, improving data privacy and supporting compliance to HIPAA, PCI and internal best practices. For an example of a network pattern matching system, check out Apcon’s new pattern matching feature on the HyperEngine packet processor blade or contact Kevin Copestake, UK & Ireland Sales Manager / +44 (0) 7834 868628 for more information.

Compliance Regulations
1Health Insurance Portability and Accountability Act (HIPAA)
2Protected Health Information (PHI)

Guest blog by APCON.

Wednesday, 5 April 2017

Edge Data Centres have arrived but how resilient are they?

The massive migration of critical applications from traditional data centres to the cloud has garnered much attention from analysts, industry observers, and data centre stakeholders.  However, as the great cloud migration transforms the data centre industry, a smaller, less noticed revolution has been taking place around the non-cloud applications that have been left behind. These “edge” applications have remained on-premise and, because of the nature of the cloud, the criticality of these applications has increased significantly.

Let me explain:  The centralized cloud was conceived for applications where timing wasn’t absolutely crucial.  As critical applications shifted to the cloud, it became apparent that latency, bandwidth limitations, security, and other regulatory requirements were placing limits on what could be placed in the cloud.  It was deemed, on a case-by-case basis, that certain existing applications (e.g. factory floor processing), and indeed some new emerging applications (like self-driving cars, smart traffic lights, and other “Internet of Things” high bandwidth apps), were more suited for remaining on the edge.

Considering the nature of these rapid changes, it is easy for some data centre planners to misinterpret the cloud trend and equate the decreased footprint and capacity of the on-premise data centre with a lower criticality.  In fact, the opposite is true.  Because of the need for a greater level of control, adherence to regulatory requirements, low latency, and connectivity, these new edge data centres need to be designed with criticality and high availability in mind.

The issue is that many downsized on-premise data centres are not properly designed to assume their new role as critical data outposts.  Most are organized as one or two servers housed within a wiring closet.  As such, these sites, as currently configured, are prone to system downtime and physical security risks, and therefore, require some rethinking.

Systems redundancy is also an issue.  With most of the applications living in the cloud, when that access point is down, employees cannot be productive.  The edge systems, when kept up and running during these downtime scenarios, help to bolster business continuity.

Steps that enhance edge resiliency:

In order to enhance critical edge application availability, several best practices are recommended:

Enhanced security – When you enter some of these server rooms and closets, you typically see unsecured entry doors and open racks (no doors). To enhance security, equipment should be moved to a locked room or placed within a locked enclosure.  Biometric access control should be considered.

For harsh environments, equipment should be secured in an enclosure that protects against dust, water, humidity, and vandalism.  Deploy video surveillance and 24 x 7 environmental monitoring.

Dedicated cooling – Traditional small rooms and closets often rely on the building’s comfort cooling system. This may no longer be enough to keep systems up and running.  Reassess cooling to determine whether proper cooling and humidification requires a passive airflow, active airflow, or a dedicated cooling approach.

DCIM management – These rooms are often left alone with no dedicated staff or software to manage the assets and to ensure downtime is avoided. Take inventory of the existing management methods and systems.  Consolidate to a centralized monitoring platform for all assets across these remote sites.  Deploy remote monitoring when human resources are constrained.

Rack management – Cable management within racks in these remote locations is often an after-thought, causing cable clutter, obstructions to airflow within the racks, and increased human error during adds/moves/changes. Modern racks, equipped with easy cable management options can lower unanticipated downtime risks.

Redundancy – Power (UPS, distribution) systems are often 1N in traditional environments which decreases availability and eliminates the ability to keep systems up and running when maintenance is performed. Consider redundant power paths for concurrent maintainability in critical sites.  Ensure critical circuits are on emergency generator.  Consider adding a second network provider for critical sites.  Organize network cables with network management cable devices (raceways, routing systems, and ties).  Label and color-code network lines to avoid human error.

A systematic approach to evaluating small remote data centres is necessary to ensure the greatest return on edge investments.  To learn more, download Schneider Electric White Paper 256, “Why Cloud Computing is Requiring us to Rethink Resiliency at the Edge”.  This paper reviews a simple method for organizing a scorecard that allows executives and managers to evaluate the resiliency of their edge environments.

Guest blog by Wendy Torell, Senior Research Analyst at Schneider Electric’s Data Center Science Centre

Tuesday, 28 March 2017

How Do We Balance Security with Personal Privacy?

As the whole world knows by now, March 22nd 2017 was a deadly day in London. A man identified as Khalid Masood drove a rental car onto the pavement as he crossed Westminster Bridge, purposely hitting pedestrians as he made his way directly to the Houses of Parliament, where he exited the vehicle and stabbed a police officer to death before being shot by other officers.

In the hours following the deadly incident, police investigators learned that Masood had used the WhatsApp messaging service minutes before beginning his rampage. Police do not know what was communicated due to end-to-end encryption that prevents them from seeing the actual contents of the communications. The incident itself - along with the encrypted posts – has, once again, led the UK government to raise the question of balancing security with privacy.

End-To-End Encryption Explained

Many popular mobile apps, including WhatsApp and iMessage, use end-to-end encryption by default. With this kind of encryption, a message is encrypted at its source, sent over the network, and then decrypted by the recipient device at the other end. The server that carries the data is unable to decrypt data because it does not have the shared key.
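As a toy illustration of the principle only (real messengers such as WhatsApp use the far more sophisticated Signal protocol, with key agreement and ratcheting), a symmetric key held solely by the two endpoints means the relaying server ever sees only ciphertext:

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # One-time-pad style XOR: applying the same key twice restores the data.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
# Shared key known only to sender and recipient, never to the server.
shared_key = secrets.token_bytes(len(message))

ciphertext = xor(message, shared_key)          # this is all the server relays
recovered = xor(ciphertext, shared_key)        # recipient decrypts at the far end
assert recovered == message
```

Because the server holds `ciphertext` but never `shared_key`, it has nothing useful to hand over, which is precisely the investigators' difficulty described above.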

The result of end-to-end encryption is that companies like Facebook and Apple can provide only limited amounts of data to police investigators. In the Masood case, the only way for investigators to know what he communicated was to break into his password-protected phone.

Security vs Privacy Conundrum

Government officials have made clear in the wake of this latest attack that they expect technology companies not to provide a means of online communication that cannot be accessed by authorities. Yet their calls for less secure systems fly in the face of demands that those same companies take every possible step to protect personal privacy. In essence, it would seem the government wants it both ways.

Some suggest that companies such as Facebook (owners of WhatsApp) and Apple are deploying end-to-end encryption in order to take themselves out of the equation when incidents like this occur. Whether that is true or not, they also say that making their hardware and software less secure gives their customers legitimate concerns about their own privacy.

If technology makers create an encryption system that can be accessed by authorities in the event of a crime or terrorist act, they have also created a system that can be accessed by hackers. Less secure means less secure across the board. You cannot make technology easier for authorities to access yet still more difficult for criminals and terrorists. It doesn't work that way.

The stark reality is that there is no way to balance security and personal privacy. They are weighted differently, depending on your perspective and your reasons for wanting them. In the end, one will always prevail over the other to some degree. So do we strive for greater security at the expense of personal privacy, or do we make sure privacy is still the primary concern?

Tuesday, 21 March 2017

Data Breaches Do Not Require Computers or Networks

We undeniably should be doing everything we can to prevent data breaches. But to expect that we'll ever reach a day when any and all data breaches are eliminated is unrealistic. The fact is that humans are imperfect creatures capable of making all kinds of mistakes. As a case in point, consider a recent £60,000 fine levied by the Information Commissioner's Office (ICO) against a local council that allowed a used cabinet to be sent to a second-hand shop with client files still inside.

On 20th March 2017, the ICO released a bulletin explaining that it had fined Norfolk County Council after a customer purchased a cabinet from a local second-hand shop only to discover case files still inside. Those case files contained sensitive information relating to seven children, according to the bulletin.

ICO Head of Enforcement Steve Eckersley wrote in the statement:

"Councils have a duty to look after any personal information they hold, all the more so when highly sensitive information is concerned – in particular about adults and children in vulnerable circumstances. For no good reason, Norfolk County Council appears to have overlooked the need to ensure it had robust measures in place to protect this information."

The ICO did not release a lot of details about the case, but these should be easy to deduce based on typical human behaviour. It is likely that council officials decided to dispose of the cabinet and assigned a low-level employee to clean it out in preparation for transfer. The employee failed to remove all the files from the cabinet before it left the council's facility.

Once at the second-hand shop, its employees also failed to thoroughly inspect the unit before putting it on the sale floor. It was purchased, taken home, and only then opened to reveal the case files.

Multiple Failures Along the Line

The point of our blog post is not to assign blame or to ridicule the County Council mentioned in any way. Rather, it is to show that there were multiple failures along the line that led to the new owner of the cabinet ultimately finding sensitive data. It is not unlike network data breaches that are the result of multiple failures.

In the Norfolk County Council case, the employee who cleaned out the cabinet failed to do so thoroughly. That was followed by an inadequate inspection by a member of management and those responsible for transporting the cabinet to the second-hand shop. Shop staff also failed in that they did not thoroughly inspect the cabinet prior to offering it for sale.

In the arena of network security, there are many more layers and a lot more hands buried deep in the security pie. Therefore, the potential for failure is increased. We are doing a very good job of protecting personal data stored on networks, and we must continue doing our best to improve that security; however, we are never going to eliminate breaches entirely. Unfortunately, failure is part of being human.

Wednesday, 15 March 2017

Record Fine Illustrates the Vulnerability of Information

Have you ever entered personal information into an online account without reading the fine print? Of course you have; we all do it from time to time. What you may not know is that located in all that fine print may be a sentence that says something like, 'you agree that we can share your information with third parties whose offers we think might interest you.'

Such statements act as digital confirmation that you are giving permission for your personal information to be sold to others. The sale of personal information is a serious problem, as demonstrated by a record fine just announced by the Information Commissioner's Office (ICO) against a company accused of making tens of millions of nuisance calls.

According to an ICO press release dated 9 March 2017, a Hampshire company trading as Media Tactics was found to have made 22 million nuisance calls using phone numbers purchased from other online entities that had collected the information. The company made nuisance calls covering a broad range of topics from debt management to personal injury claims.

"These 22 million pre-recorded calls will have left many people feeling frustrated," said the ICO's Steve Eckersley. "But some people found them alarming and distressing – we heard from one complainant who found the calls depressing and another who was too frightened to answer any calls at all."

According to the law, companies like Media Tactics can only place calls to people who have given their consent. Assuming their claims of purchasing phone numbers from other entities who obtained such consent are true, we have a much larger problem here than just one company making nuisance phone calls. We have the greater issue of online entities selling personal information indiscriminately.

When That Fine Line Is Crossed

The idea of selling personal information is nothing new, nor is it confined to the digital arena. Companies have been selling names, addresses and phone numbers since long before the internet age. But it seems in recent years we have crossed a line from which there appears to be no turning back.

Internet users have a right to expect privacy when they enter their personal information for the purposes of making a purchase, opening an online account or other such activities. Just because a company inserts a consent disclosure in the fine print does not absolve them from being guilty of crossing the line. Such entities may not be guilty of any criminal offence, but there is the ethical side of things to be concerned about.

The average consumer is left having to make a choice of not entering personal information into online accounts or doing so and hoping for the best. Remember, this is not a matter of security. Online entities voluntarily chose to sell information to Media Tactics, information that was collected legally and with alleged consent.

For the record, the ICO's fine against Media Tactics was £270,000. Hopefully, it will serve as a deterrent to other companies engaging in the same kinds of ethically-challenged tactics.

Tuesday, 7 March 2017

UK Government Embarks on New Digital Strategy

The UK has been a world leader in digital technology and the digital economy for a while now. More importantly, our position as a world leader is not something we have come by through mere accident or coincidence. It has been a concerted effort by government leaders and the private sector to build the infrastructure and business environment necessary to be a world leader. And now the government intends to go even further with a brand-new Digital Strategy for the UK.

A press release issued by the Culture Secretary on 1st March lays out plans by which the government hopes to ensure that Britain is the best place in the world to ‘start and grow a digital business’. Officially dubbed the 'Digital Strategy', the government plan calls for:

·        developing the skills, infrastructure and innovation necessary to support the digital economy in Britain;
·        developing new Digital Skills Partnerships for the purposes of creating digital training opportunities; and
·        supporting the digital sector through long-term investments intended to promote productivity and innovation.

The Culture Secretary estimates that the programme will create in the region of 4 million free digital skills training opportunities through partnerships between government, charities, volunteer organisations and the private sector. Training opportunities should ensure that there are enough skilled workers to support the growing number of digital businesses estimated to crop up over the next 5 to 10 years.

The Culture Secretary recognised three specific companies:

·        Lloyds Banking Group – Plans to train 2.5 million individuals, SMEs and charities.
·        Google – Plans to offer five hours of free digital skills training to individuals.
·        Barclays – Plans to train 45,000 young people in basic coding and as many as 1 million adults in general digital skills.

The Culture Secretary's press release indicates that the Digital Strategy is part of the larger Industrial Strategy the government is hoping will make Britain the most competitive nation in the world. If the strategy succeeds, Britain will be the place to both locate digital businesses and innovate technologies that will drive the future.

The Coming Digital Economy

Those of us already active in the digital world are not at all surprised by the government's action. As much as we rely on digital technology in the current day and age, the future looks even more digitally inclusive. Computers are getting more powerful, networks are expanding, and global communications are as robust as they have ever been. The coming digital economy of 2020 and beyond will make what we do today look pale by comparison.

The government and private sector businesses supporting the Digital Strategy are right to assume that our position as a world leader cannot be sustained without proper training and investment. They are making an effort to provide what the country needs to remain at the forefront. Indeed, the Digital Strategy could end up being one of the best things we have ever done to help ourselves on the world stage.