Wednesday, 20 September 2017

Sound: The Next Frontier for High Speed Computer Processing?

When most people think about sound in relation to high-speed networking, they think about the quality of sound embedded in their high-definition videos or streaming from their favourite music services. What if we told you that sound could be the next frontier in high-speed computer processing? Well, it looks like that might be the case in the not-too-distant future...

Breakthrough research out of the University of Sydney (Australia) has led to the development of a process capable of converting optical data into storable sound waves. The process makes it very possible that future iterations of network technology could store and send data as light waves to receivers that would then convert those light waves into sound waves that computer chips could interpret as meaningful data.

The idea of using light waves to store and transmit data is nothing new. Scientists have known about the potential for years. The problem has always been converting optical data into a format computer chips could work with, in a way that was both fast and efficient.

University of Sydney researchers described the problem of converting optical data into something more usable as comparable to the difference between thunder and lightning. In other words, the speed of light in air is more than 880,000 times faster than the speed of sound. Even if computer chips could read data in the optical domain, they would not be able to keep up with the speed at which that data travels. Thus, the need to slow it down during the conversion process.
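The researchers' thunder-and-lightning comparison is easy to check with back-of-the-envelope arithmetic. The figures below are approximate textbook values for air, not numbers taken from the study:

```python
# Approximate propagation speeds in air (sound at roughly 15 degrees C).
SPEED_OF_LIGHT_IN_AIR = 299_700_000  # metres per second
SPEED_OF_SOUND_IN_AIR = 340          # metres per second

ratio = SPEED_OF_LIGHT_IN_AIR / SPEED_OF_SOUND_IN_AIR
print(f"Light is roughly {ratio:,.0f} times faster than sound in air")
```

With these values the ratio comes out a little above 880,000, in line with the figure quoted by the researchers.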

Phononics Is the Answer

The field of storing and transmitting data in the audio domain is known as phononics. Both private enterprise and public institutions have been researching phononics for quite some time in the hope of developing some sort of technological advantage over the current process of converting optical data into electrons. The Australian researchers may finally have come up with the answer via phononics.

Current technologies that transmit optical data before converting it into electrons that can be read and stored by computer chips still produce pretty amazing results compared to our capabilities of a decade ago. The process has an inherent weakness though: it generates an incredible amount of waste heat. That waste heat limits the practical use of optical-electronic applications. Phononics promises to solve the problem.

The process developed by the Australian researchers eliminates waste heat by converting optical data to sound waves. More importantly, computer chips can more quickly read, analyse and use audio data as compared to electron data. Researchers say that the process has proved successful enough to open the door to more powerful signal processing and future quantum computing applications.

Those of us in the data centre industry eagerly await further development from Australia. If it turns out their process possesses both mass-market appeal and financial viability, it will completely transform our understanding and application of high-speed computer processing. There will be huge implications in everything from global financial transactions to scientific research to global media access.

Thursday, 14 September 2017

TPS Violation Costs Company £85,000

It is against the law to call consumers whose phone numbers are registered with the Telephone Preference Service (TPS) without explicit consent from such consumers. Being found in violation could cost tens of thousands of pounds, as one Dartford-based telephone company recently found out.

According to an 11th September release from the Information Commissioner's Office (ICO), True Telecom Ltd made nuisance phone calls to consumers for more than two years despite many of those they called being on the TPS list. More astoundingly, the company continued making the calls even after being warned by the ICO to cease. The calls were placed between April 2015 and April 2017, and during that time, more than 200 complaints were registered with the ICO.

The company attempted to mislead consumers by concealing the number it was calling from and by giving people the impression it was calling from the organisation known as BT Openreach. According to the ICO, True Telecom was unable to provide investigators with evidence that the people it called had consented.

The result of True Telecom's actions was an £85,000 fine and an official ICO enforcement notice informing True Telecom that it must stop making illegal phone calls immediately. Continuing the calls could result in court action down the road.

ICO Head of Enforcement Steve Eckersley said in his official statement that the rules pertaining to nuisance calls are clear and that his agency intends to enforce them. He went on to say:

"These calls are at best annoying and at worst downright distressing, and companies who pester the public in this way must understand they won't get away with it. The ICO will take action."

Respecting the Privacy of Consumers

Few would dispute that we live in a time when the information age has given way to the age of big data, or that we need rules and regulations in place to protect the privacy of consumers. The rules surrounding nuisance phone calls certainly apply: they were put in place to stop companies pestering consumers with repeated, unwanted phone calls.

It is all well and good that True Telecom has been investigated and fined for its illegal activity. But the ICO's report raises the question of how the company got the contact information it used to make the calls. If that information was supplied directly by the consumers themselves as a result of doing business with True Telecom, that is one thing. But if the information was sold to the company by another organisation, we are talking about an entirely different matter.

Could it be that it's time to start enacting rules that prevent companies from selling personal information? If we are truly trying to protect the privacy and security of consumers, why on earth do we allow personal information to be packaged and sold? It makes no sense. As long as we do nothing about this loophole, consumers will continue to be victimised by companies who care nothing about their privacy and security.

Tuesday, 5 September 2017

Accident Reveals Sensitive Information on Council Website

A consumer innocently browsing the internet accidentally stumbled across sensitive personal information left unsecured on a council website. This immediately raised concerns about how such data could be left out in the open, at the same time reminding organisations that no one is immune to breaches of data security. The revelation has also led to a substantial fine.

In a 31st August news release from the Information Commissioner's Office (ICO), it was revealed that Nottinghamshire County Council made protected data – including personal addresses, postcodes, and care requirements of the elderly and disabled – publicly available on an insecure site. The data was uncovered when a member of the public stumbled across it without the need to use a username and password to access the information.

ICO head of enforcement Steve Eckersley wrote in the news release:

"This was a serious and prolonged breach of the law. For no good reason, the council overlooked the need to put robust measures in place to protect people's personal information, despite having the financial and staffing resources available."

Eckersley went on to state that the actions of those responsible were both ‘unacceptable and inexcusable’ given the sensitivity of the data. The data pertained primarily to individuals who received services through the council's Homecare Allocation System (HCAS), first launched in 2011. The most egregious aspect of the mistake is that the information had been left unprotected for five years by the time it was discovered in June 2016.

Nottinghamshire County Council has been fined £70,000 by the ICO for its carelessness. It is not yet known whether the 3,000 people whose data was left unprotected suffered any negative consequences as a result.

Proof of the Need for Due Diligence

As an organisation involved in the data centre industry, it is apparent to us that Nottinghamshire County Council was extremely careless in its handling of the HCAS data. It also seems rather strange that the mistake went unnoticed for so long, given how much attention the ICO is supposed to give to matters of this sort. If anything, the story is further proof of the need for due diligence among those that store data, as well as the government agencies tasked with protecting the general public.

Whenever any online system is created for the purposes of collecting, storing and utilising personal data, a tremendous amount of responsibility comes with that data. There is never an excuse to allow any sort of sensitive data to be freely available to the general public without need for protected access.

The ICO news release says that Nottinghamshire County Council has ‘offered no mitigation’ to this point. Let's hope that this changes sooner rather than later. The public deserves to know how the Council responded to the original revelation and what procedures are now in place to make sure such exposure never occurs again. If we cannot trust those entrusted to protect us, our data security problems are much bigger than we realise.

Friday, 1 September 2017

Houston Data Centres: When Disaster Strikes

As everyone is no doubt aware by now, America's fourth-largest city, Houston, in Texas, was hit with a major Category 4 hurricane late last week. Though the storm was quickly downgraded after making landfall, it has still caused unprecedented damage through relentless rainfall that is likely to top 4 feet in some areas before it's done. US officials are now saying Houston and south-central Texas have never experienced an event like this.

Being in the data centre business, we are curious as to how the four major data centres located in Houston are faring today. There is both good and bad news.

The good news is that all four data centres are still operational and not reporting any damage. This is not unexpected, given that data centre designers build certain protections against natural disasters into their plans. But that brings us to the bad news: streets in Houston are closed, and many of those around the data centres are flooded. The power has stayed on at all four facilities thus far, however.

Running on Emergency Generators

There is no way to know when power would be restored until after the rain finally stops. It could be days, but it is more likely to take weeks. Data centre builders plan for such events by equipping facilities with generators, and all four Houston data centres are prepared to run their generators if necessary. The question is, will they actually have to?

Should they need the generators, the facilities would be depending on fuel deliveries to keep them going. Data centres are second in line for fuel deliveries behind hospitals, police stations and other first-responder facilities, but with the roads flooded, how long before the fuel trucks could actually make it through?

Preparing Customers for Data Recovery

No one could have predicted that Houston would get 4 feet of rainfall from Harvey. In fact, the storm exceeded all expectations normally associated with hurricanes coming off the Gulf of Mexico. Unfortunately, all the preparations in the world cannot account for everything. Knowing what they now know about the devastation, data centre officials are beginning to contact customers in the hope of helping them make data recovery plans should the unthinkable happen.

The lesson in all of this is that nature will do what it wants to do. Data centre designers and operators go to great lengths to protect their facilities against the most severe weather, but sometimes it's just not enough. Hopefully Houston's four data centres will not suffer any interruption of service. Whether they do or not, there will be plenty of lessons to be learned in the aftermath of Hurricane Harvey and its historic flooding.

Wednesday, 16 August 2017

From Superfast to Ultrafast – Speedier Broadband on the Way

On the heels of BT offering to invest in the infrastructure needed to bring high-speed internet to those Britons who do not yet have it, researchers have announced the possibility of current technology becoming obsolete within a short amount of time. We aren't talking high-speed internet any more. We're not even talking superfast. Instead, we are now looking at ultrafast speeds measured in gigabits rather than megabits.

Ultrafast wi-fi has been on the radar for quite some time now. Until recently, though, making it happen remained somewhat of a mystery. That mystery may have been solved by switching from traditional microwaves to terahertz waves. Researchers at Brown University's School of Engineering in Providence, Rhode Island (USA) have demonstrated that they can "transmit separate data streams on terahertz waves at very high speeds and with very low error rates," according to a report on the Telegraph website.

"This is the first time anybody has characterised a terahertz multiplex system using actual data," the researchers said in an official statement, "and our results show that our approach could be viable in future terahertz wireless networks."

What It Means to You

If you don't know the difference between a microwave and a terahertz wave, you are not alone. Here's what it means to you in simple terms: ultrafast internet access that could be upwards of 100 times faster than the best high-speed service now available. We are looking at speeds of 50 gigabits per second as opposed to 500 megabits per second, the highest speed available with state-of-the-art microwave technology.
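To put that hundredfold difference in perspective, here is a quick sketch of what each rate means for moving a 5 GB file, reading the quoted speeds as gigabits and megabits per second and assuming the full rate is sustained:

```python
FILE_SIZE_BITS = 5 * 8 * 10**9    # a 5 GB file expressed in bits

TERAHERTZ_RATE = 50 * 10**9       # 50 gigabits per second
MICROWAVE_RATE = 500 * 10**6      # 500 megabits per second

terahertz_seconds = FILE_SIZE_BITS / TERAHERTZ_RATE
microwave_seconds = FILE_SIZE_BITS / MICROWAVE_RATE
print(terahertz_seconds, microwave_seconds)  # under a second vs over a minute
```

The same transfer that takes well over a minute on today's fastest microwave links would complete in under a second at terahertz rates.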

If science is successful in developing terahertz applications, the implications of the new technology would be incredible. First and foremost, terahertz networks would bring to an end the very real danger of outstripping microwave capacity with current high-speed applications.

Secondly, we would be able to develop platforms capable of much higher data densities. Terahertz waves operate at higher frequencies than microwaves, and higher frequencies mean more data packed into the same stream.
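As a rough illustration of why frequency matters (the carrier values and usable fraction below are assumptions for the sketch, not figures from the research): if the usable bandwidth of a link is a fixed fraction of its carrier frequency, moving from a 5 GHz microwave carrier to a 1 THz carrier multiplies the available bandwidth two-hundredfold.

```python
MICROWAVE_CARRIER_HZ = 5 * 10**9    # 5 GHz, a common wi-fi band
TERAHERTZ_CARRIER_HZ = 1 * 10**12   # 1 THz

USABLE_FRACTION = 0.05  # assume the same fraction of each carrier is usable

microwave_bw = MICROWAVE_CARRIER_HZ * USABLE_FRACTION
terahertz_bw = TERAHERTZ_CARRIER_HZ * USABLE_FRACTION
print(terahertz_bw / microwave_bw)  # ratio of usable bandwidths
```

The exact fraction chosen does not matter: because it cancels out, the bandwidth advantage simply tracks the ratio of the carrier frequencies.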

Thirdly, a successful venture into terahertz technology would mean high definition streaming on-the-go for everything from live television to superfast data crunching for banks, businesses and other large consumers of data. That alone would do wonders for worldwide financial markets.

Proving It Works

Proof-of-concept experiments at Brown University involved two HD television broadcasts that were encoded on two different terahertz frequencies and then sent out across a wireless link together. Researchers obtained error-free results at 10 gigabits per second. Errors at 50 gigabits per second were only slight, and well within the range of standard error-correction systems.

From high-speed to superfast to ultrafast, the speeds at which we can send data through the air will only be going up over the next several years. Imagine a wi-fi connection 100 times faster than the one you currently use. It is possible through terahertz technology, at least in principle. Now it is up to scientists to make terahertz technology viable for the mass market. It appears as though they are very much on course.

Tuesday, 8 August 2017

Statement of Intent: New UK Consumer Data Protection Rules to be Enforced

A recently issued statement of intent from the Department for Digital, Culture, Media & Sport aims to change the way those who collect data online use it for the foreseeable future. The statement outlines plans to initiate legislation that will further protect consumers against the misuse of their personal information, with or without their consent.

Among the new protections is greater control over data by individual consumers. As an example, consumers will be able to maximise the 'right to be forgotten' by requesting social media sites erase their information.

All the new data protection rules will be encapsulated in legislation to be known as the Data Protection Bill. The government says the bill will instil confidence in people that they have absolute control over their data. According to the statement of intent, research currently shows that up to 80% of the public lacks that confidence right now.

Shifting the Responsibility

Once the new legislation becomes law, it will shift the burden of responsibility from the consumer to the organisation that collects data. It will require organisations to obtain explicit consent for processing certain kinds of personal data. At the same time, it will eliminate the default tick boxes organisations currently use to obtain consent. It is the government's contention that such tick boxes are largely ignored.

The legislation will also:

  • Make it easier for consumers to withdraw consent
  • Give consumers the ability to request their personal data be erased
  • Give parents and guardians control over data collected from minors
  • Expand the definition of 'personal data' to include cookie information, IP addresses, and DNA information
  • Give consumers more power to force organisations to reveal what kinds of data they have
  • Make it easier for consumers to move data between organisations
  • Update and strengthen existing data protection laws to bring them in line with the state of the digital economy
The government is extremely serious about shifting responsibility from consumers to data collectors. It has created significant penalties for violators, and the Information Commissioner's Office (ICO) will be granted greater authority to enforce the rules. The statement of intent makes it clear that organisations will be held accountable for the data they gather.

"Our measures are designed to support businesses in their use of data, and give consumers the confidence that their data is protected and those who misuse it will be held to account," said Minister of State for Digital Matt Hancock.

Hancock went on to explain that the legislation will give consumers more control over their data while also preparing the UK for Brexit. On that latter point, the legislation brings UK consumer data protection laws in line with the EU's General Data Protection Regulation.

All that remains to be seen now is whether the final Data Protection Bill lives up to the promises of the statement of intent. If it does, the UK will have one of the strongest consumer data protection laws in the world.

Thursday, 3 August 2017

BT Counters Universal Service Obligation Mandate with Proactive Offer

If you had to choose between the government's proposed universal service obligation mandate and BT proactively building the necessary infrastructure to offer universal broadband, which would you choose? It is an interesting question that MPs and Ofcom now have to wrestle with. Thanks to a new offer by BT, the government may not have to implement the universal service obligation after all.

BT has proposed investing some £600 million in new infrastructure that will provide high-speed broadband to up to 99% of UK households by 2020. Universal access would be available by 2022.

BT defines high-speed broadband as a connection that gets at least 10Mbps. They say approval of their plan would mean most UK households getting broadband through fibre and fixed wireless technologies. Those that could not be reached through traditional infrastructure could be offered satellite broadband.
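For a sense of what that 10Mbps floor means day to day, here is a rough sketch of the time needed to download a 4 GB HD film at that rate, assuming the full 10 megabits per second is actually sustained (real lines rarely manage this):

```python
FILM_SIZE_BITS = 4 * 8 * 10**9   # a 4 GB film expressed in bits
LINE_RATE = 10 * 10**6           # 10 megabits per second

seconds = FILM_SIZE_BITS / LINE_RATE
print(seconds / 60)  # download time in minutes
```

The result works out to just under an hour, which is usable but a long way from the superfast and ultrafast tiers discussed elsewhere on this blog.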

As an alternative, the government has already proposed the universal service obligation. Should they decide to implement it, every UK household without broadband would be able to request it beginning in 2020. BT would have no choice but to install the necessary infrastructure to meet that request.

Why Make the Offer?

It is a fascinating exercise to try to figure out why BT would make such an expensive offer. Agreeing to spend £600 million is an awfully pricey proposition when you consider that BT already provides high-speed broadband to 95% of UK households. To understand BT's move, it's important to understand how the company will come up with the money.

Under the universal service obligation mandate, new broadband customers who get service after making a request could not be charged more than customers already receiving the same service. That would mean BT having to swallow the cost of building infrastructure on a case-by-case basis. Their proposal pays for the infrastructure in a different way.

According to the BBC, all of BT's current customers would pay for the construction through increased monthly bills. Those who eventually get access would then be paying the higher bills as well. From a business standpoint, it just makes better sense for BT to voluntarily create universal broadband access rather than being forced into it.

A Few Concerns

BT's proposal has thus far met with mostly positive reactions. However, there are a couple of concerns. First is the possibility that 10Mbps service would be obsolete before the company finished building the infrastructure. That would mean all of those newly connected households would once again be behind in a game of perpetual catch-up.

The second worry is obviously the higher cost to broadband customers. While the first concern can be addressed through well-planned technology upgrades, there's nothing anyone can do about the second. Someone has to pay to build and maintain the infrastructure. If customers don't pay for it directly, they will have to pay for it through tax contributions to cover government subsidies.

We shall see what the government decides to do with BT's proposal. Either way, universal broadband – even in the hardest to reach places – is the ultimate goal.