Wednesday, 15 November 2017

Johannesburg Cable Heist: Money or Something Else?

Officials in Johannesburg, South Africa have been left scratching their heads following a brazen heist in which 2 million rand (£110,000) worth of power cables was stolen, in what some are calling an inside job. The theft occurred at a brand-new data centre in Braamfontein.

News sources say the data centre is a combination data and recovery centre designed to increase the server space and infrastructure necessary for the city to end its reliance on outside service providers. The city essentially wants to host its own data on city-owned servers powered by city infrastructure.

Those plans took a step back after burglars broke into the data centre through a ground-floor emergency exit, though there were no signs of forced entry. Once inside the building, the thieves broke into a room where contractors had been storing their tools. They used some of those tools to cut the cables they eventually stole.

Apparently, the cables were attached to new generators that contractors were testing. There was no loss of power, indicating that the generators were turned off prior to the theft. There were no reports detailing whether the generators were damaged or not. Investigators are now left to speculate as to the motive behind the theft.

Several Possibilities

The first assumption is that the thieves stole the cables for money. After all, they are worth more than £100,000. But how would the thieves off-load the stolen cables without being discovered? This is a question that investigators are still trying to answer. However, there is another possible motive...

In an official statement released after the burglary was discovered, Mayor Herman Mashaba indicated that the heist was an inside job given how little damage was done. He maintained that whoever stole the cables knew exactly what they were looking for and where to find them. He believes the theft may have had nothing to do with money.

Mayor Mashaba has suggested that the heist may have been intended to dissuade the city of Johannesburg from continuing to build, or at least to slow its progress. If the mayor is right, that would point to one of the companies currently providing data centre services to the city, since those companies stand to lose contracts if the city succeeds.

An Impressive Theft

Right now, there is no clear indication as to the motive behind the theft. Whether it was for money or competitive purposes, one thing is certain: the theft was a rather impressive event in terms of what it took to get in, find the tools, cut the cables and run.

The Mayor has made it clear that the theft will not deter his city's efforts to finish the data and operational centre. It is probably a safe bet that the city will beef up security until the centre is up and running, perhaps even beyond that.


Wednesday, 1 November 2017

WhatsApp and Facebook: Non-Compliance with EU?

Are WhatsApp and Facebook guilty of non-compliance with EU law? That is what a special task force wants to know, according to a 26 October 2017 story published by the BBC. The story says that a data protection task force has been established to examine data-sharing practices between WhatsApp and Facebook.

Facebook purchased the WhatsApp messaging app in 2014 in order to better compete against Microsoft and other rivals. At the time of purchase, company officials pledged to keep the two platforms completely independent from one another. That changed in 2016 when officials at WhatsApp announced plans in August to begin sharing user information with Facebook.

Under EU law, any such information sharing can only be conducted with the explicit consent of users. UK Information Commissioner Elizabeth Denham complained that WhatsApp's plan for obtaining user consent was insufficient to comply with the law. Still, WhatsApp and Facebook went ahead with their plans to share friend suggestions and advertising information across the two platforms.

Deficient User Consent

According to the BBC report, the Information Commissioner's new task force has invited officials from both WhatsApp and Facebook to meet with them. There is no word yet on whether they will attend. However, do not expect the Information Commissioner to go easy on Facebook and its subsidiary. People in positions of power are already unhappy, and that will not change unless WhatsApp and Facebook change what they are doing.

The BBC report cited a letter sent by the Working Party to WhatsApp officials. That letter apparently pointed out a number of deficiencies in WhatsApp's current user consent practices, including the following:

  • An unclear pop-up notice that does not fully explain that user information will be shared with Facebook;
  • A misleading implication that WhatsApp's privacy policy has been updated to ‘reflect new features’;
  • Requiring users to uncheck a pre-checked box that otherwise gives consent; and
  • A lack of any easy means for users to opt out of data sharing.

Greater Scrutiny of Digital Companies

The complaints against WhatsApp and Facebook come at a time when the EU is subjecting digital companies to greater scrutiny over privacy concerns. Whether WhatsApp and Facebook will face any real penalties for their alleged non-compliance remains to be seen, but the fact that a task force has been established shows that the government believes it has a fairly compelling case.

If the case goes against WhatsApp and Facebook, it could set the stage for other digital companies revamping their privacy policies. That is not necessarily a bad thing. We already know that people are rather careless about protecting their own data online, so it seems to make sense to implement privacy policies that protect users as much as possible, thereby forcing them to make a conscious decision to be less careless.

In the meantime, WhatsApp users should be aware of what the company is doing with their data. They are probably sharing it with Facebook.

Tuesday, 24 October 2017

London Embracing Square Mile Broadband Innovation

London's “Square Mile” city centre is a hotbed of economic activity and cultural development, yet it is not all that great when it comes to superfast broadband. London ranks 26th out of 33 European capitals for broadband speed, according to a recent report published by City A.M. But city officials intend to change that.

City A.M. reports that the City of London Corporation is on the cusp of launching a brand-new wi-fi network capable of achieving speeds as high as 180 Mbps within the Square Mile. If the plan comes to fruition, it will make London's city centre one of the fastest places in Europe for wi-fi internet access.

In addition, the government will be investing millions of pounds in the Square Mile over the next few years to upgrade fibre optic networks to deliver internet at 1 gigabit per second. City leaders also have their eyes firmly focused on 5G wireless, with the intent of ensuring that mobile data services are the fastest in the world.

By February, City of London Corporation chair Catherine McGuinness says, some 7,500 residents in 12 City Corporation housing estates will enjoy upgraded fibre optic broadband. London eventually expects to expand the faster broadband throughout the City's seven boroughs.

Broadband the Future of Communications

So why exactly is the City of London pouring so much money into broadband and mobile communications? In a phrase, it is the future of communications. The UK has long been a technology leader in broadband and data delivery services and city officials want London to be at the forefront in both the short and long terms. City leaders believe it is worth the money to develop broadband and mobile services in the Square Mile.

You could make the case that part of the recent push by the City of London Corporation is a direct result of 2016's Brexit vote, inasmuch as experts are warning of a business exodus from the capital once the UK pulls out of the EU. Whether that exodus actually occurs is of no consequence in this regard; the mere fear of one is enough to spur city leaders to do whatever they can to encourage businesses to stay in the city. If that means upgraded fibre optic broadband networks and faster wi-fi and mobile services, so be it.

Faster broadband and mobile services in the Square Mile will certainly benefit local residents and businesses, and the rest of the UK as well. Over time, what is implemented in the City of London will gradually spread across the entire UK. The only question is whether it will happen fast enough to make us the legitimate leader in Europe.

Whether or not it does, London's city leaders believe it is imperative to keep the Square Mile at the cutting edge of communications. They are backing up those beliefs with money; now we will see what that money buys. Hopefully it buys remarkably faster data services very soon.

Tuesday, 17 October 2017

Court Clears Way for New Apple Data Centre in Ireland

The Commercial Court with jurisdiction over County Galway in western Ireland recently dismissed two cases, clearing the way for Apple to take the next steps in developing a group of data centres planned for the county. Apple will spend upwards of €850 million (£762 million) to build the 8-facility campus.

News reports say that two lawsuits were brought against the project after the local Board gave its permission back in August. Commercial Court justice Paul McDermott rejected the lawsuits on different grounds. Apple may now proceed, though there is still no guarantee that the data centres will be built; other hurdles will have to be cleared.

Local Objections

The first lawsuit to challenge Apple's plan was brought by a local couple whose home is located near the proposed site. They claimed that Apple failed to carry out a proper environmental impact assessment, making the original Board decision invalid. Justice McDermott disagreed.

The second case was brought by another local resident who believed that proper planning procedures were not being followed. The plaintiff claimed not to be opposed to Apple's plans per se; rather, he was convinced that there were issues with the planning procedure. Apple maintained that the plaintiff had made no submissions to the Galway County Council in opposition to the project, nor had he appealed to the local Board. The Commercial Court sided with Apple.

Big Plans by Apple

Since the project was first proposed, Apple has had big plans for Galway. They have maintained all along that building the new data centres will add hundreds of jobs to the local area while also helping to meet the growing demand for data processing and storage in Ireland.

Apple has not detailed exactly what they plan to do with the data centre, but it is not beyond the realms of possibility to assume it could be a very important data processing hub for the British Isles, if not most of Western Europe. Some news reports have speculated that Apple wants to use the new facilities to power everything from the iTunes Store to iMessage throughout Europe.

Irish Minister for community development Seán Kyne greeted the Commercial Court ruling with delight, calling it "very positive news for Galway and the West of Ireland." He and some 4,000 local members of an Apple Facebook page are encouraged by the ruling, especially given that the project has been delayed numerous times over the past two years.

It is understandable that there are objections whenever a data centre of this size is proposed. However, the courts have to be very careful about ruling based on public opinion. The digital world is expanding exponentially with every passing quarter and we are going to need a lot more data centres in the very near future to keep up with demand. Unless the world is ready to go back to the pre-digital era, both consumers and courts have to be willing to allow data centres to be built.

Wednesday, 4 October 2017

New Microsoft Data Centre Powered by Natural Gas

No matter what you think of Microsoft software and licensing, it is hard to argue against the fact that the American company is among a small handful of technology leaders paving the way to a greener future. The latest iteration of Microsoft's efforts in the green arena comes by way of a brand-new data centre – they are calling it a 'lab' instead – powered entirely by natural gas.

Built in Seattle in the United States, Microsoft's Advanced Energy Lab is a new kind of data centre designed around Microsoft's earlier 'Tent City' concept. What makes the lab so unique is that it was built from the ground up to be completely separate from grid infrastructure. Microsoft officials say this is a distinct difference, inasmuch as other efforts to power data centres with renewable energy have been pursued in parallel with grid energy. Microsoft wanted to be the first to come up with a design that required absolutely no power from the grid.

Natural Gas and Fuel Cells

The Advanced Energy Lab powers its servers with energy derived from natural gas. Servers are hooked directly to a natural gas connection that utilises highly efficient fuel cells for power. The fuel cells convert energy from the gas into electricity for both server power and cooling. The benefits to this design are numerous:

  • Keeping power separate from the grid allows the data centre to continue operating even if the surrounding grid goes down due to natural disaster or infrastructure failure
  • The system is more efficient because it reduces the waste and loss of traditional grid distribution, transmission and conversion
  • The design is a comparatively simple one as well, reducing the likelihood of failure by reducing the number of 'moving parts' in the system
  • Data centres based on this design will cost less to build, operate and maintain across-the-board

Microsoft began working on the lab in earnest after developing a partnership with the National Fuel Cell Research Centre in 2013. Their first promising breakthrough came in 2014, when a pilot project proved that fuel cells do not necessarily require clean natural gas to work. The pilot showed that biogas, a renewable fuel, works just as effectively.

According to Microsoft, the Advanced Energy Lab encapsulates everything the company has learned thus far about natural gas and fuel cells working in tandem to generate electricity. In the coming months and years, they will be refining the technology with the goal of eventually putting it into service.

Microsoft eventually hopes to put together an energy-independent, green and efficient data centre, capable of meeting our ever-expanding data needs without having any negative impact on the environment. It would appear as though the Advanced Energy Lab is a rather large step in that direction. Where they go from here is anyone's guess, but you can bet whatever Microsoft does will probably break new ground. If nothing else, it will be fascinating to watch…

Wednesday, 27 September 2017

New Undersea Cable Comes at a Critical Time

Hurricane Sandy and the damage it unleashed on the US north-east coast in 2012 were a wake-up call for Microsoft. In the aftermath of the storm, international network communications were disrupted for hours as a result of damage to undersea cables terminating in New Jersey. Officials at Microsoft suddenly realised it was not wise to rely on a single East Coast location for incoming data. They set out to change that.

Microsoft has since teamed up with Facebook and Telxius, a Spanish technology company and subsidiary of Telefónica, to lay a brand-new subsea cable linking the US and Spain. They have named the cable Marea. At either end are the now sister cities of Virginia Beach, Virginia in the US and Bilbao in Spain. The two locations were chosen because they are well south of existing connection points in both countries, increasing the likelihood that a natural disaster will not simultaneously knock out both Marea and the other connections farther north.

According to Microsoft president Brad Smith, the new cable "comes at a critical time." The cable is capable of carrying 160 terabits per second which, according to a Microsoft blog, is 16 million times faster than the average residential internet connection. At a time when the amount of data flowing across the world's networks is increasing exponentially, the new cable offers more than just redundancy. It also increases capacity.
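Taking those figures at face value, a quick back-of-the-envelope calculation shows what the "16 million times faster" multiplier implies about the average residential connection. This is a minimal sketch; the implied 10 Mbps figure is our own inference, not a number from the source.

```python
# Back-of-the-envelope check of the Marea figures quoted above.
cable_bps = 160e12        # 160 terabits per second, in bits per second
multiplier = 16e6         # "16 million times faster"

# Implied speed of the average residential connection:
residential_bps = cable_bps / multiplier
print(residential_bps / 1e6, "Mbps")  # → 10.0 Mbps
```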

"Submarine cables in the Atlantic already carry 55 per cent more data than trans-Pacific routes and 40 per cent more data than between the U.S. and Latin America," Smith said. "There is no question that the demand for data flows across the Atlantic will continue to increase and Marea will provide a critical connection for the United States, Spain and beyond."

More Connected Every Day

It is interesting to note that Microsoft and Facebook are normally internet rivals competing for the largest possible market share. The fact that they teamed up along with Telxius is proof of just how vital a new undersea cable was and is. There is no denying that we are becoming a more connected world with each passing day, and we are going to need ever greater capacity and redundancy to keep up as technology evolves.

Marea is a highly advanced data cable that is remarkably different in a number of ways. Most notably, the technology behind the cable involves a highly flexible design that will allow for better future scalability and modification. Right now, it also represents an entirely new and previously unused route for transatlantic data between Europe and North America. Increased capacity is critical in a day and age in which mobility and the Internet of Things are both growing exponentially.

Marea has come at a critical time. It was obvious following Sandy that something had to be done to increase capacity and redundancy. Had Microsoft and its two partners not acted when they did, someone else probably would have picked up the baton. At any rate, global communications will be the main beneficiary.

Wednesday, 20 September 2017

Sound: The Next Frontier for High Speed Computer Processing?

When most people think about sound in relation to high-speed networking, they think about the quality of sound embedded in their high-definition videos or streaming from their favourite music services. What if we told you that sound could be the next frontier in high-speed computer processing? Well, it looks like that might be the case in the not-too-distant future...

Breakthrough research out of the University of Sydney (Australia) has led to the development of a process capable of converting optical data into storable sound waves. The process makes it very possible that future iterations of network technology could store and send data as light waves to receivers that would then convert those light waves into sound waves that computer chips could interpret as meaningful data.

The idea of using light waves to store and transmit data is nothing new. Scientists have known about the potential for years. The problem has always been converting optical data into a format computer chips could work with, in a way that was both fast and efficient.

University of Sydney researchers described the problem of converting optical data into something more usable as comparable to the difference between thunder and lightning. The speed of light in air is roughly 880,000 times the speed of sound. Even if computer chips could read data in the optical domain, they could not keep up with the speed at which that data travels; hence the need to slow it down during the conversion process.
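As a rough sanity check of that thunder-and-lightning comparison, the ratio can be computed from textbook values. The figures below are approximations; the exact ratio depends on temperature and humidity.

```python
# Ratio of the speed of light in air to the speed of sound in air.
speed_of_light_air = 2.997e8   # m/s, approximate
speed_of_sound_air = 343.0     # m/s, at about 20 °C

ratio = speed_of_light_air / speed_of_sound_air
print(f"light is ~{ratio:,.0f} times faster than sound")
```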

Phononics Is the Answer

The field of storing and transmitting data in the audio domain is known as phononics. Both private enterprise and public institutions have been researching phononics for quite some time in the hope of developing some sort of technological advantage over the current process of converting optical data into electrons. The Australian researchers may finally have come up with the answer via phononics.

Current technologies that transmit optical data before converting it into electrons that computer chips can read and store still produce pretty amazing results compared to our capabilities of a decade ago. The process has an inherent weakness, though: it produces an incredible amount of waste heat. That waste heat limits the practical use of optical-electron applications. Phononics may solve the problem.

The process developed by the Australian researchers eliminates waste heat by converting optical data to sound waves. More importantly, computer chips can more quickly read, analyse and use audio data as compared to electron data. Researchers say that the process has proved successful enough to open the door to more powerful signal processing and future quantum computing applications.

Those of us in the data centre industry eagerly await further development from Australia. If it turns out their process possesses both mass-market appeal and financial viability, it will completely transform our understanding and application of high-speed computer processing. There will be huge implications in everything from global financial transactions to scientific research to global media access.

Thursday, 14 September 2017

TPS Violation Costs Company £85,000

It is against the law to call consumers whose phone numbers are registered with the Telephone Preference Service (TPS) without explicit consent from such consumers. Being found in violation could cost tens of thousands of pounds, as one Dartford-based telephone company recently found out.

According to an 11th September release from the Information Commissioner's Office (ICO), True Telecom Ltd made nuisance phone calls to consumers for more than two years despite many of those they called being on the TPS list. More astoundingly, the company continued making the calls even after being warned by the ICO to cease. The calls were placed between April 2015 and April 2017, and during that time, more than 200 complaints were registered with the ICO.

The company attempted to mislead consumers by concealing the number they were calling from and by giving people the impression they were calling from BT Openreach. According to the ICO, True Telecom was unable to provide investigators with evidence that consumers had consented to the calls.

The result of True Telecom's actions was an £85,000 fine and an official ICO enforcement notice informing True Telecom that it must stop making illegal phone calls immediately. Continuing the calls could result in court action down the road.

ICO Head of Enforcement Steve Eckersley said in his official statement that the rules pertaining to nuisance calls are clear and that his agency intends to enforce them. He went on to say:

"These calls are at best annoying and at worst downright distressing, and companies who pester the public in this way must understand they won't get away with it. The ICO will take action."

Respecting the Privacy of Consumers

Few would dispute that we live in a time when the information age has given way to the age of big data. Fewer still would dispute that we need rules and regulations in place to protect the privacy of consumers. The rules surrounding nuisance phone calls certainly qualify: they were put in place to prevent companies from pestering consumers with repeated, unwanted phone calls.

It is all well and good that True Telecom has been investigated and fined for its illegal activity. But the ICO's report raises the question of how the company obtained the contact information it used to make the calls. If that information was supplied directly by the consumers themselves in the course of doing business with True Telecom, that is one thing. If it was sold to the company by another organisation, that is an entirely different matter.

Could it be that it's time to start enacting rules that prevent companies from selling personal information? If we are truly trying to protect the privacy and security of consumers, why on earth do we allow personal information to be packaged and sold? It makes no sense. As long as we do nothing about this loophole, consumers will continue to be victimised by companies who care nothing about their privacy and security.

Tuesday, 5 September 2017

Accident Reveals Sensitive Information on Council Website

A consumer innocently browsing the internet accidentally stumbled across sensitive personal information left unsecured on a council website. This immediately raised concerns about how such data could be left out in the open, at the same time reminding organisations that no one is immune to breaches of data security. The revelation has also led to a substantial fine.

In a 31st August news release from the Information Commissioner's Office (ICO), it was revealed that Nottinghamshire County Council made protected data – including personal addresses, postcodes, and care requirements of the elderly and disabled – publicly available on an insecure site. The data was uncovered when a member of the public stumbled across it without the need to use a username and password to access the information.

ICO head of enforcement Steve Eckersley wrote in the news release:

"This was a serious and prolonged breach of the law. For no good reason, the council overlooked the need to put robust measures in place to protect people's personal information, despite having the financial and staffing resources available."

Eckersley went on to state that the actions of those responsible were both ‘unacceptable and inexcusable’ given how sensitive the data is. The data pertained primarily to individuals receiving services under the council's Homecare Allocation System (HCAS), first launched in 2011. The most egregious aspect of the mistake is that the information had been left unprotected for five years by the time it was discovered in June 2016.

Nottinghamshire County Council has been fined £70,000 by the ICO for its carelessness. It is not yet known whether the 3,000 people whose data was left unprotected suffered any negative consequences as a result.

Proof of the Need for Due Diligence

As an organisation involved in the data centre industry, we find it apparent that Nottinghamshire County Council was extremely careless in its handling of the HCAS data. It also seems rather strange that the mistake went unnoticed for so long, given how much attention the ICO is supposed to give to matters of this sort. If anything, the story is further proof of the need for due diligence among those that store data, as well as the government agencies tasked with protecting the general public.

Whenever any online system is created for the purposes of collecting, storing and utilising personal data, a tremendous amount of responsibility comes with that data. There is never an excuse to allow any sort of sensitive data to be freely available to the general public without need for protected access.

The ICO news release says that Nottinghamshire County Council has ‘offered no mitigation’ to this point. Let's hope that this changes sooner rather than later. The public deserves to know how the Council responded to the original revelation and what procedures are now in place to make sure such exposure never occurs again. If we cannot trust those entrusted to protect us, our data security problems are much bigger than we realise.

Friday, 1 September 2017

Houston Data Centres: When Disaster Strikes

As everyone is no doubt aware by now, America's fourth-largest city, Houston, Texas, was hit by a major Category 4 hurricane late last week. Though the storm was quickly downgraded after making landfall, it has still caused unprecedented damage through relentless rainfall that is likely to top 4 feet in some areas before it is done. US officials are now saying Houston and south-central Texas have never experienced an event like this.

Being in the data centre business, we are curious as to how the four major data centres located in Houston are faring today. There is both good and bad news.

The good news is that all four data centres are still operational and not reporting any damage. This is not unexpected, given that data centre designers build certain protections against natural disasters into their plans. But that brings us to the bad news: streets in Houston are closed, and many of those around the data centres are flooded. The power has stayed on at all four facilities thus far, however.

Running on Emergency Generators

There is no way to know when power will be restored until after the rain finally stops. It could be days, but it is more likely to take weeks. Data centre builders plan for such events by equipping facilities with generators, and all four Houston data centres are prepared to run theirs if necessary. The question is: will they actually have to?

Should they need the generators, the facilities would be depending on fuel deliveries to keep them going. Data centres are second in line for fuel deliveries behind hospitals, police stations and other first-responder facilities, but with the roads flooded, how long before the fuel trucks could actually make it through?

Preparing Customers for Data Recovery

No one could have predicted that Houston would get 4 feet of rainfall from Harvey. In fact, the storm exceeded all expectations normally associated with hurricanes coming off the Gulf of Mexico. Unfortunately, all the preparations in the world cannot account for everything. Knowing what they now know about the devastation, data centre officials are beginning to contact customers in the hope of helping them make data recovery plans should the unthinkable happen.

The lesson in all of this is that nature will do what it wants to do. Data centre designers and operators go to great lengths to protect their facilities against the most severe weather, but sometimes it is just not enough. Hopefully, Houston's four data centres will not suffer any interruption of service. Whether they do or not, there will be plenty of lessons to be learned in the aftermath of Hurricane Harvey and its historic flooding.

Wednesday, 16 August 2017

From Superfast to Ultrafast – Speedier Broadband on the Way

On the heels of BT offering to invest in the infrastructure needed to bring high-speed internet to those Britons who do not yet have it, researchers have announced the possibility of current technology becoming obsolete within a short amount of time. We aren't talking high-speed internet any more. We're not even talking superfast. Instead, we are now looking at ultrafast speeds measured in gigabits rather than megabits.

Ultrafast wi-fi has been on the radar for quite some time now. Until recently, though, making it happen has remained somewhat of a mystery. That mystery may have been solved by switching from traditional microwaves to terahertz waves. Researchers at Brown University School of Engineering in Providence, Rhode Island (USA) have demonstrated they can "transmit separate data streams on terahertz waves at very high speeds and with very low error rates," according to a report on the Telegraph website.

"This is the first time anybody has characterised a terahertz multiplex system using actual data," the researcher said in an official statement, "and our results show that our approach could be viable in future terahertz wireless networks."

What It Means to You

If you don't know the difference between microwaves and terahertz waves, you are not alone. Here's what it means to you in simple terms: ultrafast internet access that could be upwards of 100 times faster than the best high-speed service now available. We are looking at speeds of 50 gigabits per second, as opposed to the 500 megabits per second available with state-of-the-art microwave technology.
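Assuming both of the quoted figures are in bits per second, the arithmetic behind the "100 times faster" claim is straightforward:

```python
# Comparing the quoted terahertz and microwave speeds.
terahertz_bps = 50e9    # 50 gigabits per second
microwave_bps = 500e6   # 500 megabits per second

speedup = terahertz_bps / microwave_bps
print(speedup)  # → 100.0
```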

If science is successful in developing terahertz applications, the implications of the new technology would be incredible. First and foremost, terahertz networks would bring to an end the very real danger of outstripping microwave capacity with current high-speed applications.

Secondly, we would be able to develop platforms capable of much higher data densities. Terahertz waves operate at higher frequencies than microwaves, and higher frequencies mean more data packed into the same stream.

Thirdly, a successful venture into terahertz technology would mean high definition streaming on-the-go for everything from live television to superfast data crunching for banks, businesses and other large consumers of data. That alone would do wonders for worldwide financial markets.

Proving It Works

Proof-of-concept experiments at Brown University involved two HD television broadcasts that were encoded on two different terahertz frequencies and then sent out across a wi-fi network together. Researchers obtained error-free results at 10 Gbps. Errors at 50 Gbps were only slight and well within the range of standard error correction systems.

From high-speed to superfast to ultrafast, the speeds at which we can send data through the air will only be going up over the next several years. Imagine a wi-fi connection 100 times faster than you currently use. It is possible through terahertz; at least in principle. Now it is up to scientists to make terahertz technology viable for the mass market. It appears as though they are very much on course.

Tuesday, 8 August 2017

Statement of Intent: New UK Consumer Data Protection Rules to be Enforced

A recently issued statement of intent from the Department for Digital, Culture, Media & Sport aims to change the way those who collect data online use it for the foreseeable future. The statement outlines plans to initiate legislation that will further protect consumers against the misuse of their personal information, with or without their consent.

Among the new protections is greater control over data by individual consumers. As an example, consumers will be able to maximise the 'right to be forgotten' by requesting social media sites erase their information.

All the new data protection rules will be encapsulated in legislation to be known as the Data Protection Bill. The government says the bill will instil confidence in people that they have absolute control over their data. According to the statement of intent, research currently shows that up to 80% of the public lacks that confidence right now.


Shifting the Responsibility

Once the new legislation becomes law, it will shift the burden of responsibility from the consumer to the organisation that collects data. It will require organisations to obtain explicit consent for processing certain kinds of personal data. At the same time, it will eliminate the default tick boxes organisations currently use to obtain consent. It is the government's contention that such tick boxes are largely ignored.

The legislation will also:


  • Make it easier for consumers to withdraw consent
  • Give consumers the ability to request their personal data be erased
  • Give parents and guardians control over data collected from minors
  • Expand the definition of 'personal data' to include cookie information, IP addresses, and DNA information
  • Give consumers more power to force organisations to reveal what kinds of data they have
  • Make it easier for consumers to move data between organisations
  • Update and strengthen existing data protection laws to bring them in line with the state of the digital economy

The government is extremely serious about shifting responsibility from consumers to data collectors. They have created significant penalties for violators, and the Information Commissioner's Office (ICO) will be granted greater authority to enforce the rules. The statement of intent makes it clear that organisations will be held accountable for the data they gather.

"Our measures are designed to support businesses in their use of data, and give consumers the confidence that their data is protected and those who misuse it will be held to account," said Minister of State for Digital Matt Hancock.

Hancock went on to explain that the legislation will give consumers more control over their data while also preparing the UK for Brexit. On that latter point, the legislation brings UK consumer data protection laws in line with the EU's General Data Protection Regulation.

All that remains to be seen now is whether the final Data Protection Bill lives up to the promises of the statement of intent. If it does, the UK will have one of the strongest consumer data protection laws in the world.

Thursday, 3 August 2017

BT Counters Universal Service Obligation Mandate with Proactive Offer

If you had to choose between the government's proposed universal service obligation mandate and BT proactively building the necessary infrastructure to offer universal broadband, which would you choose? It is an interesting question that MPs and Ofcom now have to wrestle with. Thanks to a new offer by BT, the government may not have to implement the universal service obligation after all.

BT has proposed investing some £600 million in new infrastructure that will provide high-speed broadband to up to 99% of UK households by 2020. Universal access would be available by 2022.

BT defines high-speed broadband as a connection that gets at least 10Mbps. They say approval of their plan would mean most UK households getting broadband through fibre and fixed wireless technologies. Those that could not be reached through traditional infrastructure could be offered satellite broadband.

As an alternative, the government has already proposed the universal service obligation. Should they decide to implement it, every UK household without broadband would be able to request it beginning in 2020. BT would have no choice but to install the necessary infrastructure to meet that request.

Why Make the Offer?

It is a fascinating exercise to try to figure out why BT would make such an expensive offer. Agreeing to spend £600 million is an awfully pricey proposition when you consider that BT already provides high-speed broadband to 95% of UK households. To understand BT's move, it's important to understand how they will come up with the money.

Under the universal service obligation mandate, new broadband customers who get service after making a request could not be charged more than customers already receiving the same service. That would mean BT having to swallow the cost of building infrastructure on a case-by-case basis. Their proposal pays for the infrastructure in a different way.

According to the BBC, all of BT's current customers would pay for the construction through increased monthly bills. Those who eventually get access would then be paying the higher bills as well. From a business standpoint, it just makes better sense for BT to voluntarily create universal broadband access rather than being forced into it.

A Few Concerns

BT's proposal has thus far met with mostly positive reactions. However, there are a couple of concerns. First is the possibility that 10Mbps service would be obsolete before the company finished building the infrastructure. That would mean all of those newly connected households would once again be behind in a game of perpetual catch-up.

The second worry is obviously the higher cost to broadband customers. While the first concern can be addressed through well-planned technology upgrades, there's nothing anyone can do about the second. Someone has to pay to build and maintain the infrastructure. If customers don't pay for it directly, they will have to pay for it through tax contributions to cover government subsidies.

We shall see what the government decides to do with BT's proposal. Either way, universal broadband – even in the hardest to reach places – is the ultimate goal.

Wednesday, 26 July 2017

ICO: Don't Illegally Share Personal Information

Having access to the personal information of clients or customers is a privilege that allows businesses to stay in business. It is also a privilege that must be protected. According to the Information Commissioner's Office (ICO), there are businesses and individuals guilty of not protecting personal information. The ICO is now warning those who have access to personal information to be more careful.

The recent ICO warning comes on the heels of the successful prosecution of a recruitment manager who illegally disclosed personal information to a third-party recipient without the knowledge and consent of victims. The man, 39-year-old Stuart Franklin from the West Midlands, provided the information to a hiring agency while he was in the employ of HomeServe Membership Ltd.

An official report from the ICO says that Franklin sent copies of 26 CVs to a recruiting company during his time with HomeServe. Those electronic documents contained sensitive personal information on the individual applicants. Franklin had no legitimate business reason to do so, and he never sought the permission of the owners of that information.

After the successful prosecution based on S55 of the Data Protection Act, Franklin was ordered to pay a total of £994 covering his fine, court costs, and a victim surcharge. As for the ICO, Head of Enforcement Steve Eckersley produced a statement which said, in part:

"We're asking people to stop and think about the consequences before taking or sharing information illegally. Most people know it's wrong but they don't seem to realise it's a criminal offence and they could face prosecution."

The Human Factor: A Big One

Most of what we hear about in terms of data centre and network security is directly related to hacking by outside sources. That is a big problem indeed. But equally problematic is the human factor. As the Franklin case demonstrates, you do not need sophisticated hackers with equally sophisticated hacking tools to create a serious security breach that could ruin lives. Sometimes all it takes is a careless employee who passes along confidential information without giving it a second thought.

Organisations should absolutely make every effort to ensure networks and data are completely secure. Doing so goes beyond hiring competent IT staff and installing the right kind of hardware and software. It is also a matter of educating employees about their responsibilities for safeguarding personal information, then routinely updating training and conducting audits.

If we are to truly secure our data against theft and misappropriation, we all need to do a better job of protecting it with whatever means are available to us. Employees need to be careful about illegally sharing information they are not authorised to share. Individuals have to be more diligent about the information they share and the reasons for doing so.

In the meantime, the ICO is reminding employers and other organisations that passing along personal information belonging to someone else is not legal unless consent has been obtained and there is a legitimate business reason for doing so.

Tuesday, 6 June 2017

Budapest Convention to Change Digital Evidence Sharing Rules

When crimes are committed in Europe, police investigators are sometimes limited in the kinds of digital evidence they can collect and use for prosecutorial purposes. Despite the Budapest Convention on Cybercrime being opened and maintained for the last 16 years, a lack of clear rules relating to how digital evidence can be used continues to be a problem for European police officials. Now the Convention aims to change that.

News reports say that the Convention is getting ready to sign a new deal that will make it a lot easier for police officials to collect, use and share digital evidence with other participating countries, even if that evidence does not reside on a server located within the borders of the investigating country.

Why the Changes Are Necessary:

Being involved in the data centre sector, we are painfully aware of national laws that require operators in certain countries to make sure data belonging to domestic customers is stored only on domestic servers. We are constantly reminded about national laws requiring the security of that data. It is just part of the game.

Under the current rules, police officials have to be concerned about how digital evidence is shared across European borders. There are times when a police agency could freely access digital data in another country but fail to do so out of fears that such evidence would not be admissible in court. There are other times when accessing cross-border data is actually against the law.

In order to get around the rules, police agencies in member countries take advantage of what are known as Mutual Legal Assistance Treaties. However, going through the treaty process is painfully slow. It is so slow, in fact, that cases can fall apart while police agencies are waiting for approval to get the necessary evidence.

What the New Rules Do:

If new rules are agreed upon without any changes to the current proposals, they will allow police agencies to speed up investigations through faster access to digital data. The rules cover everything from mobile phone use to e-mail to websites and social media. Essentially, any kind of data that can be transmitted online will be subject to better and faster collection by police agencies.

The rules will also put in place policies for reacting to emergency situations. The Budapest Convention is looking to the US for guidance here. That country already has emergency policies in place, policies that enabled France to quickly get information they needed during the Charlie Hebdo attack a couple of years ago.

Based on the known trouble that police agencies go through to collect and use digital evidence, it is quite obvious that some rule changes are needed. There is a danger though. As America's NSA has proven, not carefully thinking through the rules to account for the possibility of digital information being used improperly can lead to all sorts of unintentional spying. The Budapest Convention does need to act, but they need to do so carefully and circumspectly.


Wednesday, 31 May 2017

Microsoft Looking at DNA Data Storage

How would you feel about donating some of your DNA to eventually be used as personal storage space for all your digital data? The idea may seem a bit far-fetched, but Microsoft recently revealed that it is working on a system that, in theory, could make exactly that scenario routine at some point in the near future.

Microsoft has revealed a research project aimed at using strands of DNA for large-scale data storage. According to a report published by MIT Technology Review, the US-based software company expects to have a workable DNA data storage system in place by the end of this decade.

The system involves using individual strands of nucleic acids to store data as nucleic acid sequences in much the same way a magnetic strip stores data as sequences of positive and negative charges. The benefit of the DNA model is primarily one of capacity. As an example, a Harvard geneticist investigating the possibility of DNA data storage a number of years ago converted and stored his book on the subject using 55,000 DNA strands.

According to reports, a single gram of DNA is capable of holding 215 PB of data. For the record, a petabyte is 1 million GB. That is a tremendous volume of data stored on something incredibly small. At the rate data is exploding these days, we are going to need something that impressive just to keep up with it all.


Overcoming Current Limits:

Using nucleic acids to store digital data is very promising in that proof of concept has already been established. But like any new technology, it is currently too cost-prohibitive to go mainstream. There are some inherent limits to DNA data storage that must be overcome before you and I will be donating our own DNA to the cause.

Right now, the biggest challenge seems to be speed. Sending data to the storage system has been as slow as 400 bytes per second. In order to come up with a workable solution that could be embraced by the retail market, researchers have to reach at least 100 MB per second. And as every year ticks by, that number will increase alongside other technologies.
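The scale of both figures is easy to sanity-check with a quick back-of-the-envelope calculation in Python. The numbers below are simply those quoted in this post, not official Microsoft specifications:

```python
# Capacity: 215 PB per gram of DNA, where 1 PB = 1,000,000 GB
petabytes_per_gram = 215
gigabytes_per_gram = petabytes_per_gram * 1_000_000
print(f"{gigabytes_per_gram:,} GB per gram")  # 215,000,000 GB per gram

# Speed gap: writing at 400 bytes/s versus the 100 MB/s target
current_rate = 400                # bytes per second
target_rate = 100 * 1_000_000     # 100 MB/s in bytes per second
print(f"Speed-up needed: {target_rate / current_rate:,.0f}x")  # 250,000x
```

In other words, write speeds would have to improve by a factor of roughly a quarter of a million before DNA storage could approach retail viability.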

The other big challenge is the price of materials. Researchers currently invest roughly £620,000 in the materials needed to build a DNA data storage and retrieval system. That cost is way too much to make for a feasible mass market product. The price will have to be reduced to several hundred dollars, at the most, if the idea is to ever be marketable.

Human DNA has been storing critical data since the dawn of man. How ironic it would be if we could take something as fundamental to our existence as DNA and use it to store and retrieve digital information that is becoming equally critical to our everyday lives. Microsoft hopes to make it happen within the next few years.



Thursday, 25 May 2017

ICO to Look at Data Analytics in Politics

Big data is everywhere. If you do anything online, whether with a mobile phone or laptop computer, there are entities out there in the digital universe collecting data about you and analysing it for marketing purposes. There are also political entities making use of that data, according to the Information Commissioner's Office (ICO).  The ICO has therefore announced the start of a formal investigation with the intent to learn just how data analytics are used for political purposes.

An informal investigation was originally announced by the ICO earlier this year. According to Commissioner Elizabeth Denham, her office believes that what they have learned since March warrants a formal investigation now. Denham acknowledges that data analytics have a significant impact on individual privacy and, as such, people have a right to know how data is being used to influence votes.

"Having considered the evidence we have already gathered, I have decided to open a formal investigation into the use of data analytics for political purposes," Denham wrote in an official release. "This will involve deepening our current activity to explore practices deployed during the UK's EU Referendum campaign but potentially also in other campaigns."

The commissioner has indicated that her investigation will be ongoing even in the midst of campaigning for the upcoming snap General Election. She also maintains that her decision to launch a formal investigation has nothing to do with that election or its possible outcome.


What It All Means

Without coming out and saying so directly, the Government has taken the position that politics has become more orientated toward marketing in the digital age. Indeed, that is the entire point of big data anyway. Analysts gather as much data on individuals as they possibly can and then find ways to decipher and apply that data in order to be more effective in their outreach.

While big data is alive and well in all sorts of fields, it has only been perfected – at least as much as is possible right now – within the marketing environment. Therefore, it stands to reason that the ICO will be looking at data analytics from that standpoint. They want to know if politicians are marketing their messages to voters based on what they learn from data analytics.

Finding out they do would not be much of a surprise. Politics has always been about messaging. What may be a surprise is the extent to which data analytics is being used. If it is determined that individuals or political campaigns are misusing data in order to target their messaging, there could be some significant consequences in the future.

At any rate, Denham also took the occasion of her official release to remind all political parties that their current activities in relation to the upcoming election must adhere to all applicable laws. The ICO offers updated guidance on political campaigning that parties can avail themselves of. As an individual, you are also welcome to download that guidance from the ICO website.

Tuesday, 16 May 2017

Fire Takes Out Aussie Data Centre and Disrupts Business

A data centre fire in southern Australia disrupted numerous businesses last week, including account access among customers of UniSuper, a superannuation provider with more than AUS $56 billion in assets. Fortunately, no customer information was lost as a result of the failure and the data centre was back online a day later.

The affected data centre remains undisclosed at this time, but news reports did identify it as a facility somewhere in the Port Melbourne area. Port Melbourne is a suburb of Melbourne in the state of Victoria. News reports also indicate that the data centre is in the same general vicinity as two companies in which UniSuper is heavily invested.


No Information on Cause:

At the time of this writing, the cause of the fire remains unknown, and it could be some time before that information is released. All that is known at this point is that the data centre caught fire and, in the aftermath, UniSuper and several other businesses suffered partial shut-downs. The fact that the centre resumed operation the following day indicates the fire was not as severe as it could have been.

Data centres the world over are equipped with fire suppression systems in order to minimise the damage fire and smoke could cause. These are chemical or water systems that can extinguish fires without damaging computer hardware. It is presumably such a system that saved the Australian data centre.

Unfortunately, fire suppression systems themselves do not always work. A number of years ago, a Romanian data centre operated by ING suffered extensive damage from a fire suppression system test. The system made such a loud boom that the sound waves actually damaged hardware!


Fire Is Always a Risk:

Those of us within the data centre community are fully aware that fire is always a risk. The general public, on the other hand, may not realise just how much of a problem fire can be. For starters, think about the tremendous amount of heat that data centres produce on a daily basis.

Data centres have to be kept cool because excess heat can damage sensitive network hardware. But, more importantly, allowing excess heat to build up could spark a catastrophic fire. The larger a data centre is, the greater the potential for fire if cooling solutions are not designed and implemented properly.

We have seen notable data centre fires all over the world in the past. In 2016, Ford experienced a fire at its US corporate headquarters in Dearborn, Michigan. A government data centre in Ottawa (Canada) also went down in 2016 after hardware suffered severe damage due to inexplicable smoke. And, of course, who can forget the 2015 fire in Azerbaijan that decimated the country's internet service.

Thankfully the data centre fire in Australia was not serious enough to cause widespread damage and knock out services for an extended period. Hopefully, facility owners will identify what caused the fire and take corrective action to prevent it from occurring in the future.

Wednesday, 10 May 2017

Barclays Announces New Cyber Crime Initiative

With cyber crime seemingly increasing on a daily basis, one UK high street bank has decided to fight back. Barclays has launched a new nationwide initiative designed to educate consumers, businesses and authorities in how cyber crimes are carried out and what can be done to prevent them. The initiative includes £10 million for an extensive advertising campaign throughout the UK.

According to Barclays, cyber crime in the form of digital fraud is at an all-time high. In fact, digital fraud now makes up at least half of the total crime reported in the UK. Barclays suspects the numbers could be even higher when one considers how often cyber crimes go unreported. The kinds of crimes that Barclays is referring to include things like scams and digital identity theft.

Surprisingly, older people are not the most vulnerable to cyber crimes involving digital fraud. According to Barclays, that distinction belongs to young people between the ages of 25 and 34. Even more surprising is that highly educated young people in the Greater London area are the most vulnerable group in the UK.


What Barclays Will Do:

It's clear that Barclays alone cannot make a dent in cyber crime and digital fraud. Real change will be the result of banks, businesses, authorities, and the public all working together. With that said, Barclays is committed to doing its part by way of their new Digital Safety initiative.

The first part of the initiative calls for giving Barclays customers more control over how their debit cards are used. Customers will be able to set their own daily withdrawal limits and turn remote purchasing capabilities on and off by way of the Barclays app. On the education front, Barclays has a lot planned.

They now offer an online quiz designed to help people understand their own level of risk. The quiz is followed by helpful tips designed to make individuals more secure based on their answers. Barclays is hoping to help as many as 3 million consumers with the quiz.

As previously mentioned, Barclays will invest £10 million in an advertising campaign that will involve billboards, printed adverts, TV, and online efforts. The ad campaign will target the most vulnerable demographics with essential information they need to understand and the precautions they should be taking.

An updated website will include 'fraud awareness takeovers' in order to promote fraud prevention. Barclays believes that it is more important to make people secure than to sell new products, so these new takeovers will replace many of the existing elements that currently market new products to consumers.

Lastly, Barclays will begin offering educational seminars and support clinics for both businesses and retail consumers. The company hopes to reach as many as one million small and medium-sized businesses with targeted educational opportunities designed to help them reduce their fraud risks.

It is clear that Barclays is serious about addressing cyber crime and digital fraud. Kudos to them for stepping up and committing themselves so extensively.

Wednesday, 3 May 2017

New Apple Data Centre Will Help Heat Homes

It is no secret that Apple is looking to be the dominant technology company where green energy is concerned. Their new corporate headquarters in Cupertino, California (USA) is already slated to run on 100% renewable energy and Apple has made great strides in using more environmentally friendly packaging. Now they have their eyes on a brand-new data centre being built in the Jutland region of Denmark, a data centre that will utilise green energy and recycle its excess heat to help keep local homes warm.

The data centre is being partly powered by recycling agricultural waste from local farms. Apple has partnered with Aarhus University to develop a system that converts the waste into methane gas by way of a biochemical 'digester'. The methane gas can then be harnessed and used to power the facility. What the digester leaves behind becomes fertiliser for local farms.

Apple also says that the data centre will put no stress on the local power grid. Instead, it will be powered by 100% renewable energy. As such, Apple is giving back to the community in multiple ways. It is a great partnership that will benefit local residents, businesses, farmers, the University, and even Apple itself.


A Company-Wide Goal:

We should not be surprised by what Apple is doing in Denmark. After all, the company has stated numerous times that they fully intend to eventually operate all their data centres on 100% renewable energy. All their existing data centres already use renewable power to one extent or another and Apple claims as many as 96% of them are already exclusively renewable.

The renewable energy goals are not what is so surprising about the Denmark project. Rather, it is remarkable that Apple will harness the excess heat their data centre produces and return it to the community as municipal heat for homes. Apple could just as easily have turned around and used that heat as another source of power on their own premises. Instead, the local community will benefit from it.

Apple is not alone in harnessing data centre heat for other purposes. There are others who use excess data centre heat to keep their own offices warm and still others who use it to generate the hot water their facilities need. And when you stop to think about it, heat recycling strategies make perfect sense.

Data centres are not only insatiable users of power; they also produce a tremendous amount of heat. There really is no viable reason to allow that heat to escape when it can be reclaimed for so many purposes. The fact that it has taken technology companies so long to get to this point is the only thing that really surprises us about heat recycling.

Apple's new Denmark data centre will be a model of renewable energy and recycling when it finally opens. Apple might be hard-pressed to call themselves the world leader in green technology at this moment in time, but they are certainly among the industry's major players.

Tuesday, 11 April 2017

Keeping Sensitive Data Hidden

Network troubleshooting, performance monitoring, and security are daily tasks in the data centre. Data privacy and other regulations in the healthcare, government, education and finance sectors add another level of complexity to network monitoring. Network visibility solutions that recognise data patterns can help reduce business risk: they inspect the packet payload, provide insights on specific data patterns, mask data to improve privacy and support compliance with HIPAA1, PCI2 and internal best practices, and recognise patterns that trigger security alerts.

Pattern matching uses regular expressions to define search patterns. These patterns can then be used to find strings of characters in files, databases and network traffic. One of the earliest uses for pattern matching was text editing. A user could use a regular expression to search and replace a particular string throughout an entire document using a single command.

An example of a regular expression is “\b\d{5}\b”. This expression can be used to find any five-digit US zip code, such as 49017. It can be expanded to search for a nine-digit zip code like 49017-3822; the expanded version of the expression is “\b\d{5}-\d{4}\b”.
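As a concrete illustration, both expressions behave exactly as described when used with Python's standard re module (a minimal sketch; the sample string is invented for illustration):

```python
import re

# Five-digit US zip code, and the expanded nine-digit (ZIP+4) form
five_digit = re.compile(r"\b\d{5}\b")
nine_digit = re.compile(r"\b\d{5}-\d{4}\b")

print(five_digit.search("Battle Creek, MI 49017").group())       # 49017
print(nine_digit.search("Battle Creek, MI 49017-3822").group())  # 49017-3822
```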

After a desired string of characters is matched by a regular expression, several types of actions can be taken. Depending on the system, these actions can include:

  • Generate an alert message
  • Highlight the data
  • Mask the data by replacing each of its characters with a different character
  • Remove the data altogether

An example use for masking data is complying with privacy regulations like HIPAA and PCI. These regulations require companies and organisations to protect private information, such as social security numbers, credit card numbers, and health-related information.
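A simple sketch of the masking action described above: the snippet below (a generic illustration, not any particular vendor's implementation) replaces every character of a matched US Social Security number with 'X' before the text is passed on:

```python
import re

# Pattern for a US Social Security number, e.g. 123-45-6789
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def mask_ssns(text: str, mask_char: str = "X") -> str:
    """Replace each character of any matched SSN with the mask character."""
    return SSN_PATTERN.sub(lambda m: mask_char * len(m.group()), text)

print(mask_ssns("Applicant SSN: 123-45-6789"))
# Applicant SSN: XXXXXXXXXXX
```

Because the replacement preserves the length of the match, downstream tools still see a correctly sized field while the sensitive value itself never leaves the system.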


Pattern Matching Applications:

Today, pattern matching is used in numerous applications like text editing, compiling computer programs, and protecting private data during network monitoring activities.

Protecting private data while monitoring networks represents one of the growing uses for pattern matching. In order to solve a network problem, a troubleshooter must monitor network traffic and examine its packet headers (e.g. Ethernet header, IP header, etc.). However, the payload portion of a packet may include a person’s personal information that needs to be protected.

Pattern matching can be used to mask personal data in the payload portion of each packet before the packet is examined. This capability helps organizations comply with regulations like HIPAA and protect PHI.

Another use for pattern matching is filtering. When a match occurs, the action can be to either drop the packet or pass it. This is useful when a virus or malware signature is identified in a packet; in some cases, the action may be to drop the entire network session.
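A toy drop-or-pass filter along these lines could be sketched as follows; the signature list (here the well-known EICAR antivirus test string) and the filter_packet helper are illustrative, not any particular product's API:

```python
import re

# Illustrative signature list: the EICAR antivirus test string.
SIGNATURES = [re.compile(re.escape("EICAR-STANDARD-ANTIVIRUS-TEST-FILE"))]

def filter_packet(payload: bytes) -> str:
    """Return 'drop' if any signature matches the payload, else 'pass'."""
    text = payload.decode("latin-1", errors="replace")
    for sig in SIGNATURES:
        if sig.search(text):
            return "drop"
    return "pass"

print(filter_packet(b"GET /index.html HTTP/1.1"))                # pass
print(filter_packet(b"xEICAR-STANDARD-ANTIVIRUS-TEST-FILEx"))    # drop
```

In a real visibility appliance this matching is typically done in hardware at line rate; the sketch only shows the drop-or-pass decision logic.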


Typical Regular Expressions:

A typical regular expression library could include the ability to search for the following types of data:

·        Credit Card Numbers
·        Phone Numbers
·        Zip Code Numbers
·        Email Addresses
·        Postal Addresses

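Such a library could be sketched as a small dictionary of compiled expressions. The patterns below are deliberately simplified for illustration; production-grade credit card and postal address patterns are considerably more involved:

```python
import re

# A minimal illustrative pattern library; real deployments use far more
# thorough expressions (card-issuer prefixes, international formats, etc.).
PATTERN_LIBRARY = {
    "credit_card": re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"),
    "us_phone":    re.compile(r"\b\d{3}-\d{3}-\d{4}\b"),
    "zip_code":    re.compile(r"\b\d{5}(?:-\d{4})?\b"),
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def find_all(text: str) -> dict:
    """Return every library pattern that matches somewhere in the text."""
    return {name: pat.findall(text)
            for name, pat in PATTERN_LIBRARY.items() if pat.search(text)}
```

For example, find_all("Call 555-123-4567 or email bob@example.com") would report hits for the phone and email patterns only.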

Typical Pattern Matching Features:

A user should easily be able to perform the following functions with a pattern matching system:

·        Have commonly used regular expressions available in a library.
·        Add regular expressions to the library, including any of the many expressions published on the Internet.
·        Test whether a regular expression matches a particular string without having to configure a network to send the string through the system.
·        Allow the user to mask data using a user selectable character.

APCON delivers a pattern matching feature as part of its network and security visibility solution. It inspects the packet payload for specific data patterns and masks the matched data, improving data privacy and supporting compliance with HIPAA, PCI and internal best practices. For an example of a network pattern matching system, check out APCON’s new pattern matching feature on the HyperEngine packet processor blade, or contact Kevin Copestake, UK & Ireland Sales Manager (kevin.copestake@apcon.com / +44 (0) 7834 868628), for more information.


Compliance Regulations
1Health Insurance Portability and Accountability Act (HIPAA)
2Payment Card Industry Data Security Standard (PCI DSS)
Protected Health Information (PHI)

Guest blog by APCON.  For a link to the original blog plus related diagrams, please visit https://www.apcon.com/blog-entry/keeping-sensitive-data-hidden