Wednesday, 20 December 2017


The uptake of digital technology, the government’s upcoming Industrial Strategy and strong export demand all add up to an expanding manufacturing sector here in the UK. However, this increase in demand will no doubt lead to added pressure on UK power supply, so it becomes more important than ever to have robust power infrastructure in place.
Downtime can come at a significant cost for manufacturers, with some statistics showing that just one unplanned event can cost in the region of £1.6m.
What’s more, the UK is reported as the worst-performing economy in Europe when it comes to productivity, so it is even more critical to keep downtime to a minimum.
At a large-scale manufacturing plant, for example, a power shutdown or breakdown in the supply of monitoring/control information can have a disastrous effect on productivity which ultimately could impact on a business’ bottom line.
Therefore, industrial processes should be fully protected to keep productivity at its best and to reduce the risks and cost implications of machinery failure.
There are a number of measures that manufacturers can take to ensure continuous power – an uninterruptible power supply (commonly referred to as UPS) being one of them. A UPS device will not only protect against power outages, but also provide instant emergency power should the mains power fail.
The UPS will run for a few vital minutes to allow safe shutdown, ensuring that all data is backed up and that the generator has fired up properly and is providing power. But when you consider that 45% of blackouts typically occur due to voltage disturbances, the UPS is also a vital piece of equipment to correct power problems.
Manufacturing machinery is vulnerable to numerous electrical anomalies – from voltage sags and spikes to harmonic distortion and other interruptions. In this situation, a UPS can really come into its own – not only to protect against power outages, but also to operate as an effective power conditioning unit.
By smoothing out sags, surges and brownouts to provide a clean and stable power supply, the UPS prevents damage to sensitive and expensive equipment.
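To make that conditioning behaviour concrete, here is a minimal sketch – purely illustrative, with hypothetical thresholds rather than any vendor's real specifications – of how a line-interactive UPS might classify each voltage sample on a 230 V supply:

```python
# Illustrative sketch (not vendor firmware): how a line-interactive UPS
# might decide when to condition the supply or switch to battery.
# Thresholds are hypothetical examples for a nominal 230 V supply.

NOMINAL_V = 230.0
TOLERANCE = 0.10          # +/-10% window treated as acceptable mains
BOOST_BUCK_BAND = 0.20    # within +/-20%, voltage regulation can boost/buck

def ups_action(measured_v: float) -> str:
    """Return the action the UPS takes for one voltage sample."""
    deviation = abs(measured_v - NOMINAL_V) / NOMINAL_V
    if deviation <= TOLERANCE:
        return "pass-through"   # clean mains, no intervention needed
    if deviation <= BOOST_BUCK_BAND:
        return "condition"      # smooth out the sag or surge
    return "battery"            # outage or severe disturbance

print(ups_action(230.0))  # pass-through
print(ups_action(195.0))  # condition (a sag/brownout)
print(ups_action(120.0))  # battery
```

In a real unit this decision logic runs continuously in firmware and the thresholds are configurable; the point is simply that the UPS intervenes long before a disturbance reaches the connected machinery.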
In the pharmaceutical industry, for example, when producing a batch of very expensive drugs in glass vials – or, similarly, in semiconductor fabrication – a small dip in the voltage can cause an imperfection in the finished product, making it unusable, and could even result in the whole batch being discarded.
Even in steel or brick production, if there is a micro break in the power that causes the furnace controllers to shut down, the process has to be stopped. The material being processed will be scrapped and the whole process started again, which can take days and be very costly.
The UPS can also be deployed solely as a power conditioner, without batteries – useful in environments above 40°C, the highest temperature at which a battery can safely be kept.
An example of this is ‘cleaning’ power to prevent light flicker in offices next to heavy industry – cranes moving cargo at docks, for instance. In this situation, a UPS can act as a power conditioner on the power supply to the offices, preventing any flickering.
As we enter this exciting period of growth and see greater uptake of digital technologies, it is wise for those working in the industrial sector to take a step back and make sure their processes and equipment are as protected as they can be.
Manufacturers can do this by having a solid power protection solution in place in the form of a UPS device. This will not only give them peace of mind if machinery does fail, but also the added reassurance that instances of downtime will be reduced, paving the way for a stronger manufacturing future.
Guest blog by Leo Craig, general manager of Riello UPS.  For more information, please email or call 0800 269394

Thursday, 14 December 2017

Vertiv Anticipates Advent of Gen 4 Data Centre in Look Ahead to 2018 Trends

The next-generation data centre will exist beyond walls, seamlessly integrating core facilities with a more intelligent, mission-critical edge of network. These Gen 4 data centres are emerging and will become the model for IT networks of the 2020s. The advent of this edge-dependent data centre is one of five 2018 data centre trends identified by a global panel of experts from Vertiv, formerly Emerson Network Power.

“Rising data volumes, fuelled largely by connected devices, has caused businesses to reevaluate their IT infrastructures to meet increasing consumer demands,” said Giordano Albertazzi, president of Vertiv in Europe, Middle East and Africa. “Although there are a number of directions companies can take to support this rise, many IT leaders are opting to move their facilities closer to the end-user – or to the edge. Whatever approach businesses take, speed and consistency of service delivered throughout this phase will become the most attractive offering for consumers.”

Previous Vertiv forecasts identified trends tied to the cloud, integrated systems, infrastructure security and more. Below are five trends expected to impact the data centre ecosystem in 2018:

  1. Emergence of the Gen 4 Data Centre: Whether traditional IT closets or 1,500 square-foot micro-data centres, organisations increasingly are relying on the edge. The Gen 4 data centre holistically and harmoniously integrates edge and core, elevating these new architectures beyond simple distributed networks.

This is happening with innovative architectures delivering near real-time capacity in scalable, economical modules that leverage optimised thermal solutions, high-density power supplies, lithium-ion batteries, and advanced power distribution units. Advanced monitoring and management technologies pull it all together, allowing hundreds or even thousands of distributed IT nodes to operate in concert to reduce latency and up-front costs, increase utilisation rates, remove complexity, and allow organisations to add network-connected IT capacity when and where they need it.

  2. Cloud Providers Go Colo: Cloud adoption is happening so fast that in many cases cloud providers can’t keep up with capacity demands. In reality, some would rather not try. They would prefer to focus on service delivery and other priorities over new data centre builds, and will turn to colocation providers to meet their capacity demands.

With their focus on efficiency and scalability, colos can meet demand quickly while driving costs downward. The proliferation of colocation facilities also allows cloud providers to choose colo partners in locations that match end-user demand, where they can operate as edge facilities. Colos are responding by provisioning portions of their data centres for cloud services or providing entire build-to-suit facilities.

  3. Reconfiguring the Data Centre’s Middle Class: It’s no secret that the greatest areas of growth in the data centre market are in hyperscale facilities – typically cloud or colocation providers – and at the edge of the network. With the growth in colo and cloud resources, traditional data centre operators now have the opportunity to reimagine and reconfigure their facilities and resources that remain critical to local operations.

Organisations with multiple data centres will continue to consolidate their internal IT resources, likely transitioning what they can to the cloud or colos while downsizing and leveraging rapid deployment configurations that can scale quickly. These new facilities will be smaller, but more efficient and secure, with high availability – consistent with the mission-critical nature of the data these organisations seek to protect.

In parts of the world where cloud and colo adoption is slower, hybrid cloud architectures are the expected next step, marrying more secure owned IT resources with a private or public cloud in the interest of lowering costs and managing risk.

  4. High-Density (Finally) Arrives: The data centre community has been predicting a spike in rack power densities for a decade, but those increases have been incremental at best. That’s changing. While densities under 10 kW per rack remain the norm, deployments at 15 kW are not uncommon in hyperscale facilities – and some are inching toward 25 kW.

Why now? The introduction and widespread adoption of hyper-converged computing systems is the chief driver. Colos, of course, put a premium on space in their facilities, and high rack densities can mean higher revenues. And the energy-saving advances in server and chip technologies can only delay the inevitability of high density for so long. There are reasons to believe, however, that a mainstream move toward higher densities may look more like a slow march than a sprint. Significantly higher densities can fundamentally change a data centre’s form factor – from the power infrastructure to the way organisations cool higher density environments. High-density is coming, but likely later in 2018 and beyond.

  5. The World Reacts to the Edge: As more and more businesses shift computing to the edge of their networks, critical evaluation of the facilities housing these edge resources and the security and ownership of the data contained there is needed. This includes the physical and mechanical design, construction and security of edge facilities as well as complicated questions related to data ownership. Governments and regulatory bodies around the world increasingly will be challenged to consider and act on these issues.

Moving data around the world to the cloud or a core facility and back for analysis is too slow and cumbersome, so more and more data clusters and analytical capabilities sit on the edge – an edge that resides in different cities, states or countries than the home business. Who owns that data, and what are they allowed to do with it? Debate is ongoing, but 2018 will see those discussions advance toward action and answers.

About Vertiv:

Vertiv designs, builds and services critical infrastructure that enables vital applications for data centres, communication networks and commercial and industrial facilities. Formerly Emerson Network Power, Vertiv supports today’s growing mobile and cloud computing markets with a portfolio of power, thermal and infrastructure management solutions including the Chloride®, Liebert®, NetSure™ and Trellis™ brands. Sales in fiscal 2016 were $4.4 billion.

Guest blog by Vertiv.  For more information, please visit or contact Hannah Sharland on +44 (0) 2380 649832 or email

Tuesday, 5 December 2017

RBS: Online Banking Partly to Blame for 62 Closures

Royal Bank of Scotland's (RBS) decision to close 62 mostly rural branches in Scotland has been met with plenty of protest from both customers and activist groups. RBS says that online banking is partly to blame for the closures, but at least one citizens' group doesn't believe it, accusing the bank of closing the branches strictly out of greed.

It is always a touchy situation when a large company with an extensive list of brick-and-mortar locations decides to close some of their local outlets. In the RBS case though, the sting of closing 62 branches is much more painful due to the bank's promise – a promise they reiterated many times in years past – that they would not close a branch even if they were the last bank in town.

That promise is at the forefront of action being taken by the Unite union to try to force RBS to maintain the status quo. Unite is hoping Scotland's government will get behind its efforts as well, although it is the UK government, not the Scottish government, that holds a part ownership stake in RBS.

Business Minister Paul Wheelhouse initially responded to the Unite request by reminding those concerned that authority over banking remains the domain of the UK government. There's not much the Scottish government can do other than work with customers and citizen groups to try to convince RBS to change its course of action.

Dwindling Customer Use

For their part, RBS has said that closing the local branches is the result of changes to how people are using bank services. Prior to the internet age, the local bank branch was the lifeblood of both retail and commercial banking transactions. That is no longer the case.

RBS maintains that the number of customers making use of branches in Scotland has dropped by nearly half over the last five years. In announcing the closures, the bank noted that branch use had fallen by 44% over the last five years, while mobile banking has increased 39% in just the last two years.
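
The two statistics are consistent: indexing 2012 branch use at 100, a 44% fall leaves 56% of the original traffic – "nearly half", as RBS puts it. A one-line check:

```python
# Quick check on the figures quoted above: a 44% fall in branch use
# leaves 56% of the 2012 level, i.e. traffic has indeed dropped by
# "nearly half" since 2012.
branch_use_2012 = 100                 # index the 2012 level at 100
fall_pct = 44
branch_use_2017 = branch_use_2012 * (100 - fall_pct) // 100
print(branch_use_2017)                # 56 -- just over half the 2012 level
```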

Should RBS go ahead with its plans, customers will not be left without banking solutions. The bank says that customers would still have access to a community banker or mobile branch. RBS customers will be able to continue accessing bank services online as well.

So the question is this: are the closures really all about money as Unite contends, or is RBS justified in trying to cut its operating expenses by eliminating branches that are now seeing half as much traffic as they were seeing back in 2012? Unfortunately, there is no simple answer.

The internet age is a wonderful age in which to live. However, the expansion of online access is not without its drawbacks. It is not reasonable for us to expect an organisation to make themselves as efficient as possible through online means while, at the same time, continuing to do things in older, less efficient ways to satisfy those unwilling to embrace the new. We cannot move forward without leaving something behind.

Friday, 24 November 2017

Data Breach and Cover-Up Further Eroding Uber Image

Ride-hailing pioneer Uber has recently suffered a serious blow to its reputation after officials in London declined to renew the company's operating licence, following the discovery that software had allegedly been used to identify government regulators and block them from using the service. In short, Uber has been accused by London of cheating the system. Its reputation will fare no better on recent news that tens of millions of customers and drivers have been hacked – and the company has known about it for more than a year.

The BBC and other news outlets report that some 57 million Uber customers and drivers are victims of a data breach that occurred back in 2016. Not only did the company know about the breach at the time, but they also failed to report the fact to regulators as is required by law. Making matters worse is the fact that Uber paid the hackers $100,000 (£75,000) to delete the data they stole.

Both law and common sense would dictate that Uber report the breach when first discovered. They probably also should not have paid the ransom without making at least some attempt to fight the hackers. Why they paid and chose not to tell regulators is anyone's guess.

A Series of Missteps

This latest episode with Uber is just another in a long list of mishaps over the last three or four years. Former chief executive Travis Kalanick deserves much of the blame, as his management style and Lone Ranger mentality have upset customers, employees and investors alike.

Kalanick was at the helm when the data breach occurred. The BBC speculates that he may have prevented chief security officer Joe Sullivan or anyone else from reporting it because the company, at that time, was trying to secure a new round of funding. For his part, Sullivan resigned when news of the data breach broke.

Bigger Issues in Play

The BBC's Dave Lee says the biggest part of the problem is not the data breach itself, but the cover-up allegedly orchestrated by Kalanick. He says that most customers and drivers would eventually have forgiven Uber if they had been up front and forthright about what happened. Now that we know they refused to do so, forgiveness and future trust may be harder to come by.

All the Uber-specific implications aside, there are some bigger issues in play here. Most important is how the hackers managed to steal the information. They did it by gaining access to GitHub, an online portal where software developers publish and share their work. There, the hackers were able to find Uber's login credentials for Amazon Web Services, the cloud computing service Uber uses to host its software – and data.
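
The breach is a reminder of a basic engineering lesson: credentials do not belong in source code. Below is a minimal sketch of the standard alternative – reading secrets from the environment at runtime. The variable names are illustrative, not Uber's or AWS's actual configuration:

```python
import os

# Hypothetical example: keep cloud credentials out of the repository
# entirely by reading them from the environment at runtime. The
# environment variable names here are made up for illustration.
def load_cloud_credentials() -> dict:
    key = os.environ.get("CLOUD_ACCESS_KEY")
    secret = os.environ.get("CLOUD_SECRET_KEY")
    if not key or not secret:
        # Fail fast rather than falling back to anything hard-coded.
        raise RuntimeError("Cloud credentials not set in the environment")
    return {"access_key": key, "secret_key": secret}
```

A dedicated secrets manager or vault service is the more robust production choice, but even environment variables keep keys out of anything pushed to a code-sharing portal.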

GitHub and Amazon Web Services are arguably culpable here too. If either one knew about the hack when it occurred, neither reported it. Moreover, Amazon Web Services accounts for a significant portion of cloud software solutions used across the globe. They have some answering to do as well.

Wednesday, 15 November 2017

Johannesburg Cable Heist: Money or Something Else?

Officials in Johannesburg, South Africa have been left scratching their heads, following a brazen cable heist that resulted in the loss of 2 million rand (£110,000) worth of power cables during a burglary some are calling an inside job. The theft occurred at a brand-new data centre in Braamfontein.

News sources say the data centre is a combination data and recovery centre designed to increase the server space and infrastructure necessary for the city to end its reliance on outside service providers. The city essentially wants to host its own data on city-owned servers powered by city infrastructure.

Those plans took a step back after burglars entered the data centre through an emergency exit on the ground floor – although there were no signs of forced entry. Once inside the building, the thieves broke into a room where contractors had been storing their tools. They used some of those tools to cut the cables that they eventually stole.

Apparently, the cables were attached to new generators that contractors were testing. There was no loss of power, indicating that the generators were turned off prior to the theft. There were no reports detailing whether the generators were damaged or not. Investigators are now left to speculate as to the motive behind the theft.

Several Possibilities

The first assumption is that the thieves stole the cables for money. After all, they are worth more than £100,000. But how would the thieves off-load the stolen cables without being discovered? This is a question that investigators are still trying to answer. However, there is another possible motive...

In an official statement released after the burglary was discovered, Mayor Herman Mashaba indicated that the heist was an inside job given how little damage was done. He maintained that whoever stole the cables knew exactly what they were looking for and where to find them. He believes the theft may have had nothing to do with money.

Mayor Mashaba has suggested that the heist may have been intended to dissuade the city of Johannesburg from continuing to build – or, at the very least, to slow its progress. If the mayor is right, this would point to one of the companies currently providing data centre services to the city: they do not want the city to succeed, because that would mean a loss of contracts for them.

An Impressive Theft

Right now, there is no clear indication as to the motive behind the theft. Whether it was for money or competitive purposes, one thing is certain: the theft was a rather impressive event in terms of what it took to get in, find the tools, cut the cables and run.

The Mayor has made it clear that the theft will not deter his city's efforts to finish the data and operational centre. It is probably a safe bet that the city will beef up security until the centre is up and running, perhaps even beyond that.


Wednesday, 1 November 2017

WhatsApp and Facebook: Non-Compliance with EU?

Are WhatsApp and Facebook guilty of non-compliance with EU law? That is what a special task force wants to know, according to a 26 October (2017) story published by the BBC. That story says that a data protection task force has been established to consider practices related to data sharing between WhatsApp and Facebook.

Facebook purchased the WhatsApp messaging app in 2014 in order to better compete against Microsoft and other rivals. At the time of purchase, company officials pledged to keep the two platforms completely independent from one another. That changed in August 2016, when officials at WhatsApp announced plans to begin sharing user information with Facebook.

Under EU law, any such information sharing can only be conducted with the explicit consent of users. UK Information Commissioner Elizabeth Denham complained at the time that WhatsApp's plan for obtaining user consent was insufficient to comply with the law. Still, WhatsApp and Facebook went ahead with their plans to share friend suggestions and advertising information across the two platforms.

Deficient User Consent

According to the BBC report, the Information Commissioner's new task force has invited officials from both WhatsApp and Facebook to meet with it. There is no word yet on whether they will. However, do not expect the Information Commissioner to go easy on Facebook and its subsidiary. People in positions of power are already unhappy, and that will not change unless WhatsApp and Facebook change what they are doing.

The BBC report cited a letter from the Working Party to WhatsApp officials. That letter apparently pointed out a number of deficiencies in WhatsApp's current user consent practices, including the following:

  • An unclear pop-up notice that does not fully explain that user information will be shared with Facebook;
  • A misleading implication that WhatsApp's privacy policy has been updated to ‘reflect new features’;
  • Requiring users to uncheck a pre-checked box that otherwise gives consent; and
  • A lack of any easy means for users to opt out of data sharing.

Greater Scrutiny of Digital Companies

The complaints against WhatsApp and Facebook come at a time when the EU is subjecting digital companies to greater scrutiny over privacy concerns. Whether WhatsApp and Facebook will face any real penalties for their alleged non-compliance remains to be seen. But the fact that a task force has been established shows that the government believes it has a fairly compelling case.

If the case goes against WhatsApp and Facebook, it could set the stage for other digital companies revamping their privacy policies. That is not necessarily a bad thing. We already know that people are rather careless about protecting their own data online, so it seems to make sense to implement privacy policies that protect users as much as possible, thereby forcing them to make a conscious decision to be less careless.

In the meantime, WhatsApp users should be aware of what the company is doing with their data. They are probably sharing it with Facebook.

Tuesday, 24 October 2017

London Embracing Square Mile Broadband Innovation

London's “Square Mile” city centre is a hotbed of economic activity and cultural development. It is not, however, all that great when it comes to superfast broadband: London ranks 26th out of 33 European capitals for broadband speed, according to a recent report published by City A.M. But city officials intend to change that.

City A.M. reports that the City of London Corporation is on the cusp of launching a brand-new wi-fi network capable of achieving speeds as high as 180 Mbps within the Square Mile. If the plan comes to fruition, it will make London's city centre one of the fastest places in Europe for wi-fi internet access.

In addition, the government will be investing millions of pounds in the Square Mile over the next few years to upgrade fibre optic networks so they are capable of delivering internet at 1 gigabit per second. City leaders have their eyes firmly focused on 5G wireless as well, with the intent of ensuring that mobile data services are the fastest in the world.

By February, City of London Corporation chair Catherine McGuinness says some 7,500 residents in 12 City Corporation housing estates will enjoy upgraded fibre optic. London eventually expects to expand the faster broadband throughout the City's seven boroughs.

Broadband the Future of Communications

So why exactly is the City of London pouring so much money into broadband and mobile communications? In a phrase, it is the future of communications. The UK has long been a technology leader in broadband and data delivery services and city officials want London to be at the forefront in both the short and long terms. City leaders believe it is worth the money to develop broadband and mobile services in the Square Mile.

You could make the case that part of the recent push by the City of London Corporation is a direct result of 2016's Brexit vote, inasmuch as experts are warning of a business exodus from the capital once the UK pulls out of the EU. Whether that exodus actually occurs is of no consequence in this regard: the mere fear of one is enough to spur city leaders to do whatever they can to encourage more businesses to stay. If that means upgraded fibre optic broadband networks and faster wi-fi and mobile services, so be it.

Faster broadband and mobile services in the Square Mile area will certainly benefit local residents and businesses and it will benefit the rest of the UK as well. Over time, what is implemented in the City of London will gradually spread across the entire UK. The only question is whether it will happen fast enough to make us the legitimate leader in Europe.

Irrespective of whether it does or not, London's city leaders believe it is imperative to keep the Square Mile at the cutting edge of communications. They are backing up those beliefs with money; now we will see what that money buys. Hopefully it buys remarkably faster data services very soon.

Tuesday, 17 October 2017

Court Clears Way for New Apple Data Centre in Ireland

The Commercial Court with jurisdiction over County Galway in western Ireland recently dismissed two cases, clearing the way for Apple to take the next steps in developing a group of data centres planned for the county. Apple will spend upwards of €850 million (£762 million) to build the 8-facility campus.

News reports say that two lawsuits were brought against the project after the local planning board gave its permission back in August. Commercial Court justice Paul McDermott rejected the lawsuits on different grounds. Apple may now proceed, though there is still no guarantee that the data centres will be built; other hurdles will have to be cleared.

Local Objections

The first lawsuit to challenge Apple's plan was brought by a local couple whose home is located near the proposed site. They claimed that Apple failed to carry out a proper environmental impact assessment, making the original Board decision invalid. Justice McDermott disagreed.

The second case was brought by another local resident who believed that proper planning procedures were not being followed. The plaintiff claimed not to be opposed to Apple's plans per se; he was just convinced that there were some planning procedure issues. Apple maintained that the plaintiff had made no submissions to the Galway County Council in opposition to the project, nor had he appealed to the local Board. The Commercial Court sided with Apple.

Big Plans by Apple

Since the project was first proposed, Apple has had big plans for Galway. They have maintained all along that building the new data centres will add hundreds of jobs to the local area while also helping to meet the growing demand for data processing and storage in Ireland.

Apple has not detailed exactly what they plan to do with the data centre, but it is not beyond the realms of possibility to assume it could be a very important data processing hub for the British Isles, if not most of Western Europe. Some news reports have speculated that Apple wants to use the new facilities to power everything from the iTunes Store to iMessage throughout Europe.

Irish Minister for community development Seán Kyne greeted the Commercial Court ruling with delight, calling it "very positive news for Galway and the West of Ireland." He and some 4,000 local members of an Apple Facebook page are encouraged by the ruling, especially given that the project has been delayed numerous times over the past two years.

It is understandable that there are objections whenever a data centre of this size is proposed. However, the courts have to be very careful about ruling based on public opinion. The digital world is expanding exponentially with every passing quarter and we are going to need a lot more data centres in the very near future to keep up with demand. Unless the world is ready to go back to the pre-digital era, both consumers and courts have to be willing to allow data centres to be built.

Wednesday, 4 October 2017

New Microsoft Data Centre Powered by Natural Gas

No matter what you think of Microsoft software and licensing, it is hard to argue against the fact that the American company is among a small handful of technology leaders paving the way to a greener future. The latest iteration of Microsoft's efforts in the green arena comes by way of a brand-new data centre – they are calling it a 'lab' instead – powered entirely by natural gas.

Built in Seattle in the United States, Microsoft's Advanced Energy Lab is a new kind of data centre designed around Microsoft's decades-old 'Tent City' concept. What makes the lab unique is that it was built from the ground up to be completely independent of grid infrastructure. Microsoft officials say this is a distinct difference, inasmuch as other efforts to use renewable energy to power data centres have been pursued in parallel with grid energy. Microsoft wanted to be the first to come up with a design that required absolutely no power from the grid.

Natural Gas and Fuel Cells

The Advanced Energy Lab powers its servers with energy derived from natural gas. Servers are hooked directly to a natural gas connection that utilises highly efficient fuel cells for power. The fuel cells convert energy from the gas into electricity for both server power and cooling. The benefits to this design are numerous:

  • Keeping power separate from the grid allows the data centre to continue operating even if the surrounding grid goes down due to natural disaster or infrastructure failure
  • The system is more efficient because it reduces the waste and loss of traditional grid distribution, transmission and conversion
  • The design is a comparatively simple one as well, reducing the likelihood of failure by reducing the number of 'moving parts' in the system
  • Data centres based on this design will cost less to build, operate and maintain across-the-board
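
The efficiency argument in the second point comes down to multiplying the losses of each stage in the delivery chain. The figures below are hypothetical round numbers, not Microsoft's measurements – the point is the shape of the calculation, not the exact values:

```python
# Illustrative only (made-up stage efficiencies, not Microsoft's figures):
# why removing transmission and conversion stages raises end-to-end
# efficiency. Each stage keeps the stated fraction of the energy it passes.
def chain_efficiency(stages):
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

# Grid path: power station, then transmission/distribution, then on-site
# UPS/PSU conversion losses.
grid_path = chain_efficiency([0.45, 0.94, 0.90])
# On-site path: a single fuel cell stage right at the rack.
onsite_path = chain_efficiency([0.50])

print(round(grid_path, 3))    # ~0.381
print(round(onsite_path, 3))  # 0.5
```

Even with a fuel cell no more efficient than a conventional power station, skipping the intermediate stages can leave more of the fuel's energy available at the server.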

Microsoft began working on the lab in earnest after developing a partnership with the National Fuel Cell Research Center in 2013. Their first promising breakthrough came in 2014, when a pilot project proved that fuel cells do not necessarily require clean natural gas to work: biogas, a renewable fuel, works just as effectively.

According to Microsoft, the Advanced Energy Lab encapsulates everything the company has learned thus far about natural gas and fuel cells working in tandem to generate electricity. In the coming months and years, they will be refining the technology with the goal of eventually putting it into service.

Microsoft eventually hopes to put together an energy-independent, green and efficient data centre, capable of meeting our ever-expanding data needs without having any negative impact on the environment. It would appear as though the Advanced Energy Lab is a rather large step in that direction. Where they go from here is anyone's guess, but you can bet whatever Microsoft does will probably break new ground. If nothing else, it will be fascinating to watch…

Wednesday, 27 September 2017

New Undersea Cable Comes at a Critical Time

Hurricane Sandy and the damage she unleashed on the US north-east coast in 2012 was a wake-up call for Microsoft. In the aftermath of the storm, international network communications were disrupted for hours as a result of undersea cables that terminated in New Jersey being damaged. Officials at Microsoft suddenly realised it was not wise to have a single location on the East Coast for incoming data. They set out to change that.

Microsoft has since teamed up with Facebook and Telxius, a Spanish technology company and subsidiary of Telefónica, to lay a brand-new subsea cable linking the US and Spain. They have named the cable Marea. On either side are the now sister cities of Virginia Beach, Virginia in the US and Bilbao in Spain. The two locations were chosen because they are well south of connection points in both countries, increasing the likelihood that a natural disaster will not simultaneously knock out both Marea and other connection points farther north.

According to Microsoft president Brad Smith, the new cable "comes at a critical time." The cable is capable of carrying 160 terabits per second which, according to a Microsoft blog, is 16 million times faster than the average residential internet connection. At a time when the amount of data flowing across the world's networks is increasing exponentially, the new cable offers more than just redundancy. It also increases capacity.
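That "16 million times" figure is easy to sanity-check. A quick sketch, assuming an average residential connection of roughly 10 Mbps (our assumption; the blog does not state the baseline):

```python
# Sanity-check Microsoft's "16 million times faster" comparison.
# Assumption: an average residential connection of ~10 Mbps.
marea_bps = 160e12        # Marea capacity: 160 terabits per second
residential_bps = 10e6    # assumed average home connection: 10 Mbps

ratio = marea_bps / residential_bps
print(f"Marea is {ratio:,.0f} times faster")  # Marea is 16,000,000 times faster
```

With a 10 Mbps baseline, the division lands exactly on Microsoft's headline number.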

"Submarine cables in the Atlantic already carry 55 per cent more data than trans-Pacific routes and 40 per cent more data than between the U.S. and Latin America," Smith said. "There is no question that the demand for data flows across the Atlantic will continue to increase and Marea will provide a critical connection for the United States, Spain and beyond."

More Connected Every Day

It is interesting to note that Microsoft and Facebook are normally internet rivals competing for the largest possible market share. The fact that they teamed up along with Telxius is proof of just how vital a new undersea cable was and is. There is no denying that we are becoming a more connected world with each passing day, and we are going to need ever greater capacity and redundancy to keep up as technology evolves.

Marea is a highly advanced data cable that is remarkably different in a number of ways. Most notably, the technology behind the cable involves a highly flexible design that will allow for better future scalability and modification. Right now, it also represents an entirely new and previously unused route for transatlantic data between Europe and North America. Increased capacity is critical in a day and age in which mobility and the Internet of Things are both growing exponentially.

Marea has come at a critical time. It was obvious following Sandy that something had to be done to increase capacity and redundancy. Had Microsoft and its two partners not acted when they did, someone else probably would have picked up the baton. At any rate, global communications will be the main beneficiary.

Wednesday, 20 September 2017

Sound: The Next Frontier for High Speed Computer Processing?

When most people think about sound in relation to high-speed networking, they think about the quality of sound embedded in their high-definition videos or streaming from their favourite music services. What if we told you that sound could be the next frontier in high-speed computer processing? Well, it looks like that might be the case in the not-too-distant future...

Breakthrough research out of the University of Sydney (Australia) has led to the development of a process capable of converting optical data into storable sound waves. The process makes it very possible that future iterations of network technology could store and send data as light waves to receivers that would then convert those light waves into sound waves that computer chips could interpret as meaningful data.

The idea of using light waves to store and transmit data is nothing new. Scientists have known about the potential for years. The problem has always been converting optical data into a format computer chips could work with, in a way that was both fast and efficient.

University of Sydney researchers described the problem of converting optical data into something more usable as comparable to the difference between thunder and lightning. In other words, the speed of light in air is more than 880,000 times faster than the speed of sound. Even if computer chips could read data in the optical domain, they would not be able to keep up with the speed at which that data travels. Thus, the need to slow it down during the conversion process.
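The thunder-and-lightning comparison checks out against textbook values. A rough sketch (approximate figures; the exact ratio depends on air temperature and pressure):

```python
# Compare the speed of light in air with the speed of sound in air.
speed_of_light = 2.997e8   # metres per second, approx. (light in air)
speed_of_sound = 340.0     # metres per second, approx. (sound in air)

ratio = speed_of_light / speed_of_sound
print(f"Light is roughly {ratio:,.0f} times faster than sound")
```

The result comes out just above the researchers' "more than 880,000 times" figure.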

Phononics Is the Answer

The field of storing and transmitting data in the audio domain is known as phononics. Both private enterprise and public institutions have been researching phononics for quite some time in the hope of developing some sort of technological advantage over the current process of converting optical data into electrons. The Australian researchers may finally have come up with the answer via phononics.

Current technologies that transmit optical data before converting it into electrons readable and storable by computer chips still produce impressive results compared to our capabilities of a decade ago. The process has an inherent weakness though: it generates an enormous amount of waste heat, which limits the practical use of optical-electronic applications. Phononics may have solved the problem.

The process developed by the Australian researchers eliminates waste heat by converting optical data to sound waves. More importantly, computer chips can more quickly read, analyse and use audio data as compared to electron data. Researchers say that the process has proved successful enough to open the door to more powerful signal processing and future quantum computing applications.

Those of us in the data centre industry eagerly await further development from Australia. If it turns out their process possesses both mass-market appeal and financial viability, it will completely transform our understanding and application of high-speed computer processing. There will be huge implications in everything from global financial transactions to scientific research to global media access.

Thursday, 14 September 2017

TPS Violation Costs Company £85,000

It is against the law to call consumers whose phone numbers are registered with the Telephone Preference Service (TPS) without explicit consent from such consumers. Being found in violation could cost tens of thousands of pounds, as one Dartford-based telephone company recently found out.

According to an 11th September release from the Information Commissioner's Office (ICO), True Telecom Ltd made nuisance phone calls to consumers for more than two years despite many of those they called being on the TPS list. More astoundingly, the company continued making the calls even after being warned by the ICO to cease. The calls were placed between April 2015 and April 2017, and during that time, more than 200 complaints were registered with the ICO.

The company attempted to mislead consumers by concealing the number they were calling from and giving people the impression they were calling from BT Openreach. According to the ICO, True Telecom was unable to provide investigators with evidence that the people it called had consented.

The result of True Telecom's actions was an £85,000 fine and an official ICO enforcement notice informing True Telecom that it must stop making illegal phone calls immediately. Continuing the calls could result in court action down the road.

ICO Head of Enforcement Steve Eckersley said in his official statement that the rules pertaining to nuisance calls are clear and that his agency intends to enforce them. He went on to say:

"These calls are at best annoying and at worst downright distressing, and companies who pester the public in this way must understand they won't get away with it. The ICO will take action."

Respecting the Privacy of Consumers

Few would dispute that we live in a time when the information age has given way to the age of big data, or that we need rules and regulations in place to protect the privacy of consumers. The rules surrounding nuisance phone calls certainly apply. They were put in place to prevent companies from pressing consumers with repeated, unwanted phone calls.

It is all well and good that True Telecom has been investigated and fined for its illegal activity. But the ICO's report raises the question of how the company obtained the contact information used to make the calls. If that information was supplied directly by the consumers themselves in the course of doing business with True Telecom, that's one thing. But if the information was sold to them by another organisation, we are talking about an entirely different matter.

Could it be that it's time to start enacting rules that prevent companies from selling personal information? If we are truly trying to protect the privacy and security of consumers, why on earth do we allow personal information to be packaged and sold? It makes no sense. As long as we do nothing about this loophole, consumers will continue to be victimised by companies who care nothing about their privacy and security.

Tuesday, 5 September 2017

Accident Reveals Sensitive Information on Council Website

A consumer innocently browsing the internet accidentally stumbled across sensitive personal information left unsecured on a council website. This immediately raised concerns about how such data could be left out in the open, at the same time reminding organisations that no one is immune to breaches of data security. The revelation has also led to a substantial fine.

In a 31st August news release from the Information Commissioner's Office (ICO), it was revealed that Nottinghamshire County Council made protected data – including personal addresses, postcodes, and care requirements of the elderly and disabled – publicly available on an insecure site. The data was uncovered when a member of the public stumbled across it without the need to use a username and password to access the information.

ICO head of enforcement Steve Eckersley wrote in the news release:

"This was a serious and prolonged breach of the law. For no good reason, the council overlooked the need to put robust measures in place to protect people's personal information, despite having the financial and staffing resources available."

Eckersley went on to state that the actions by those responsible were both ‘unacceptable and inexcusable’ given how sensitive the data is. The data pertained primarily to individuals who received services based on the council's Homecare Allocation System (HCAS) first launched in 2011. The most egregious aspect of the mistake is the fact that the information had been left unprotected for five years by the time it was discovered in June 2016.

Nottinghamshire County Council has been fined £70,000 by the ICO for its carelessness. It is not yet known whether the 3,000 people whose data was left unprotected suffered any negative consequences as a result.

Proof of the Need for Due Diligence

As an organisation involved in the data centre industry, it is apparent to us that Nottinghamshire County Council was extremely careless in its handling of the HCAS data. It also seems rather strange that the mistake went unnoticed for so long, given how much attention the ICO is supposed to pay to matters of this sort. If anything, the story is further proof of the need for due diligence among those that store data, as well as the government agencies tasked with protecting the general public.

Whenever any online system is created for the purposes of collecting, storing and utilising personal data, a tremendous amount of responsibility comes with that data. There is never an excuse to allow any sort of sensitive data to be freely available to the general public without the need for protected access.

The ICO news release says that Nottinghamshire County Council has ‘offered no mitigation’ to this point. Let's hope that this changes sooner rather than later. The public deserves to know how the Council responded to the original revelation and what procedures are now in place to make sure such exposure never occurs again. If we cannot trust those entrusted to protect us, our data security problems are much bigger than we realise.

Friday, 1 September 2017

Houston Data Centres: When Disaster Strikes

As everyone is no doubt aware by now, America's fourth-largest city, Houston, in Texas, was hit with a major Category 4 hurricane late last week. Though the storm was quickly downgraded after making landfall, it has still caused unprecedented damage through relentless rainfall that is likely to top 4 feet in some areas before it's done. US officials are now saying Houston and south-central Texas have never experienced an event like this.

Being in the data centre business, we are curious as to how the four major data centres located in Houston are faring today. There is both good and bad news.

The good news is that all four data centres are still operational and not reporting any damage. This is not unexpected, given that data centre designers build certain protections against natural disasters into their plans. But that brings us to the bad news: streets in Houston are closed, and many of those around the data centres are flooded, even though the power has stayed on at all four facilities thus far.

Running on Emergency Generators

There is no way to know when power will be restored until after the rain finally stops. It could be days, but it is more likely to take weeks. Data centre builders plan for such events by equipping facilities with generators, and all four Houston data centres are prepared to run theirs if necessary. The question is, will they actually have to?

Should they need the generators, the facilities would be depending on fuel deliveries to keep them going. Data centres are second in line for fuel deliveries behind hospitals, police stations and other first-responder facilities, but with the roads flooded, how long before the fuel trucks could actually make it through?

Preparing Customers for Data Recovery

No one could have predicted that Houston would get 4 feet of rainfall from Harvey. In fact, the storm exceeded all expectations normally associated with hurricanes coming off the Gulf of Mexico. Unfortunately, all the preparations in the world cannot account for everything. Knowing what they now know about the devastation, data centre officials are beginning to contact customers in the hope of helping them make data recovery plans should the unthinkable happen.

The lesson in all of this is that nature will do what it wants to do. Data centre designers and operators go to great lengths to protect their facilities against the most severe weather, but sometimes it's just not enough. Hopefully Houston's four data centres will not suffer any interruption of service. Whether they do or not, there will be plenty of lessons to be learned in the aftermath of Hurricane Harvey and its historic flooding.

Wednesday, 16 August 2017

From Superfast to Ultrafast – Speedier Broadband on the Way

On the heels of BT offering to invest in the infrastructure needed to bring high-speed internet to those Britons who do not yet have it, researchers have announced the possibility of current technology becoming obsolete within a short amount of time. We aren't talking high-speed internet any more. We're not even talking superfast. Instead, we are now looking at ultrafast speeds measured in gigabits rather than megabits.

Ultrafast wi-fi has been on the radar for quite some time now. Until recently though, making it happen has remained somewhat of a mystery. That mystery may have been solved by switching from traditional microwaves to terahertz waves. Researchers at Brown University School of Engineering in Providence, Rhode Island (USA) have demonstrated they can "transmit separate data streams on terahertz waves at very high speeds and with very low error rates," according to a report on the Telegraph website.

"This is the first time anybody has characterised a terahertz multiplex system using actual data," the researcher said in an official statement, "and our results show that our approach could be viable in future terahertz wireless networks."

What It Means to You

If you don't know the difference between a microwave and a terahertz wave, you are not alone. Here's what it means to you in simple terms: ultrafast internet access that could be upwards of 100 times faster than the best high-speed service now available. We are looking at speeds of 50 gigabits per second, as opposed to the 500 megabits per second achievable with state-of-the-art microwave technology.

If science is successful in developing terahertz applications, the implications of the new technology would be incredible. First and foremost, terahertz networks would bring to an end the very real danger of outstripping microwave capacity with current high-speed applications.

Secondly, we would be able to develop platforms capable of much higher data densities. Terahertz waves operate at higher frequencies than microwaves and higher frequencies means more data packed into the same stream.

Thirdly, a successful venture into terahertz technology would mean high definition streaming on-the-go for everything from live television to superfast data crunching for banks, businesses and other large consumers of data. That alone would do wonders for worldwide financial markets.

Proving It Works

Proof-of-concept experiments at Brown University involved two HD television broadcasts encoded on two different terahertz frequencies and sent across a wi-fi network together. Researchers obtained error-free results at 10 gigabits per second. Errors at 50 gigabits per second were only slight, and well within the range of standard error-correction systems.

From high-speed to superfast to ultrafast, the speeds at which we can send data through the air will only be going up over the next several years. Imagine a wi-fi connection 100 times faster than you currently use. It is possible through terahertz; at least in principle. Now it is up to scientists to make terahertz technology viable for the mass market. It appears as though they are very much on course.

Tuesday, 8 August 2017

Statement of Intent: New UK Consumer Data Protection Rules to be Enforced

A recently issued statement of intent from the Department for Digital, Culture, Media & Sport aims to change the way those who collect data online use it for the foreseeable future. The statement outlines plans to initiate legislation that will further protect consumers against the misuse of their personal information, with or without their consent.

Among the new protections is greater control over data by individual consumers. As an example, consumers will be able to maximise the 'right to be forgotten' by requesting social media sites erase their information.

All the new data protection rules will be encapsulated in legislation to be known as the Data Protection Bill. The government says the bill will instil confidence in people that they have absolute control over their data. According to the statement of intent, research shows that up to 80% of the public currently lacks that confidence.

Shifting the Responsibility

Once the new legislation becomes law, it will shift the burden of responsibility from the consumer to the organisation that collects data. It will require organisations to obtain explicit consent for processing certain kinds of personal data. At the same time, it will eliminate the default tick boxes organisations currently use to obtain consent. It is the government's contention that such tick boxes are largely ignored.

The legislation will also:

  • Make it easier for consumers to withdraw consent
  • Give consumers the ability to request their personal data be erased
  • Give parents and guardians control over data collected from minors
  • Expand the definition of 'personal data' to include cookie information, IP addresses and DNA information
  • Give consumers more power to force organisations to reveal what kinds of data they have
  • Make it easier for consumers to move data between organisations
  • Update and strengthen existing data protection laws to bring them in line with the state of the digital economy
The government is extremely serious about shifting responsibility from consumers to data collectors. They have created significant penalties for violators, and the Information Commissioner's Office (ICO) will be granted greater authority to enforce the rules. The statement of intent makes it clear that organisations will be held accountable for the data they gather.

"Our measures are designed to support businesses in their use of data, and give consumers the confidence that their data is protected and those who misuse it will be held to account," said Minister of State for Digital Matt Hancock.

Hancock went on to explain that the legislation will give consumers more control over their data while also preparing the UK for Brexit. On that latter point, the legislation brings UK consumer data protection laws in line with the EU's General Data Protection Regulation.

All that remains to be seen now is whether the final Data Protection Bill lives up to the promises of the statement of intent. If it does, the UK will have one of the strongest consumer data protection laws in the world.

Thursday, 3 August 2017

BT Counters Universal Service Obligation Mandate with Proactive Offer

If you had to choose between the government's proposed universal service obligation mandate and BT proactively building the necessary infrastructure to offer universal broadband, which would you choose? It is an interesting question that MPs and Ofcom now have to wrestle with. Thanks to a new offer by BT, the government may not have to implement the universal service obligation after all.

BT has proposed investing some £600 million in new infrastructure that will provide high-speed broadband to up to 99% of UK households by 2020. Universal access would be available by 2022.

BT defines high-speed broadband as a connection that gets at least 10Mbps. They say approval of their plan would mean most UK households getting broadband through fibre and fixed wireless technologies. Those that could not be reached through traditional infrastructure could be offered satellite broadband.

As an alternative, the government has already proposed the universal service obligation. Should they decide to implement it, every UK household without broadband would be able to request it beginning in 2020. BT would have no choice but to install the necessary infrastructure to meet that request.

Why Make the Offer?

It is a fascinating exercise to try to figure out why BT would make such an expensive offer. Agreeing to spend £600 million is an awfully pricey proposition when you consider that BT already provides high-speed broadband to 95% of UK households. To understand BT's move, it's important to understand how they will come up with the money.

Under the universal service obligation mandate, new broadband customers who get service after making a request could not be charged more than customers already receiving the same service. That would mean BT having to swallow the cost of building infrastructure on a case-by-case basis. Their proposal pays for the infrastructure in a different way.

According to the BBC, all of BT's current customers would pay for the construction through increased monthly bills. Those who eventually get access would then be paying the higher bills as well. From a business standpoint, it just makes better sense for BT to voluntarily create universal broadband access rather than being forced into it.

A Few Concerns

BT's proposal has thus far met with mostly positive reactions. However, there are a couple of concerns. First is the possibility that 10Mbps service would be obsolete before the company finished building the infrastructure. That would mean all of those newly connected households would once again be behind in a game of perpetual catch-up.

The second worry is obviously the higher cost to broadband customers. While the first concern can be addressed through well-planned technology upgrades, there's nothing anyone can do about the second. Someone has to pay to build and maintain the infrastructure. If customers don't pay for it directly, they will have to pay for it through tax contributions to cover government subsidies.

We shall see what the government decides to do with BT's proposal. Either way, universal broadband – even in the hardest to reach places – is the ultimate goal.

Tuesday, 6 June 2017

Budapest Convention to Change Digital Evidence Sharing Rules

When crimes are committed in Europe, police investigators are sometimes limited in the kinds of digital evidence they can collect and use for prosecutorial purposes. Despite the Budapest Convention on Cybercrime having been open for signature for the last 16 years, a lack of clear rules on how digital evidence can be used continues to be a problem for European police officials. Now the Convention aims to change that.

News reports say that the Convention is getting ready to sign a new deal that will make it a lot easier for police officials to collect, use and share digital evidence with other participating countries, even if that evidence does not reside on a server located within the borders of the investigating country.

Why the Changes Are Necessary

Being involved in the data centre sector, we are painfully aware of national laws that require operators in certain countries to make sure data belonging to domestic customers is stored only on domestic servers. We are constantly reminded about national laws requiring the security of that data. It is just part of the game.

Under the current rules, police officials have to be concerned about how digital evidence is shared across European borders. There are times when a police agency could freely access digital data in another country but fail to do so out of fears that such evidence would not be admissible in court. There are other times when accessing cross-border data is actually against the law.

In order to get around the rules, police agencies in member countries take advantage of what are known as Mutual Legal Assistance Treaties. However, going through the treaty process is painfully slow. It is so slow, in fact, that cases can fall apart while police agencies are waiting for approval to get the necessary evidence.

What the New Rules Do

If new rules are agreed upon without any changes to the current proposals, they will allow police agencies to speed up investigations through faster access to digital data. The rules cover everything from mobile phone use to e-mail to websites and social media. Essentially, any kind of data that can be transmitted online will be subject to better and faster collection by police agencies.

The rules will also put in place policies for reacting to emergency situations. The Budapest Convention is looking to the US for guidance here. That country already has emergency policies in place, policies that enabled France to quickly get information they needed during the Charlie Hebdo attack a couple of years ago.

Based on the known trouble that police agencies go through to collect and use digital evidence, it is quite obvious that some rule changes are needed. There is a danger though. As America's NSA has proven, not carefully thinking through the rules to account for the possibility of digital information being used improperly can lead to all sorts of unintentional spying. The Budapest Convention does need to act, but they need to do so carefully and circumspectly.

Wednesday, 31 May 2017

Microsoft Looking at DNA Data Storage

How would you feel about donating some of your DNA to eventually be utilised as a personal storage space for all your digital data? The idea may seem a bit far-fetched, but Microsoft recently revealed that they are working on a system that, in theory, could make exactly what has just been described pretty routine at some point in the near future.

Microsoft has revealed a research project aimed at using strands of DNA for large-scale data storage. According to a report published by MIT Technology Review, the US-based software company expects to have a workable DNA data storage system in place by the end of this decade.

The system involves using individual strands of nucleic acids to store data as nucleic acid sequences in much the same way a magnetic strip stores data as sequences of positive and negative charges. The benefit of the DNA model is primarily one of capacity. As an example, a Harvard geneticist investigating the possibility of DNA data storage a number of years ago converted and stored his book on the subject using 55,000 DNA strands.

According to reports, a single gram of DNA is capable of holding 215 PB of data. For the record, a petabyte is 1 million GB. That is a tremendous volume of data stored on something incredibly small. At the rate data is exploding these days, we are going to need something that impressive just to keep up with it all.

Overcoming Current Limits

Using nucleic acids to store digital data is very promising, in that proof of concept has already been established. But like any new technology, it is currently too expensive to go mainstream. There are some inherent limits to DNA data storage that must be overcome before you and I will be donating our own DNA to the cause.

Right now, the biggest challenge seems to be speed. Sending data to the storage system has been as slow as 400 bytes per second. In order to come up with a workable solution that the retail market could embrace, researchers have to reach at least 100 MB per second. And as every year ticks by, that target will rise alongside other technologies.
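To put those numbers in perspective, a quick back-of-the-envelope calculation using the figures above:

```python
# How long does it take to write one gigabyte of data to DNA storage?
GB = 1e9                 # bytes in a gigabyte (decimal)

current_rate = 400       # bytes per second (reported early write speed)
target_rate = 100e6      # bytes per second (the 100 MB/s target)

days_now = GB / current_rate / 86_400      # 86,400 seconds per day
seconds_target = GB / target_rate

print(f"At 400 B/s:  ~{days_now:.0f} days per gigabyte")          # ~29 days
print(f"At 100 MB/s: {seconds_target:.0f} seconds per gigabyte")  # 10 seconds
```

Roughly a month per gigabyte today versus ten seconds at the target rate, which is why speed, not capacity, is the blocker.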

The other big challenge is the price of materials. Researchers currently invest roughly £620,000 in the materials needed to build a DNA data storage and retrieval system. That cost is way too much to make for a feasible mass market product. The price will have to be reduced to several hundred dollars, at the most, if the idea is to ever be marketable.

Human DNA has been storing critical data since the dawn of man. How ironic it would be if we could take something as fundamental to our existence as DNA and use it to store and retrieve digital information that is becoming equally critical to our everyday lives. Microsoft hopes to make it happen within the next few years.