Tuesday, 22 December 2015

EU Officials Agree on New Data Protection Rules

In a move likely to have a decisive impact on how consumer data is stored and used, EU officials have finally agreed on a new set of data protection rules that will apply to the entire 28-member European Union. The new regulations are designed to replace the patchwork of individual national rules that currently varies from one country to the next. Although the rules are not yet official, they are expected to go through the necessary channels in the European Parliament and member countries sometime this week.

News reports say the new rules will force companies to pay very close attention to how personal data is used. Any company found to be misusing personal data could be fined as much as 4% of global revenue upon conviction. The rules apply to any company with European headquarters.
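
For a sense of scale, here is that 4% cap applied to a hypothetical multinational; the revenue figure below is invented purely for illustration.

```python
# Hypothetical figures, purely for scale.
global_revenue_eur = 10_000_000_000                   # EUR 10bn global revenue
max_fine = 0.04 * global_revenue_eur                  # 4% cap reported for the new rules
print(f"maximum possible fine: EUR {max_fine:,.0f}")  # EUR 400,000,000
```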

It is unclear how the rules define misuse of data, and that is a concern for many. The new law could potentially affect everything from the local data centre to the myriad of hosting companies offering services throughout Europe. Even companies offering managed IT services could find themselves in trouble by engaging in practices that may be marginal under the new rules.

Another key component of the rules is a provision that forces companies to report any and all data breaches, regardless of severity. Again, this will apply as much to the local data centre as it does to the large corporate IT department.

Last but by no means least, the rules codify the right to be forgotten across the entire European Union. Once the rules are official, companies will have to get explicit consent from customers to use their data for any purpose other than conducting business between them. They will also have to appoint a data protection officer to make sure that all data protection rules are being adhered to.

Businesses Will Be Affected

It is not possible to enact rules of this nature without impacting business. In this case, some businesses will be more negatively affected than others. Small companies will face the worst of it, having to stretch budgets even further in order to hire data protection officers and develop policies and procedures for keeping data secure. Larger companies will face less of an impact on the implementation of policies and procedures, but they could be more heavily damaged by fines in the event of violations.

The good news for European consumers is that the new rules, if enforced properly, will guarantee greater data privacy in the long run. It may even slow down the race to find out who can use Big Data to the biggest advantage at the consumer level. It will, however, have no effect on cyber criminals who are intent on stealing data regardless of any rules put in place.

It would appear as though the EU is on the verge of enacting significant changes in consumer data protection. Now let's see if the rest of the world follows.

Wednesday, 16 December 2015

Google Announces Quantum Breakthrough… Is It Legit Though?

If you are a person that follows all things Google, you are probably aware of the search engine giant's recent announcement that it has successfully tested a D-Wave 2X quantum computing system the company acquired through a joint purchase with NASA several years back. The new supercomputer is ostensibly capable of solving problems as much as 100 million times faster than current, single core technology.

Before you get overly excited about the potential of a computer being able to complete tasks faster than you can think, there are a couple of things to consider. Firstly, the company behind the D-Wave 2X has been roundly criticised within the industry for overstating the capabilities of its technology. Secondly, some of the tests Google used to achieve its astounding results were theoretical tests only. That said, Google is very pleased with what it has accomplished thus far.

Google officials say that they tested a quantum annealing algorithm that performs more than 10^8 times faster than simulated annealing on a standard, single core CPU. Their tests focused on solving problems involving approximately 1000 binary variables, according to Google director of engineering Hartmut Neven. He said in an official statement that quantum annealing “is more than 10^8 times faster than simulated annealing running on a single core.”

Quantum Versus Simulated Annealing

Annealing is a way of solving a complex optimisation problem by exploring a series of candidate solutions in search of the best and most efficient one, however many variables are involved. Simulated annealing does this probabilistically on a classical computer: it examines one candidate at a time and occasionally accepts a worse one so the search can escape local optima. Quantum annealing attacks the same kind of problem but exploits quantum effects, such as tunnelling, to move through the solution landscape in ways a classical search cannot, which is where the claimed speed advantage comes from.
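
For readers unfamiliar with the classical half of that comparison, below is a minimal sketch of simulated annealing on a toy binary problem. It is illustrative only; Google's benchmarks and the D-Wave hardware are far more sophisticated, and the cost function and parameters here are invented.

```python
import math
import random

def simulated_annealing(cost, n_vars, steps=20000, t_start=2.0, t_end=0.01):
    """Minimise cost(state) over binary variables; returns (best state, best cost)."""
    state = [random.randint(0, 1) for _ in range(n_vars)]
    current = cost(state)
    best, best_cost = state[:], current
    for step in range(steps):
        # Geometric cooling: temperature falls from t_start towards t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(n_vars)   # propose flipping one randomly chosen bit
        state[i] ^= 1
        candidate = cost(state)
        # Always accept improvements; accept worse moves with probability
        # exp(-delta/t), which lets the search escape local minima early on.
        if candidate <= current or random.random() < math.exp((current - candidate) / t):
            current = candidate
            if current < best_cost:
                best, best_cost = state[:], current
        else:
            state[i] ^= 1              # rejected: undo the flip
    return best, best_cost

# Toy objective: make adjacent bits disagree (a 1-D frustration problem).
cost = lambda s: sum(1 for a, b in zip(s, s[1:]) if a == b)
print(simulated_annealing(cost, n_vars=100))
```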

Not only did the Google tests indicate that the D-Wave 2X can solve the benchmark problems some 10^8 times faster than a single core processor utilising simulated annealing, they also suggested a similar advantage, of more than 100 million times, over quantum Monte Carlo simulations run on conventional hardware.

Practical Applications of the Technology

Now that Google has let the cat out of the bag, so to speak, the next obvious question is what practical applications this technology could actually serve. Nothing comes to mind immediately. For example, do we rely on quantum computation for everyday tasks such as cloud computing and virtualisation? No, we do not. Nor does there seem to be a practical way to implement the D-Wave 2X to improve data communications over long-range networks. Even if computers could solve problems that quickly, our current network infrastructure could not support those kinds of speeds.

There is no doubt that Google and NASA will continue looking at ways to put the D-Wave 2X to good use. In the meantime, a group of lab junkies will certainly be having a good time testing the capabilities of quantum annealing within the supercomputer environment. They will eventually figure out how to use it practically.

Wednesday, 9 December 2015

Mobile Industry and Internet of Things Set to Boost European GDP

The small and large companies that make Europe work already know how important mobile technology is to business success. We imagine they are very pleased with the latest GSMA (GSM Association) report that paints an even brighter future for mobile commerce. According to the report, both 4G technology and the Internet of Things (IoT) will be increasing their contributions to European GDP through to 2020.

The GSMA says the mobile industry will be contributing 20% more to European economies four years from now as more and more subscribers are switched to 4G platforms. In terms of raw numbers, the amount of money the mobile industry contributes to GDP will rise from €500 billion to €600 billion, according to the GSMA report. Furthermore, 4G platforms will account for as much as 60% of the total mobile communications traffic over the next 4 to 5 years. The GSMA expects 95% of mobile users to have access to 4G coverage by the end of the decade.

Europe is already the world leader in mobile data communications and IT services for mobile subscribers. Users in Europe enjoy faster download speeds and greater access to data and mobile applications than users in any other part of the world – including North America. There is no reason to believe that this will change in light of the investments the mobile industry continues to make.

The IoT Connection

The future of mobile communications is clear from the GSMA report. But how does the IoT tie into all of this? By networking together mobile devices and just about anything else that is connected to the internet. The GSMA report cites connected cars as just one example.

A number of mobile solution providers have adopted the GSMA Embedded SIM Specification for M2M, a technology that can be easily modified and adapted for use with smart meters and connected cars in a vehicle-sharing platform. By connecting smartphones and cars, users would be offered easy and instant access to transportation in city environments along with the ability to do everything that needs to be done to secure a car via their smartphones.

Some of that capability already exists on a somewhat limited scale. However, it is a piecemeal situation utilising a combination of mobile data communications and GPS monitoring. Connecting cars and smartphones through a single 4G platform would make the process more efficient and quite a bit faster.

Better Mobile Technology, Improved GDP

If there is any doubt that mobile technology is the economic engine of the future, the recent GSMA report should put any such questions to bed. We now live in a day and age in which mobility reigns supreme in everything from commerce to education to self-improvement. And as mobile data communications improve, the IoT will be part of that.

From IT services to personal networking, mobility is advancing at breakneck speed. If the GSMA is right, mobility will have a substantial economic impact across Europe for years to come. It will be exciting to watch it unfold.


Tuesday, 1 December 2015

Internet Access from a Light Bulb: Li-Fi Is Here

Imagine working in an office where turning on the lights also meant turning on internet access; an office where you could go online and do what you do up to 100 times faster than you currently do with wi-fi. It may sound like the stuff of futuristic films, but it is now reality. The world of li-fi has arrived and should be ready for consumers within the next couple of years.

Li-Fi is a technology that transmits computer data using the visible light spectrum rather than radio waves. It was first demonstrated at a TED talk given by Professor Harald Haas of Edinburgh University in 2011. Since that early demonstration showing an LED light transmitting a video, the technology has undergone further development that now makes it incredibly fast and comparatively reliable.

The latest li-fi technology was recently tested by an Estonian start-up known as Velmenni. The company used a light bulb fitted with li-fi to transmit data at speeds as fast as 1Gbps (one gigabit per second). According to Velmenni, speeds of up to 224Gbps are theoretically possible based on its laboratory testing. Velmenni chief executive Deepak Solanki was quoted by the BBC as saying he hopes the technology will be ready for consumers “within three or four years”.

Making Light-based Technology Better

The idea of transmitting data via light is nothing new in principle. Since the earliest days of infra-red remote controls, we have been using light beams as a means of sending information from one point to the next. Indeed, the whole idea of optical fibre data communications is based on the fact that modulated light can carry far more data, far more reliably, than electrical impulses travelling along copper. Some of our fastest internet connections today are based on this understanding.

In terms of wireless communications, the logic is similar. It is not that radio waves are slow – they travel at the speed of light – but the slice of radio spectrum available for wi-fi is narrow and increasingly congested. The developers of li-fi know this and are taking advantage of it: the spectrum of visible light scientists have to work with is 10,000 times greater than the radio spectrum we currently use for wi-fi. This means that, when li-fi is finally ready for commercial and individual use, it will be a long time before we run out of usable space in the light spectrum.
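
As a toy illustration of the underlying idea, and only the idea (real li-fi systems use far faster and more sophisticated modulation than this), here is simple on-off keying in Python, where the LED's state in each time slot carries one bit:

```python
def ook_modulate(data: bytes) -> list[int]:
    """On-off keying: one light sample per bit (1 = LED on, 0 = LED off)."""
    return [(byte >> bit) & 1 for byte in data for bit in range(7, -1, -1)]

def ook_demodulate(samples: list[int]) -> bytes:
    """Receiver side: regroup the photodiode's samples into bytes."""
    out = bytearray()
    for i in range(0, len(samples), 8):
        byte = 0
        for s in samples[i:i + 8]:
            byte = (byte << 1) | s
        out.append(byte)
    return bytes(out)

signal = ook_modulate(b"li-fi")      # the pattern the bulb would flash
assert ook_demodulate(signal) == b"li-fi"
```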

There are drawbacks to the technology that Velmenni and others acknowledge. First and foremost, it cannot realistically be used outside because natural sunlight interferes with data transfer. Second, light does not pass through walls or floors. Therefore, it is only feasible in isolated spaces. But those two issues notwithstanding, it could replace traditional wi-fi in office environments, restaurants and cafés, and other public spaces where wi-fi is currently used for public internet access.

The age of li-fi is here. It is only a matter of time before transmitting data with a receiver and a few LED light bulbs will be the norm.

Tuesday, 24 November 2015

Amazon Web Services Building New Data Centre Wind Farm

In a bid to eventually run its entire global infrastructure using only renewable energy sources, Amazon Web Services (AWS) has just announced the latest piece of exciting data centre news. According to AWS, they have contracted with EDP Renewables to build a 100 MW wind farm in the United States to be called the Amazon Wind Farm US Central. EDP Renewables will also operate the farm as a contractor for AWS.

Current plans call for the project to be completed sometime in 2017. All of the energy generated by the wind farm will go to the grid to supply energy for Amazon Web Services data centres. Officials expect the farm to produce up to 320,000 MWh annually once up and running. The wind farm will be the centrepiece of current and future cloud data centres in the US AWS network.
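
Taken together, the 100 MW nameplate figure and the 320,000 MWh output imply a capacity factor in the mid-30s per cent range, which is typical for onshore wind. A quick check (treating the output figure as annual):

```python
nameplate_mw = 100
annual_output_mwh = 320_000
max_possible_mwh = nameplate_mw * 24 * 365        # 876,000 MWh if running flat out
capacity_factor = annual_output_mwh / max_possible_mwh
print(f"capacity factor: {capacity_factor:.1%}")  # ~36.5%
```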

Slow and Steady Progress

When AWS announced its ambitious renewable energy plans years ago, there was no telling how long it would take the company to finally reach the point of operating on 100% renewable energy. The company still does not have any solid dates in mind. However, they did release data earlier this year showing that they have already reached the 25% threshold. They expect at least 40% of their energy consumption to be based on renewables by the end of 2016. That number will jump considerably once the new wind farm project is online.

AWS vice president Jerry Hunter said that his company continues “to pursue projects that help to develop more renewable energy sources to the grids that power AWS data centres, and bring us closer to achieving our long-term goal of powering our global infrastructure with 100 per cent renewable energy.” He went on to explain how the new US wind farm project will help drive them closer to their goals.

New Construction in the UK

In addition to the wind farm project, AWS has also said they plan to build at least one new UK-based data centre to service those customers who are required to keep information stored domestically. Currently, AWS customers in the UK are served by cloud data centres in Ireland and Germany. The fact that multiple data centres are being used to serve customers suggests the company will be constructing more than one facility here. All eyes are on AWS to see whether the new facilities will be up and running in 2016 or 2017.

We are also interested to see how AWS will apply renewable energy strategies to its new UK data centres. Given the company's goal of eventually being 100% renewable, it would not make sense to construct new data centres here without also finding a way to power them by alternative means. Whether that means wind, solar, water, or a combination of all three remains to be seen. Whatever the finished product, it is quite likely that AWS will be setting standards for future data centre development.

Tuesday, 10 November 2015

Victims Learning Even Hackers Make Coding Mistakes

We have come to think of hackers as being coding geniuses who never get anything wrong. Yet to the dismay of some victims, we are learning that even the best hackers make coding mistakes. A case in point is a recently discovered variant of Power Worm.

Power Worm is a piece of malware of the type known as ransomware. Those responsible for creating ransomware have developed a model of hijacking websites and databases and then holding them hostage, electronically speaking, by using encryption to lock owners out. Only after owners pay a specified ransom is the data decrypted. In the case of Power Worm, however, decryption is easier said than done.

Experts say that a coding error in a new variant results in an encryption key being discarded once data has been encrypted. It matters not whether the victim is a single customer of a very large data centre or a corporate entity with its own cloud computing environment. Once the worm is planted and activated, any data within its path can be locked down with encryption. Security experts are warning people not to pay the ransom if hit by Power Worm or one of its variants.
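
Power Worm's actual code is not reproduced here, but the failure mode is easy to picture with any symmetric cipher: if the randomly generated key is discarded after encryption, the ciphertext becomes unrecoverable even to the attacker. A minimal sketch of the idea using the third-party Python cryptography library:

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()             # random symmetric key
ciphertext = Fernet(key).encrypt(b"victim's spreadsheet")
del key   # the Power Worm-style bug: the key is never stored or sent anywhere

# With the key gone, nobody - attacker included - can decrypt the data;
# brute-forcing a key of this size is computationally infeasible.
```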

Apparently, not every variant of the malware has this coding issue, but there is no way to know which one you have been hit with once you've been victimised. Experts say that if the ransom is paid but the encryption key needed to get the data back has been disposed of, victims will have lost both their data and their money. It is far better to report being victimised by Power Worm to the authorities than to pay the ransom and hope for the best.

For the record, Power Worm and its variants primarily target Microsoft Word and Excel documents. But security experts are seeing new versions of the malware targeting larger data sets associated with other software applications.

Ransomware Big Business

In a world of expensive IT services and costly security initiatives, it may seem reasonable to pay one Bitcoin (approximately £250) to get back ransomed data and get on with the business of the day. But experts say it is exactly that thinking that is fuelling this segment of the cybercrime community. Ransomware is big business in which hackers are making money in volume. One Bitcoin here and another there quickly adds up to a lot of money.

According to a BBC report, the perpetrators of the well-known Crypto Wall ransomware and its variants have already racked up more than £215 million as a result of their activities. They are doing so one Bitcoin at a time. When businesses and other data owners acquiesce to the demands of ransomware creators, they are simply making the market for this kind of software more lucrative. And as with any other crime, a lucrative market will merely attract more players over time.

Hackers do make coding mistakes, as lots of people are now learning. Unfortunately, the Power Worm coding error means data is potentially lost forever even if the ransom is paid.

Wednesday, 4 November 2015

Google Ready to Move Internet Access to the Clouds

In the latest twist on cloud computing news, Google stands ready to take high-speed internet access to the next level with a full-scale launch of Project Loon next year.  If the project meets expectations, Google may have an airborne flotilla of high altitude balloons providing internet access to areas of the globe underserved by standard wire connections.  Not only that, the balloons may just provide faster and more reliable access than we get here in the UK.

According to Google, the Loon Project has successfully completed test runs of the high-tech balloons even while upgrading and developing the technology over the last several years.  The new system involves high atmosphere balloons equipped with enough technology to be incredibly useful.  Each balloon, which experts say can remain aloft for as long as 187 days, is equipped with:

·        an altitude control system
·        flight computer and GPS tracking system
·        two radio transceivers and a third backup
·        solar power system to keep it all running

Engineers designed the altitude control system to raise or lower a balloon in-flight to take advantage of wind systems.  Google uses the stratospheric winds to determine course and direction.  The flight computer and GPS tracking system ensure the balloon goes where it is supposed to go.
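
Google has not published its control logic, but steering by altitude alone can be sketched as a simple selection problem: given forecast wind bearings at each layer, rise or sink to the layer whose wind most closely matches the bearing you want to travel. The layer data and function names below are hypothetical:

```python
# Hypothetical forecast: altitude (km) -> wind bearing (degrees) at that layer.
wind_layers = {18: 250.0, 19: 310.0, 20: 75.0, 21: 90.0}

def best_altitude(desired_bearing: float) -> int:
    """Pick the layer whose wind direction is closest to the bearing we want;
    the balloon then rises or sinks to ride that wind."""
    def angular_error(bearing: float) -> float:
        return abs((bearing - desired_bearing + 180) % 360 - 180)
    return min(wind_layers, key=lambda alt: angular_error(wind_layers[alt]))

print(best_altitude(80.0))   # -> 20: the 75-degree wind layer is the best match
```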

Connecting the World – At Least Part of It

It would seem that Google has the edge on Facebook in the race to determine who will be the primary provider of data communications and internet access in the Third World.  The only question that remains is whether or not the internet giant can keep enough balloons in the air, on a continuous basis, to ensure subscribers never lose connectivity.  Google claims it has the speed and technology to do just that.

The systems used to launch Google's 30 test balloons from New Zealand required as many as 14 people and 60 minutes to launch a single unit. Now engineers say they can get a balloon up in 15 minutes with just two people. Combined with a flight life now more than 18 times the project's original 7 to 10 days, these gains make Google confident it can meet the needs of customers.

Their system already has commercial acceptance, as evidenced by a number of contracts Google has already managed to secure. Sri Lanka will be utilising the system beginning next year (2016), as will three of the mobile networks operating in Indonesia. We expect Google to announce additional contracts as soon as the ink is on paper. They are going to want the world to know that they are the first to bring high-speed internet access to parts of the world that have previously been cut off.

Although Google's Project Loon is not technically cloud computing in the strictest interpretation of the term, their plans are exciting nonetheless.  Who would have thought a decade ago that internet access would be brought to some of the most remote parts of the world without high-cost construction?  Nicely done, Google!

Thursday, 29 October 2015

Ongoing Company Hacks Have People Worried and Asking 'Why?'

The latest DDoS attack launched against Talk Talk made all of the media rounds over the weekend. At first, the attack was portrayed as being far more serious than it really was, leaving millions of Talk Talk customers wondering whether or not their identities would be stolen and their finances ruined. But this latest attack really is not news within the cyber security community. Thousands of attempted attacks happen every single day all across the globe. And, with every new attack, consumers are left worrying and asking why it continues to happen.

Unfortunately, the answer to the question is very uncomfortable. It comes down to two things: cyber-attacks are profitable and we do not have the will to truly stop the problem. What’s more, until things change, no amount of sophisticated hardware and software will prevent cyber criminals from attempting to do what they do.

The Laws of Nature Make It Easy

It is a well-understood law of nature that the more complex a system is, the easier it is to break that system. This is certainly true in the area of computer networks and cyber security. As technology has raced to make data communications faster and more reliable, we have introduced complex hardware and software systems that require a tremendous amount of effort just to maintain, let alone protect. We have, by our own doing, introduced systems that are so complex that it is almost impossible to recognise every potential security flaw before releasing a new product to the market.

The result of doing things this way is that we design and build network systems with a plethora of security weaknesses that go unidentified until hackers breach the data centre. Furthermore, every closed security breach opens the door to a new one within a very short amount of time. It is a no-win situation if we rely solely on software and hardware to protect the world's data communications from cybercrime.

Human Nature Makes It Possible

More uncomfortable is the universal reality of human nature. No matter how hard we try, there will always be those members of society who either delight in taking down networks with denial of service attacks or live to steal from other people by illegally acquiring their private information. No amount of software or hardware will change human nature. The only thing that will is swift and sure justice.

Successfully battling cybercrime on a global front will require the nations of the world to come together in an effort of information sharing, investigation, arrest and prosecution. And there must be no mercy. Unless the punishment for cybercrime demonstrably outweighs its perceived rewards, law enforcement efforts will be in vain.

Cybercrime and attacks like the one experienced by Talk Talk over the weekend will continue in perpetuity; our best defence, therefore, is to go on the offensive, both by developing new strategies and by punishing wrongdoers.


Tuesday, 20 October 2015

Doubts Remain in the US over NSA Data Centre in Utah

Drive by the National Guard base in Bluffdale, Utah - just south of Salt Lake City - and you will notice a group of nondescript buildings that seem anything but out of place at a US military installation. However, among those buildings is a $1.7 billion data centre operated by the National Security Agency (NSA). Its location, combined with revelations about NSA spying back in 2013, has plenty of Americans suspicious about what goes on at Bluffdale. Doubts remain despite assurances by government officials that the facility is not being used to spy on citizens.

In an effort to present a cohesive message and unified communications, the government recently sponsored a national security conference at the University of Utah campus in Salt Lake City. The conference included NSA Utah director Dave Winberg and Utah Congressman Rep. Chris Stewart. Both attempted to assuage American fears by explaining that the data centre was not used for any domestic spying.

Stewart claims that the data centre only provides support services for NSA activities relating to foreign cyber security threats. He told conference attendees that the centre was used to provide development services to several other NSA operations as well as language translation, description, analysis and reporting. He did not explicitly rule out every domestic use of the data centre’s capabilities, saying merely that domestic spying was ‘not the purpose’ of the facility.

The purposely evasive language used by both Stewart and Winberg was allegedly necessary because of the secure nature of the facility's mission, infrastructure and actual work. But such evasiveness only leads to further speculation among US citizens who still believe, by and large, that the NSA is spying on them. The conference did little to reassure the American public about the nature of the data centre.

Nothing to See Here

As far as data centre news is concerned, the NSA would likely prefer that the world take a 'nothing to see here' approach to operations in Bluffdale. They certainly don't want their own citizens continuing to be suspicious. But how can they not be, given the information made public by whistle-blower Edward Snowden? Mr Snowden confirmed the fears of millions of Americans when he revealed just how much data is being collected by the US government.

Interestingly enough, Rep. Stewart blasted Snowden, calling him one of the "most destructive traitors America has ever seen." He went on to say that those who support Snowden do so only because they do not understand the damage he did to the country. However, that may be only half of the equation. Snowden's supporters also don't understand why the US government is spying on its citizens despite a constitutional obligation not to do so. And until the US government comes clean, their citizens will continue to be suspicious. As for the NSA, it has no plans to abandon the facility in light of its mission to protect the US government and its people.

Source:  News Factor  http://www.newsfactor.com/story.xhtml?story_id=10200CF6ZSG0

Thursday, 15 October 2015

Calls for a Government “Not-Spot” Debate

Over the last five years, the UK government has been working hard to ensure that its goal of ninety-five per cent of households being connected to superfast broadband by 2017 is met. The whole effort has been contracted to just one company – BT.

However, on October 12th (2015) the House of Commons debated the matter of inadequate broadband coverage that still blights swathes of the country. The issue of the success (or not, as the case may be!) of the government roll-out was also up for debate. As of the time of this writing, the outcome of the session is as yet unknown. Nevertheless, a feisty affair was expected.

What is a “Not-Spot”?

Before we continue, it may be prudent to explain what a “not-spot” is. According to the Macmillan Dictionary, a not-spot is “an area that has no broadband internet or 3G mobile phone coverage, or where this is very slow and unreliable.”

The issue for many of the MPs who attended the House of Commons debate, and one of the reasons for the call for a not-spot summit, is the question of whether the government’s broadband strategy is working, as well as what BT’s future as the contracted party should be.

What about BT’s Future?

Let us elaborate a little on the BT question. The contract between BT and the government was worth a total of £830 million to the company and, although some local campaigners disagree, for the most part everyone else agrees that the UK is actually one of the leaders in Europe when it comes to broadband availability, implementation and pricing.

The main issue arising, though, championed by BT’s rivals, is the company’s continued use of copper. Many feel that this ‘betrays’ Britain’s broadband ambitions and that the roll-out should be all about fibre optics. Although BT does use fibre optics (upon which superfast broadband depends) as far as the street cabinet, from there the connection to homes runs over copper cabling.

To Copper or Not to Copper

BT has thus far been extolling the virtues of its copper network, claiming that not only has it proved adaptable, but that new technology (known as G.fast) promises to push speeds to over 300Mbps.

BT’s competitors are not convinced, though… and it all boils down to competition. They say that adopting fibre optics across the board will open up the industry to competition, which will then improve service and mean more money is invested in the network. Their answer is that BT should be split up by having to sell its OpenReach arm to allow such competition. BT obviously disagrees, stating that its current ‘innovations’ might stop occurring.

Ofcom is apparently looking into this and, at the moment, is ‘open-minded’.

The government itself would be against any splitting up of the company, citing the fact that, if it were to transpire, it would be incredibly time-consuming… with the potential to backfire.

It will be interesting to see how all of this pans out, especially in terms of the UK’s current not-spots. Until then, it will be fascinating to watch the two sides of the debate butt heads.

Monday, 12 October 2015

Personal Data Storage in Russia – the canary in the coal mine for cloud?

The new amendment to the Information Law No. 242-FZ forbids storage of Russian citizens’ personal data outside of Russia. The change has posed new challenges to many foreign and domestic companies which already store their users’ data in borderless clouds.

According to the Russian authorities, up to 2.4 million companies are affected. Despite the fact that Russia is the largest (80 million users) and fastest growing Internet market in Europe, the country has suffered from negative media spin in the past regarding strict online censorship. However, the larger picture – data sovereignty – is becoming a global trend and is creating a seismic shock in the cloud industry. In Canada, the government has requested that Microsoft store its sensitive data locally; in Spain, the government is looking at where the personal data of its student body is held. Perhaps Russia is at the forefront of the movement, which would explain some uncertainties still contained within the law. Nevertheless, businesses should not view the new government restrictions as impenetrable, but rather look at the ways in which technology can enable them to continue their relationships with the fastest growing online market in Europe.

One of the easier solutions is enlisting the help of an MSP (managed service provider) so that companies can host their data in Russia while leaving the rest of their operations uninterrupted. The benefit of cloud technology is that hosting data abroad is a much smoother process than it was even just a few years ago. Because the cloud enables organisations to host data in different parts of the world, they are able to serve a truly global customer base while complying with regional data laws. For larger projects there are now a number of trustworthy and professional data centre operators in Russia, already providing service to many multinationals.

The local Russian regulator, Roskomnadzor, has been very accommodating in working with international players who might need extensions in order to fully comply. However, failing to start a dialogue and ignoring the legal changes can prove disastrous, with many websites shut down within the first week.

IXcellerate is a local Russian data centre operator with its headquarters in London. Here are IXcellerate’s suggested five simple steps to help companies start an effective compliance process:

1)     Engage with Roskomnadzor if you have not already.  If you have only just started to look at compliance, it is strongly advised to start a dialogue with Roskomnadzor. If you can show a current Russian datacentre contract, it is likely you will be given an “extension to comply”.
2)     Find a reliable local partner to assist you with the process and involve the head office team in the selection. The personal data processing trend is not about to change, as governments are becoming more and more occupied with this topic. The choice of a reliable local partner is a strategic decision: changing it later will be hard and costly.
3)     Use existing import channels to move equipment.  Usually your Russia-based data centre will have a number of reliable and previously tested partners to recommend. These should be large local business integrators, or international suppliers who have a dealer network in the country.
4)     Manage complexity by transparent communication: make sure there is full understanding of the installation design by all parties involved.  Language barriers and complex terminology can create major problems between client and contractor in this regard.
5)     Don’t forget about after-migration support: the data centre team and other participating parties should be on stand-by after launch.  A properly-run data centre will have client service thoroughly specified with procedures, documentation, a 24-hour bi-lingual emergency telephone line in place and an online ticketing system to track status.

Guest blog written by Guy Willner, CEO of IXcellerate

Contact: Anna Kazaeva anna.kazaeva@ixcellerate.com

Thursday, 8 October 2015

Facebook – To Boldly Go …

Social network giant Facebook has teamed up with French satellite operator Eutelsat in a bid to boldly go where no social network has gone before. Star Trek connotations aside, the deal is a deadly serious attempt by Facebook to beam - from space - free internet to those parts of sub-Saharan Africa that are still without an internet connection.

The deal, set to kick in sometime during the second half of 2016, will offer access to a variety of services via Facebook’s internet.org initiative, including news, weather, health and, of course, Facebook itself - all free of charge.

Although large swathes of the African continent do have access to some form of internet connection - be that through mobile or fixed telecom networks - coverage is sketchy at best, and almost non-existent in the more sparsely-populated areas of this vast landmass.

Beamed Internet – More Star Trek?

When the operation does get up and running, the idea is for Facebook and Eutelsat to use capacity from the AMOS-6 satellite. This satellite from Spacecom, an Israeli company, is due in orbit by the end of 2015 and, all going well, will start beaming internet connections straight to the smartphones of Africans located in the east, west and southern portions of the continent.

The idea at the moment is to serve 14 of the most populous countries in sub-Saharan Africa, offering first-time internet to millions of people.

Silicon Valley Space Race Ends Before It Begins:

The news of the Facebook-Eutelsat tie-up follows recent reports that the social network giant has now abandoned its own attempts to build a satellite, which could have potentially cost the company up to one billion dollars. Rivals Google have also recently drawn back from plans to do something similar. 

It’s hard to see how the financials could be keeping such behemoths from their space-trotting fantasies, so one can only assume that the logistics of such an operation are beyond even these two.

Although Facebook’s internet.org initiative has come under fire from many quarters, due to a perceived violation of ‘net-neutrality’ principles, head of internet.org Chris Daniels said, “Facebook’s mission is to connect the world and we believe that satellites will play an important role in addressing the significant barriers that exist in connecting the people of Africa.”

He continued: “We are looking forward to partnering with Eutelsat on this project and investigating new ways to use satellites to connect people in the most remote areas of the world more efficiently.”

Grumbles:

In relation to the net-neutrality issues mentioned above, a consortium of advocacy groups recently released a statement which, among other things, mentioned, “It is our belief that Facebook is improperly defining net neutrality in public statements and building a walled garden in which the world's poorest people will only be able to access a limited set of insecure websites and services.”

In what looks like a response, internet.org last week rebranded its free offering to ‘Free Basics by Facebook’, a move designed, in its words, to “better distinguish the internet.org project itself from the service itself.”


Thursday, 1 October 2015

Digital Content Delivery Bombshell – Chicago Implementing ‘Cloud Tax’

To those of us residing in the United Kingdom, America’s system of taxation can be downright confusing – federal taxes, state taxes, city taxes … the list goes on. However, the latest taxation amendment in Chicago should have us all – and by all we mean anyone/thing connected to digital, including data centres – quaking in our boots.

Apparently, the city of Chicago has a whole raft of taxes in effect, one of them being an ‘amusement tax’ – basically, anything related to entertainment is taxed at nine per cent. Forbes have described this tax in a recent article as a tax “upon the patrons of every amusement within the city.” The city has recently ‘amended’ this tax to now include content-related services in the digital world. What do we mean by content-related services in this instance? Well, subscription streaming service Netflix is a good example. In the same article as mentioned above, Forbes describes this as “any paid television programming, whether transmitted by wire, cable, fibre optics, laser, microwave, radio, satellite or similar means.”

But this is where it gets interesting – the tax could also apply to a whole raft of cloud-service providers as well, anything from cloud apps all the way up to cloud infrastructure such as data centres. The irony is that a company which hosts its own content-based streaming service in the cloud is in real danger of being taxed twice – both as a provider and as a user!
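
A rough, hypothetical illustration of that double hit, assuming the 9% levy applies at both ends (all figures invented):

```python
AMUSEMENT_TAX = 0.09

monthly_cloud_bill = 10_000.00   # hypothetical: what the provider pays for hosting
subscription_price = 8.00        # hypothetical: what a Chicago subscriber pays

provider_cost = monthly_cloud_bill * (1 + AMUSEMENT_TAX)    # taxed as a cloud user
subscriber_cost = subscription_price * (1 + AMUSEMENT_TAX)  # taxed again as a patron

print(f"provider's hosting bill: ${provider_cost:,.2f}")    # $10,900.00
print(f"subscriber's monthly bill: ${subscriber_cost:.2f}") # $8.72
```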

What Effect Will This Really Have?

In trying to break this down, though, what effect will this really have on both users and providers? Well, for starters, it will no doubt put off new digital streaming start-ups from setting up shop in Chicago. The extra costs will simply not be worth it for smaller organisations. And should a new service provider bite the bullet and set up shop anyway, the end user is going to suffer through higher subscription rates. Chicago residents could end up paying considerably more for the same service received by users elsewhere in the States.

At the end of the day, why would any data centre, cloud content delivery service or streaming service set up in a city that is going to over-tax them for the privilege?

Where is this Going?

The worry for such services in the States, but especially data centres, is that this type of tax is going to spread like a contagion. Other cities (and states?) are bound to sit up and take notice, especially in light of the fact that the way content is processed and consumed digitally is evolving at a rate of knots.

The future of content delivery is entrenched in the digital; this means more providers will be required to deliver this content. In the end, it could mean the services we take for granted today being taxed to the hilt, putting them out of reach for many.

For the sake of the British data centre industry, let’s hope we do not suffer the same fate that could eventually sweep America. Fingers crossed!!!

Friday, 25 September 2015

Mitigating hostile vehicle attack at data centres

Data centres are critical in today’s society, with almost every aspect of our activities in some way reliant upon the internet and the transfer and storage of electronic data.  Governments, utilities, banks, businesses and the general public all rely on data centres for their Information and Communications Technology (ICT).

Cyber defence and security increasingly take priority over traditional forms of military defence.  A data breach or disruption can cripple or damage an organisation within minutes, so cyber security budgets continue to be increased to keep up with a changing and often unknown threat.

However, the physical structure of the data centre can also be considered a major target and cannot be overlooked.  Data centres exist in a variety of locations which present specific challenges in terms of their physical protection.  A physical attack on a building’s infrastructure could have similarly devastating effects to those of a cyber-attack; physical security is therefore a vital component of the overall security strategy that security managers need to consider.  Clients that process or store data need assurance that the sites are not vulnerable and that their data is safe.

The starting point with the protection of any site is a practical site assessment, which considers the security needs along with the business needs and any potential engineering constraints.  Once this information is gathered and a clear picture has been established, manufacturers work with the designers and end-users to develop a solution which provides not only the correct level of physical protection but also ensures that the control methodology meets the required levels.

Many aspects should be considered when physically protecting a site such as a data centre; these would include:

·        A clear understanding of the area being protected and the specific and vulnerable areas within that site

·        How the enforceable perimeter might affect the surrounding buildings in terms of collateral damage in the event of an attack, and also the location of existing services which may be affected

·        Vehicle access points and emergency access points

·        Types of vehicle, frequency of use and their potential speed of approach (Vehicle Dynamic Assessment – see the sketch after this list)

·        Vetting and identification of drivers

·        Operating procedures for the control measures – any system is only as good as its operator
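
In its simplest form, the Vehicle Dynamic Assessment mentioned above reduces to kinematics: a vehicle accelerating from rest over the available approach distance reaches roughly v = √(2as). A rough sketch with assumed figures:

```python
import math

acceleration = 3.0       # m/s^2 - assumed figure for a determined hostile vehicle
approach_distance = 50   # metres of straight, unobstructed run-up

impact_speed = math.sqrt(2 * acceleration * approach_distance)  # v^2 = 2as
print(f"impact speed: {impact_speed:.1f} m/s ({impact_speed * 3.6:.0f} km/h)")
# ~17.3 m/s (62 km/h): removing straight run-ups via landscaping cuts this directly.
```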

If the location is remote or has a large perimeter it often provides the opportunity to incorporate landscaping to mitigate the ability for a vehicle to gain entry and/or reduce the potential speed of approach for a vehicle.  Measures such as landscaping can alleviate the need to employ large scale and heavy duty HVM measures whilst providing critical vehicle ‘stand-off’ from the building/asset.

The perimeter of a site can be additionally protected with high security fencing incorporating intruder detection and CCTV.  Limiting vehicle access to the site is essential, and therefore serious consideration should be given to the number of access points provided.  These considerations should also take account of the needs of emergency vehicles and of vehicle reject lanes.  Any access point needs to be protected and controlled effectively by using products that meet the physical and operational requirements along with the aesthetic and engineering constraints that the site might pose.

Where a building is in a city centre or urban area and space around the building is at a minimum, other measures can help protect and increase vehicle ‘stand-off’; these might include planters, street furniture or static bollards.

Equipment used to control access to data centres, whilst still offering a high level of protection against hostile vehicle attack, includes:

·        Road blocker systems with a variety of options (deep/shallow and surface)

·        Automatic and static bollards

·        Sliding and hinged gates

·        Rising arm barriers 

·        Manual gates and barriers

·        Plus – a large range of non-rated products to complement and enhance site control

Guest blog written by Neale Ward, Sales Manager, Avon Barrier

Telephone: +44 (0) 117 953 5252  

Thursday, 24 September 2015

Businesses Told to Invest More to Stay Safe Online

As the threat of cyber-attacks continues to increase unabated, the government has been telling British businesses that it is in their own and, by extension, the British public’s best interests to increase spending on online security.  This comes on the back of a recent survey which discovered that an astonishing ninety per cent of big businesses and seventy-four per cent of small businesses have suffered some form of security breach within the last twelve months.

Making the UK the Safest Country in the World to do Online Business

Ed Vaizey, the current Minister for the Digital Economy, has backed the call for businesses countrywide to sign up to a government scheme known as Cyber Essentials, which is part of the National Cyber Security Programme. The primary aim of this programme is to educate businesses about cyber threats and, hence, help protect them.

“Good cyber-security underpins the entire digital economy – we need it to keep our businesses, citizens and public services safe,” Vaizey said. He also went on to say that, while the UK is a world leader when it comes to digital technologies and our use of these technologies, we must strive to become a global leader in cyber security, as well.

He concluded, “Trust and confidence in UK online security are crucial for consumers, businesses, and investors. We want to make the UK the safest place in the world to do business online and Cyber Essentials is a great and simple way firms can protect themselves.”

At the time of this writing, over a thousand businesses from across the UK have signed up for the Cyber Essentials scheme. These businesses now have access to advice and information on the current state of protection from cyber security threats. The government scheme is also looking to promote awareness via a whole range of advertising campaigns.

Helping to Protect the Future with Funding

Business is not the only entity benefitting from current government initiatives when it comes to tackling cyber security threats. Vaizey also recently announced a fund - to the tune of half a million pounds – which is earmarked for universities and colleges in an effort to help raise the awareness of cyber threats through innovative teaching methods.

All third-level education institutions can apply for a grant of up to £80,000; however, there are caveats. The institutions will need to match the funding provided and must produce ‘real-world impact across the discipline’.

The goal of the fund is to make sure that all students of higher-level education are afforded the chance to receive the type of high quality and advanced instruction that will ultimately give them the technical skills required to eventually help protect British business and government entities from cyber threats and cyber-attacks.

In the end, it is our hope that both the Cyber Essentials scheme and the third level education grant are ultimately successful in what they are trying to achieve. As we in the data centre industry are all too aware, cyber threats are very real and can cause untold damage if not kept in check.

Thursday, 3 September 2015

Building New Data Centres: Russian Construction Heating Up

Russia is by no means a dominant force in the worldwide data centre industry. Its entire market share, globally speaking, is less than 0.5%. However, things are changing in that part of the world. Russian construction of new data centres has been picking up over the last five years. Furthermore, the commercial sector has been growing at about 25% a year since 2010.

Russia's IT industry is readily embracing cloud and colocation services as a primary revenue driver for the future. At the end of last year, the commercial market was estimated at some £109 million, representing a 20% increase over the year before. Experts believe that growth will continue for the next 3 to 5 years at minimum, and perhaps longer. They expect the business-to-government market to do very well also.

2 Factors Driving Construction

Until just a few years ago, Russian businesses and consumers were content to utilise international data centres for their data hosting needs; however, that no longer seems to be the case. According to research conducted in Russia, there are two primary factors driving new data centre construction:

  • Natural Environment – As a whole, the natural environment in Russia is rather cool. The region is not known for blistering hot temperatures and extremely high humidity over long periods. Therefore, the power and cooling needs of the typical data centre are lower in Russia than they are in other parts of Europe and Asia. Russian companies are finding it more profitable to build domestically rather than going internationally. Companies outside of Russia are also finding the environment attractive.
  • Data Security Laws – Russia's laws regarding data security have been updated, including a recent regulation requiring that all personal data belonging to Russian citizens be stored on servers located in domestic data centres. As in Canada and a number of European countries, requiring data to be stored domestically allows security experts to better protect it. The new law has led to an instant increase in demand for data centre services.
We should note that Russian companies are more willing to embrace cloud computing now that they know it works in other parts of the world. This is yet another factor driving data centre construction.

Data Centres Equal Jobs

Russian officials are understandably excited about the fact that data centre construction is picking up in their country. Not only does new construction increase Russia's market share, but it also contributes to the recovery of Russia's struggling economy. Where there are new data centres, there are also new data centre jobs to be filled. Russia is very optimistic about the emerging IT sector it hopes will be able to compete on a global scale within the next 5 to 10 years.

All eyes will be on the Russian IT industry to see how it does from both a competitive and environmental standpoint. If the country can compete economically without sacrificing environmental responsibility, it should do very well.

Thursday, 27 August 2015

Ashley Madison and the Data Protection Act

More than a few eyebrows were raised earlier this year when hackers revealed they had breached the Ashley Madison adult dating site and stolen personal data relating to tens of millions of subscribers. Things were made worse when that data was finally dumped online a couple of weeks ago. The data dump has already led to two possible suicides as well as plenty of PR trouble for celebrities, politicians, and business professionals. It has even led the Information Commissioner's Office to issue a warning to journalists.

The Information Commissioner's Office (ICO) Group Manager for Technology, Simon Rice, published a blog post on the agency's website on 21 August letting it be known that accessing and publishing the Ashley Madison data dump is not automatically permitted simply by claiming the journalism exemption of the 1998 Data Protection Act. The ICO offers a detailed explanation of how and when the journalism exemption can be applied to personal data.

Rice says that in cases where the journalism exemption cannot be claimed, and that will be the case most of the time, accessing the data dump information becomes a violation of individual privacy and the Data Protection Act. Any publication of that data would be a further violation of the law. Rice encourages any journalist who believes the exemption applies to their activities to consult the Information Commissioner's Office before accessing or publishing the data.

Protecting Consumers and Their Privacy

When the government implemented the Data Protection Act in 1998, the purpose was to bring UK data protection law into line with the 1995 European Data Protection Directive. The goal of lawmakers was to prevent the invasion of privacy through the investigation, analysis, or publication of personal data by parties with no legal or legitimate need for that information. The Data Protection Act applies across the board: to individual data communications, website data mining, data centres and their day-to-day operations, and every other instance in which personal data is collected and stored.

The Information Commissioner's Office is currently working with Canadian officials to make sure the Data Protection Act is strictly adhered to in the UK in light of the Ashley Madison breach. They are determined that no illegal exposure of data will go unanswered. Hopefully, they will be able to make good on this commitment.

On a broader scale, the Ashley Madison hack should be a wake-up call to consumers all across the UK and, for that matter, the world. Although some may disagree with the intent and content of the Ashley Madison website, the activities of both the site's owners and its members are legal under Canadian law. A moral or philosophical disagreement with the content of the website is not sufficient reason for hackers to steal and publish personal information having to do with upwards of 37 million people.

This attack is less about Ashley Madison and more about the fact that we are all vulnerable to such malicious activity. If this can happen to Ashley Madison users, it can happen to each and every one of us.


Monday, 24 August 2015

Micro data centres: Rubik’s Cube of the Industry

Every once in a while a product comes along that breaks through industry standards.  In toys, for example, it’s not the usual suspects like Barbie or Monopoly – the biggest seller is the Rubik’s Cube, with some 350 million units sold.  In the data centre industry, it’s micro data centres.

Hundreds of millions of units is a stretch for now, but these new solutions are creating a buzz and there is speculation that they will be widely deployed.

Micro data centres are contained, secure computing environments from 1 to 100 kW. They ship in one enclosure and include all necessary power, cooling, security and associated data centre infrastructure management tools (DCIM). They also include all the storage, processing and networking necessary to run applications.

Another advantage — micro data centres are assembled and tested in a factory environment. This is, in part, because of their physical size, as virtualized IT equipment in cloud architecture that used to require 10 IT racks can now fit into one.

Servers, storage and networking equipment are also being integrated with software for more of an out-of-the-box experience. This all reduces latency, helping to meet business-critical needs, and speeds deployment for competitive edge and security. In many cases, micro data centres can utilize “sunk costs” in facility power (switchgear) and cooling (chillers or DX) to be more cap-ex friendly as well.

Micro data centres are ideal for colocation and can sit at the edge of network infrastructures. Use cases will be particularly relevant in manufacturing and retail, and in enterprise and industrial settings.

A Rubik’s cube has 43 quintillion different possible configurations; at one configuration per second, it would take about 1.4 trillion years to go through them all.  Data centres have fewer, but still untold, configuration possibilities, and businesses must meet demands as quickly as possible. With seemingly endless possibilities, micro data centres are the Rubik’s cube of our industry.
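
For the curious, the arithmetic behind that figure (one configuration per second assumed):

```python
configurations = 43_252_003_274_489_856_000   # exact number of cube states
seconds_per_year = 365.25 * 24 * 3600

years = configurations / seconds_per_year     # at one configuration per second
print(f"{years:.2e} years")                   # ~1.37e+12, i.e. about 1.4 trillion
```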


Guest blog by Steven Carlini, Senior Director, Data Centre Global Solutions, Schneider Electric