Tuesday, 22 December 2015

EU Officials Agree on New Data Protection Rules

In a move likely to have a decisive impact on how consumer data is stored and used, EU officials have finally agreed on a new set of data protection rules that will apply to the entire 28-member European Union. The new regulations are designed to be a replacement for the patchwork of individual rules that now exist from one country to the next. Although the rules are not yet official, they are expected to go through the necessary channels in the European Parliament and member countries sometime this week.

News reports say the new rules will force companies to pay very close attention to how personal data is used. Any company found to be misusing personal data could be fined as much as 4% of its global annual turnover. The rules apply to any company handling the personal data of EU residents, regardless of where it is headquartered.

It is unclear how the rules define misuse of data, and that is a concern for many. The new law could potentially affect everything from the local data centre to the myriad of hosting companies offering services throughout Europe. Even companies offering managed IT services could find themselves in trouble by engaging in practices that may be marginal under the new rules.

Another key component of the rules is a provision that forces companies to report any and all data breaches, regardless of severity. Again, this will apply as much to the local data centre as it does to the large corporate IT department.

Last but by no means least, the rules codify the right to be forgotten across the entire European Union. Once the rules are official, companies will have to obtain explicit consent from customers before using their data for any purpose other than the business being conducted between them. Many will also have to appoint a data protection officer to make sure that all data protection rules are being adhered to.

Businesses Will Be Affected

It is not possible to enact rules of this nature without impacting business, and in this case some businesses will be more negatively affected than others. Small companies will face the worst of it, having to stretch budgets even further in order to hire data protection officers and develop policies and procedures for keeping data secure. Larger companies will feel less of an impact from implementing those policies and procedures, but they could be more heavily damaged by fines in the event of violations.

The good news for European consumers is that the new rules, if enforced properly, will guarantee greater data privacy in the long run. It may even slow down the race to find out who can use Big Data to the biggest advantage at the consumer level. It will have no effect on cyber criminals who are intent on stealing data regardless of any rules put in place.

It would appear as though the EU is on the verge of enacting significant changes in consumer data protection. Now let's see if the rest of the world follows.



Wednesday, 16 December 2015

Google Announces Quantum Breakthrough… Is It Legit Though?

If you are a person who follows all things Google, you are probably aware of the search engine giant's recent announcement that it has successfully tested the D-Wave 2X quantum computing system it acquired through a joint purchase with NASA several years back. The new supercomputer is reportedly capable of solving certain problems as much as 100 million times faster than current single core technology.

Before you get overly excited about the potential of a computer being able to complete tasks faster than you can think, there are a couple of things to consider. Firstly, the company behind the D-Wave 2X has been roundly criticised within the industry for overstating the capabilities of its technology. Secondly, some of the tests Google used to achieve its astounding results were theoretical tests only. That said, Google is very pleased with what it has accomplished thus far.

Google officials say that they tested a quantum annealing algorithm that performs roughly 100 million (10^8) times faster than simulated annealing on a standard, single core CPU. Their tests focused on solving problems involving approximately 1,000 binary variables, according to Google director of engineering Hartmut Neven. He said in an official statement that quantum annealing “is more than 10^8 times faster than simulated annealing running on a single core.”

Quantum Versus Simulated Annealing

Annealing is a way of tackling a complex optimisation problem by exploring many candidate solutions in search of the best one, however many variables are involved. Simulated annealing does this on a conventional computer: it steps from one candidate solution to the next, occasionally accepting a worse option so it does not get stuck in a dead end, and gradually "cools" until it settles on a good answer. Quantum annealing pursues the same goal on quantum hardware, using quantum effects such as tunnelling to move through the solution landscape in ways a classical machine cannot.
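To give a feel for the classical half of that comparison, here is a minimal simulated annealing sketch in Python. It is purely illustrative, a toy Ising-style cost over 20 binary variables rather than anything resembling Google's actual benchmark, but it shows the accept-or-reject "cooling" step that quantum annealing replaces with tunnelling.

# Minimal simulated annealing sketch (illustrative only, not Google's benchmark code).
# It minimises a random Ising-style cost over a handful of binary variables by
# proposing single-bit flips and sometimes accepting worse moves, with the
# acceptance probability shrinking as the "temperature" cools.
import math
import random

random.seed(42)

N = 20  # number of binary variables (Google's tests used roughly 1,000)

# Random pairwise couplings define the cost landscape we want to minimise.
J = {(i, j): random.uniform(-1, 1) for i in range(N) for j in range(i + 1, N)}

def cost(state):
    """Ising-style energy: sum of couplings over pairs of +/-1 variables."""
    return sum(w * state[i] * state[j] for (i, j), w in J.items())

def simulated_annealing(steps=20000, t_start=5.0, t_end=0.01):
    state = [random.choice([-1, 1]) for _ in range(N)]
    best, best_cost = state[:], cost(state)
    current_cost = best_cost
    for step in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (step / steps)
        i = random.randrange(N)
        state[i] *= -1                     # propose flipping one variable
        new_cost = cost(state)
        delta = new_cost - current_cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current_cost = new_cost        # accept the move
            if current_cost < best_cost:
                best, best_cost = state[:], current_cost
        else:
            state[i] *= -1                 # reject: flip the bit back
    return best, best_cost

solution, energy = simulated_annealing()
print("best energy found:", round(energy, 3))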

Not only did the Google tests indicate that the D-Wave 2X can solve standard benchmark problems some 10^8 times faster than a single core processor utilising simulated annealing, but they also suggest the supercomputer can solve more advanced quantum problems more than 100 million times faster.

Practical Applications of the Technology

Now that Google has let the cat out of the bag, so to speak, the next obvious question is what practical applications this technology could actually be used for. Nothing comes to mind immediately. Do we need quantum equations for everyday tasks such as cloud computing and virtualisation? No, we don't. Nor does there seem to be a practical way to use the D-Wave 2X to improve data communications over long-range networks. Even if computers could solve problems that quickly, our current network infrastructure could not support those kinds of speeds.

There is no doubt that Google and NASA will continue looking at ways to put the D-Wave 2X to good use. In the meantime, a group of lab junkies will certainly be having a good time testing the capabilities of quantum annealing within the supercomputer environment. They will eventually figure out how to use it practically.



Wednesday, 9 December 2015

Mobile Industry and Internet of Things Set to Boost European GDP

The small and large companies that make Europe work already know how important mobile technology is to business success. We imagine they are very pleased with the latest GSMA (GSM Association) report that paints an even brighter future for mobile commerce. According to the report, both 4G technology and the Internet of Things (IoT) will be increasing their contributions to European GDP through to 2020.

The GSMA says the mobile industry will be contributing 20% more to European economies four years from now as more and more subscribers are switched to 4G platforms. In terms of raw numbers, the amount of money the mobile industry contributes to GDP will rise from €500 billion to €600 billion, according to the GSMA report. Furthermore, 4G platforms will account for as much as 60% of the total mobile communications traffic over the next 4 to 5 years. The GSMA expects 95% of mobile users to have access to 4G coverage by the end of the decade.

Europe is already the world leader in mobile data communications and IT services for mobile subscribers. Users in Europe enjoy faster download speeds and greater access to data and mobile applications than users in any other part of the world – including North America. There is no reason to believe that this will change in light of the investments the mobile industry continues to make.

The IoT Connection

The future of mobile communications is clear from the GSMA report. But how does the IoT tie into all of this? By networking together mobile devices and just about anything else that is connected to the internet. The GSMA report cites connected cars as just one example.

A number of mobile solution providers have adopted the GSMA Embedded SIM Specification for M2M, a technology that can be easily modified and adapted for use with smart meters and connected cars in a vehicle-sharing platform. By connecting smartphones and cars, users would be offered easy and instant access to transportation in city environments along with the ability to do everything that needs to be done to secure a car via their smartphones.

Some of that capability already exists on a somewhat limited scale. However, it is a piecemeal situation utilising a combination of mobile data communications and GPS monitoring. Connecting cars and smartphones through a single 4G platform makes the process more efficient and quite a bit faster.

Better Mobile Technology, Improved GDP

If there is any doubt that mobile technology is the economic engine of the future, the recent GSMA report should put any such questions to bed. We now live in a day and age in which mobility reigns supreme in everything from commerce to education to self-improvement. And as mobile data communications improve, the IoT will be part of that.

From IT services to personal networking, mobility is advancing at breakneck speed. If the GSMA is right, mobility will have a substantial economic impact across Europe for years to come. It will be exciting to watch it unfold.



Tuesday, 1 December 2015

Internet Access from a Light Bulb: Li-Fi Is Here

Imagine working in an office where turning on the lights also meant turning on internet access; an office where you could go online and do what you do up to 100 times faster than you currently do with wi-fi. It may sound like the stuff of futuristic films, but it is now reality. The world of li-fi has arrived and should be ready for consumers within the next couple of years.

Li-Fi is a technology that transmits computer data using the visible light spectrum rather than radio waves. It was first demonstrated by Professor Harald Haas of the University of Edinburgh in a TED talk in 2011. Since that early demonstration of an LED light transmitting a video, the technology has undergone further development that now makes it incredibly fast and comparatively reliable.
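As a rough illustration of the principle only (this is not Velmenni's implementation; real li-fi hardware modulates an LED millions of times per second with far more sophisticated schemes), the short Python sketch below encodes a message as on/off light samples and decodes it again at the "receiver".

# Toy illustration of li-fi's basic idea: bits become light-intensity samples
# (simple on-off keying). Purely conceptual; real systems are vastly faster.

def encode(message: bytes) -> list[int]:
    """Turn each byte into 8 light samples: 1 = LED bright, 0 = LED dim."""
    samples = []
    for byte in message:
        for bit in range(7, -1, -1):
            samples.append((byte >> bit) & 1)
    return samples

def decode(samples: list[int]) -> bytes:
    """Group the light samples seen by a photodiode back into bytes."""
    out = bytearray()
    for i in range(0, len(samples), 8):
        byte = 0
        for bit in samples[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

pulses = encode(b"li-fi")
print(decode(pulses))  # b'li-fi'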

The latest li-fi technology was recently tested by an Estonian start-up known as Velmenni. The company used a light bulb fitted with li-fi to transmit data at speeds of up to 1 gigabit per second (Gbps). According to Velmenni, speeds of up to 224 Gbps are theoretically possible based on its laboratory testing. Velmenni chief executive Deepak Solanki was quoted by the BBC as saying he hopes the technology will be ready for consumers “within three or four years”.

Making Light-based Technology Better

The idea of transmitting data via light is nothing new in principle. Since the earliest days of infra-red remote controls, we have been using light beams as a means of sending information from one point to the next. Indeed, the whole idea of optical fibre data communications is based on the fact that light can carry far more information, far more reliably, than electrical signals travelling through copper. Some of our fastest internet connections today are built on this understanding.

In terms of wireless communications, the story is similar. The limitation of the radio waves on which wi-fi depends is not their speed (they travel at the speed of light) but how narrow and congested that slice of the spectrum is. The developers of li-fi know this, and they know something else too: the visible light spectrum scientists have to work with is roughly 10,000 times larger than the radio spectrum we currently use for wi-fi. This means that, when li-fi is finally ready for commercial and individual use, it will be a long time before we run out of usable space in the light spectrum.

There are drawbacks to the technology that Velmenni and others acknowledge. First and foremost, it cannot realistically be used outside because natural sunlight interferes with data transfer. Second, light does not pass through walls or floors, so it is only feasible in enclosed spaces. Those two issues notwithstanding, it could replace traditional wi-fi in office environments, restaurants and cafés, and other public spaces where wi-fi is currently used for public internet access.

The age of li-fi is here. It is only a matter of time before transmitting data with a receiver and a few LED light bulbs will be the norm.