Tuesday, 17 September 2019


A DATA-CENTRIC APPROACH TO MANAGING DATA CENTRES

“The number of data centres owned and operated by data-centre landlords, cloud services and other technology firms is expected to increase to roughly 9,100 this year, up from 7,500 last year, and to reach 10,000 by 2020, IDC estimates.”  Source: The Wall Street Journal.

If all other data centres, including hyperscale and enterprise facilities, were added, the total figure would be far higher still.  Businesses around the world rely on data centres being available.  There is also a growing focus on the environment and climate change, which brings greater emphasis on efficiency and carbon-neutral designs – and, with it, yet more complexity to manage.

There is a reason that DCIM hasn’t been replaced with something new.  It has had a bad rep for many reasons, but it is necessary to help us manage ever more complex, hybrid environments, and so it has to evolve.  It needs to connect to facilities systems, network systems and IT systems, and orchestrate changes as they are required.  DCIM can no longer be reduced to mere monitoring.  Perhaps the metamorphosis of DCIM should more accurately be called DNIO: Data centre, Network and Infrastructure Orchestration.

DCIM is now moving into the IT stack and integrating with systems such as ITAM, CMDB and cloud-based systems.  It now offers the ability to analyse data across sites and provide AI-based solutions for controlling the data centre throughout the IT stack – from the BMS through to application performance.

One of the hardest elements of a DCIM implementation has been integration: figuring out how processes and procedures should work, and then how to automate them.  In the past, this integration piece has been technically or financially challenging, seen as scope creep, or actively discouraged by a vendor or stakeholder.

What is really required is an open integration suite that would allow enterprises to pull their own bespoke solutions together, without racking up expensive development bills. It seems this vision is slowly becoming a reality after some M&A activity in the DCIM space, and clients and vendors steadfastly staying the course behind the DCIM vision.

This brings with it a different way of looking at managing the data centre: it’s a data-centric view.  Instead of worrying about whether an integration is possible, it’s reasonable nowadays to assume that it is. Therefore, it is possible to design the system in the most efficient way and make use of automation where it makes sense.

Here are six encouraging areas of progression where more integration is enabling positive leaps forward:

Broader scope of infrastructure managed by DCIM:

The links to CMDB, ITAM and other systems on the IT side are bringing more data analysis opportunities, with a broader scope of data points.

Use of Artificial Intelligence:

AI is being used more readily in a number of areas within the DC – for example, cooling optimisation and security.  AI can learn normal network behaviour and detect cyber threats based on deviations from that behaviour.
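
As a simple illustration of this deviation-based approach (a minimal sketch, not any vendor’s actual detection engine), the Python snippet below learns a baseline from historical traffic volumes and flags readings that stray too far from it. The traffic figures and the three-sigma threshold are illustrative assumptions.

```python
import statistics

def build_baseline(samples):
    """Learn 'normal' behaviour from historical traffic volumes (MB per minute)."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(reading, mean, stdev, threshold=3.0):
    """Flag a reading whose z-score exceeds the threshold as a potential threat."""
    if stdev == 0:
        return False
    return abs(reading - mean) / stdev > threshold

# Illustrative, synthetic traffic history (MB per minute)
history = [52, 48, 50, 55, 47, 53, 49, 51, 50, 54]
mean, stdev = build_baseline(history)

for reading in [52, 58, 410]:   # 410 simulates a sudden exfiltration-style spike
    print(reading, "anomalous" if is_anomalous(reading, mean, stdev) else "normal")
```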

Open platform approach:

Instead of a siloed approach, both internally and externally, the data-centric view of the DC should take priority, which means that IT, Facilities and vendors all work together.

SDK / Open API:

A number of vendors are providing SDKs or Open APIs, which is a good step towards making integrations between systems work, and shows that they are open to working with other companies.
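
To make that concrete, here is a hedged sketch of what such an integration might look like: pulling asset records from a DCIM system’s (assumed) open REST API and mirroring them into a CMDB. The endpoint paths, field names and token are hypothetical placeholders, since every vendor’s API differs.

```python
import requests  # third-party HTTP client

DCIM_URL = "https://dcim.example.com/api/v1"      # hypothetical DCIM endpoint
CMDB_URL = "https://cmdb.example.com/api/assets"  # hypothetical CMDB endpoint
HEADERS = {"Authorization": "Bearer <token>"}

def fetch_dcim_assets():
    """Pull rack-mounted assets from the DCIM system's (assumed) open API."""
    resp = requests.get(f"{DCIM_URL}/assets", headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

def push_to_cmdb(asset):
    """Mirror a DCIM asset record into the CMDB so both share one source of truth."""
    record = {
        "name": asset["name"],
        "serial": asset.get("serial_number"),
        "location": asset.get("rack_location"),
    }
    requests.post(CMDB_URL, json=record, headers=HEADERS, timeout=30).raise_for_status()

if __name__ == "__main__":
    for asset in fetch_dcim_assets():
        push_to_cmdb(asset)
```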

CMDB and Asset Management:

There is a recent move to focus on asset management and to align assets in ERP systems too, to provide a single source of truth.  From a data centre perspective, having assets managed well is an essential building block for DCIM and data centre management.

Processes and Procedures:

Data centre operators are viewing the system as a whole and are finding areas where technology can automate processes.  For example, adds, moves and changes can be streamlined, saving around 30% of resource time by using accurate DCIM data and integrated workflows.

In a world where IT systems are becoming more distributed, and IoT is making its mark, data centres must take a data-centric approach to managing the system of systems housed under their roofs.  Siloed thinking no longer has a place in the modern data centre: DC and IT managers need to work together, alongside a multitude of vendors who also need to align and integrate their offerings to clients’ needs.

This open platform approach, enabling integration, brings many benefits to life.  An integrated workflow capability facilitates automation, reducing the resource time required for operational tasks.  With more visibility of systems, capacity management from the CRAC unit through to ports in the meet-me rooms becomes a reality, allowing the DCIM to assist with intelligent commissioning of new assets and patching routes.  Energy optimisation now involves data from the servers themselves, allowing workloads to be shifted when compute requirements are low so that a server can potentially stand down.
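
As an illustration of the workload-shifting idea, the sketch below estimates how many servers could stand down overnight if their workloads were consolidated. The utilisation figures and the 80% headroom threshold are assumptions for illustration only, not a recommended policy.

```python
import math

def hosts_that_can_stand_down(utilisation, headroom=0.8):
    """
    utilisation: list of per-server CPU utilisation figures (0.0-1.0).
    Returns the number of servers that could be powered down if workloads
    were consolidated, keeping the remaining servers below `headroom`.
    All figures are illustrative assumptions, not measured data.
    """
    total_load = sum(utilisation)
    servers_needed = max(1, math.ceil(total_load / headroom))
    return max(0, len(utilisation) - servers_needed)

# Example: ten servers idling at roughly 15% load overnight
overnight = [0.15] * 10
print(hosts_that_can_stand_down(overnight))  # -> 8, so eight servers could stand down
```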

With this data-centric approach, the return on investment should not only be better, it should come in sooner as well.  The software-defined data centre is now in view. 


Guest Blog written by:



Assaf Skolnik, CEO, RiT Tech



Venessa Moffat, Head of Product Marketing, RiT Tech

Marketing, Strategy and Growth Hacking specialist, with 20 years’ experience in the Data Centre and tech industries. Venessa holds a BSc in Computer Science, a Post Grad Diploma in Business Administration, as well as an MBA from Essex University, where she specialised in agile IT architectures for maximum business value. She has successfully led strategy development and implementation programmes in multiple international data centre organisations. 



Tuesday, 16 July 2019


LEFT IN THE DARK – WHAT IS THE CHANCE OF A UK-WIDE ELECTRICITY BLACKOUT?

In the middle of June, nearly 50 million people across South America were plunged into darkness after a massive power failure wiped out supplies across virtually all of Argentina, Paraguay and Uruguay.  Could something similar ever happen here in the UK and, if so, what’s likely to cause such a fundamental failure?

The source of the blackout was said to be an issue with two 500 kV transmission lines that disrupted electricity from the Yacyretá hydroelectric plant.  Alleged system design flaws then turned what should have been merely a localised problem into a complete grid failure, branded “unprecedented” by Mauricio Macri, the President of Argentina.

Our new investigation, The Blackout Report, explores the likelihood of a UK-wide electricity network failure and what the consequences of such a severe incident could be. While data centres are probably as well prepared as any business, with built-in redundancy and backup supplies in the form of UPS systems and generators, they certainly wouldn’t be immune to severe disruption.

We discovered that high-level contingency planning puts the chance of a complete power grid shutdown within the next five years at 1 in 200. That may sound very unlikely, but for context the average Brit has a 1-in-240 chance of dying in a road accident over the course of their lifetime – so a shutdown is certainly not out of the question.

So, what are the biggest threats to the electricity supply here in the UK?

• Climate Change & Extreme Weather

The top 10 hottest years recorded in the UK have taken place since 1990, while sea levels around the coast rise by 3mm a year as warm water expands and ice caps melt.

In the coming years, the effects of climate change mean we’re likely to experience more weather at the extreme ends of the spectrum – torrential rain, storm-force winds, scorching heatwaves and prolonged cold snaps.

Such weather events pose a significant threat to the network.  Winds bring down trees that take out transmission lines. Floods damage crucial infrastructure and make it harder for engineers to fix faults.

There are numerous examples of such severe weather here in the UK: the Great Storm of October 1987; the 2013 St Jude Storm, which left 850,000 homes without power; and the floods caused by Storm Desmond in winter 2015-16.

We’re likely to experience far more of these sorts of incidents in the future.

• Space Weather

“Space weather” collectively describes a series of natural phenomena from space, including solar flares and geomagnetic storms originating from the Sun, as well as asteroids and meteors.

Because of modern society’s reliance on GPS and other satellite signals, the potential impact of any space weather incident is huge – even a weak solar flare can knock satellites out of action.

The biggest ever incident of space weather recorded on Earth took place in 1859. Named after astronomer Richard Carrington, the Carrington Event was a massive magnetic storm that disrupted telegraph systems and electrical equipment.

Today, there’s a 1% annual probability of a repeat occurrence of such an event.

Back in 1989, a smaller storm took down the Hydro-Québec electricity network in Canada, leaving nine million people in the dark for up to nine hours.

• Accidents & Systems Failures

There is a wide range of events that could fall under this category: a component failure or software crash, basic human error, or accidental fires and explosions.

In reality, most of these incidents will have an impact limited to a specific location. Even so, they could cause disruption to significant numbers of businesses, services and people.

• Infrastructure Attacks

The threat of terrorism – in its many forms – is something the UK is all too familiar with. Various state and non-state agents could deliberately target a country’s power supplies using explosives or other means to destroy essential infrastructure such as transmission lines or electricity substations.

In recent years terrorists have carried out major attacks on energy infrastructure in places such as Algeria and Yemen, while this spring anti-government forces were said to have taken out one of Venezuela’s hydroelectric plants, contributing to a blackout that left 30 million residents without electricity.

• Cyber-Attacks

You’re probably aware of the incident just before Christmas 2015, when Russian hackers used special malware to shut down 30 substations in Ukraine, leaving 250,000 people without electricity. But did you know the network here in the UK was also compromised on 7 June 2017, the eve of the General Election?

This spring also saw the first US case of electricity-related cyber hacking, with grid control systems in California and Wyoming penetrated.

These days, it’s not just an elite band of state-sponsored hackers that poses a threat. Anyone armed with a laptop and a degree of know-how could use high-grade malware to launch a potentially harmful attack.

The UK’s energy network is shifting fundamentally to smart grids, while our day-to-day lives are dominated by supposedly ‘smart’ devices such as virtual assistants, smartphones and energy meters.

These trends offer hackers many more vulnerabilities to exploit. Could hackers gain access to thousands – potentially millions – of smart devices, powering them up in the middle of the night when the grid isn’t prepared for such a power surge?  Or, more subtly, could incorrect data be fed back into smart grids, either inflating or understating the real demand for electricity?

The Blackout Report is free to download from www.theblackoutreport.co.uk

Guest blog by Leo Craig, General Manager of Riello UPS Ltd



Tuesday, 15 January 2019

LOOK FORWARD TO 2019 BUT DON’T LOSE THE LESSONS OF THE PAST


The beginning of every new year is the time for predictions, and NTT Group have been sharing their thoughts on what will affect the business world over the next year or so.  In particular, they have focused on digital transformation and the impact this is having on how we work, live and play.

However, we mustn’t lose sight of the basics as we build our resilient cyber defence architecture. The digital agenda is a pressing one for all businesses and one that they cannot afford to ignore – the customer is king, and the General Data Protection Regulation (GDPR) puts increased pressure on the board to ensure that not only business data but personal data too is secure.

So, while we stand by our predictions, it is also advisable to reflect on some of the basics that we continually see overlooked by organisations as they try to protect their business from constantly evolving cyber threats:

1. Assess the baseline

With an increasing focus on “platforms”, it is crucial that any platform fits into a resilient cybersecurity architecture and works efficiently to reduce potential threats and vulnerabilities. Performing a baseline assessment will ensure the correct security foundations are in place and help you get the best from your security investments.

2. Scan the environment 

One of the most important basic practices is vulnerability scanning, but running a vulnerability scan on its own is not enough. The results should be analysed and assessed against your critical assets.  This approach ensures that risks are put in context and that valuable resources are focused on mitigating the right risks.
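
A minimal sketch of that contextual approach might look like the following, where raw scan severity is weighted by an assumed asset-criticality rating so that the most business-critical exposures rise to the top. The hosts, scores and weighting scheme are illustrative, not NTT Security’s methodology.

```python
# Hypothetical scan findings: (host, CVSS base score)
findings = [
    ("hr-laptop-17", 9.1),
    ("payments-db-01", 7.4),
    ("test-vm-03", 9.8),
]

# Assumed asset criticality ratings (1 = low, 5 = business-critical)
criticality = {"payments-db-01": 5, "hr-laptop-17": 3, "test-vm-03": 1}

def contextual_risk(host, cvss):
    """Weight raw severity by how critical the asset is to the business."""
    return cvss * criticality.get(host, 2)  # assume medium criticality if unrated

# Work the highest contextual risk first, not simply the highest CVSS score
for host, cvss in sorted(findings, key=lambda f: contextual_risk(*f), reverse=True):
    print(f"{host}: CVSS {cvss}, contextual risk {contextual_risk(host, cvss):.1f}")
```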

3. Plan for a breach

Incident response plans are critical for minimising the impact of a breach. Complex cyber threats are difficult and time-consuming to unpick and may require specialist knowledge and resources to resolve comprehensively. By having a well-defined plan, testing it regularly, and recognising that security incidents will happen, organisations will be better prepared to handle incidents in an effective and consistent way.

4. Collaboration 

Most businesses recognise the shortage of cybersecurity skills, and the industry as a whole is collaborating more. We work closely with our technology partners and with industry and government bodies to share intelligence. We now focus on prediction and prevention to get ahead of potential threats. Collaboration will allow businesses to actively manage threats before they have an impact.

5. Support the basics 

Clearly cybersecurity is now on the board’s agenda, but we need to ensure that everyone is aware of the risks. In our digital economy, cybersecurity is everyone’s responsibility.  This is why we support training and education programmes to ensure that everyone supports the basics of cybersecurity.

6. Reduce the noise

There is the potential for huge amounts of data to be collated and analysed across the enterprise. Focus should be on the quality of this data and on reducing false positives. Too often organisations are drowning under a wealth of un-actionable security data. Technologies aren’t configured correctly or are simply too complex to manage effectively. Configuring, tuning and managing security technology, either directly or through a trusted partner, is also a basic requirement that many organisations are failing to master.

So, while we always start to look forward at this time of year, we should not lose the lessons of the past, and we should ensure that we get the basics right.

About NTT Security:

NTT Security is the specialised security company and the centre of excellence in security for NTT Group.  With embedded security we enable NTT Group companies (Dimension Data, NTT Communications and NTT DATA) to deliver resilient business solutions for clients’ digital transformation needs.  NTT Security has 10 SOCs, seven R&D centres, over 1,500 security experts and handles hundreds of thousands of security incidents annually across six continents.

Guest Blog written by Garry Sidaway, SVP Security Strategy & Alliances, NTT Security

Friday, 16 November 2018

LOOKING BEYOND INITIAL SPEND

The draft withdrawal agreement in relation to Brexit – set out earlier this week (14 November 2018) by UK Prime Minister Theresa May – has been approved by the UK cabinet and is now awaiting approval from MPs, followed by the 27 other European Union members.

However, with still no trade deal in place, the strength of the UK economy is under serious threat. Businesses are having to remain vigilant when it comes to capital investment, and realising long-term strategies in this extremely volatile market is becoming untenable.

Unstable trading stocks, a global economic slowdown and the prediction of a dramatic drop in the pound are making it inherently difficult to establish true investment values, especially in the case of total cost of ownership (TCO). Defining the TCO for a capital investment must take into consideration all environmental market factors, but with very few reassurances from the government on the energy climate, it is unsurprising that considerable caution is being taken where any type of investment is concerned.

As a prominent and highly influential power protection specialist, Power Control Ltd knows only too well how the cloudy outlook of the country’s economic future can impact businesses. Commenting on this subject, Power Control’s managing director Martin Hanson said: “Buying behaviours towards UPS investment have changed significantly over recent years. It has become apparent that owner/operators are having to account for more complex physical environments in terms of sophisticated data storage, whilst also considering much longer-term financial impacts of their investments.

“The approach to initial spending has changed. It seems that decision makers are becoming shrewder when it comes to investing and forecasting TCO. Looking at UPS investment in particular, business owners cannot afford to be flippant. The number of power disturbances continues to rise, making mains power sources more volatile. This inherently leads to data loss and can cost companies millions of pounds in lost revenue.

“So not only are there pressures to select the most technologically suitable solutions but the need to make the best long-term commercial decisions is becoming increasingly crucial. Despite the economic pressures, resilience must remain the top priority when it comes to selecting UPS.”

Leading UPS manufacturers have anticipated the need for resilience, greater efficiency and more flexibility and have responded with advanced technologies that achieve the highest criteria levels.

Take solid-state UPS, for example – these systems have been at the root of power protection for many years, and where once their efficiencies were poor, advances in technology now mean these models boast ultra-high efficiencies combined with unfailing power protection.

It is the evolution of modular UPS that has muddied the waters further when it comes to power protection selection. In recent years the term ‘modular’ has been making big waves in the UPS industry; modular systems offer a flexible and scalable approach when it comes to UPS investment.

Modular UPS systems also present reduced operating costs and easier overall maintenance. Engineering works can be quickly undertaken and can mean a more reliable power supply.

Additionally, the modular approach offers a smaller footprint, greater flexibility, easy manageability, inherently greater availability, and scalability throughout its operational life.

A glowing outlook for modular UPS so far, but this would not be a fair evaluation without considering resilience – a subject that is very often oversimplified, to the detriment of the end user.

Modular UPS allow for redundancy through spare modules, so it is important that the system is prudently monitored to ensure spare modules are available at all times: if all modules are in use, the redundancy is lost and there is no capacity left for backup. This simplistic view of the protective nature of modular UPS leads many to question how resilient a modular solution can really be, and whether it is worth the risk.
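
A simple worked example makes the point. The sketch below checks whether a modular UPS still has a spare (N+1) module for a given load; the module rating and load figures are assumptions chosen purely for illustration, not a vendor specification.

```python
import math

def redundancy_status(load_kw, module_kw, modules_installed):
    """Check whether a modular UPS retains N+1 redundancy for the given load."""
    modules_needed = math.ceil(load_kw / module_kw)   # N: modules required to carry the load
    spare = modules_installed - modules_needed
    return f"N+{spare} redundant" if spare >= 1 else "redundancy lost"

# Example: five 50 kW modules installed (illustrative figures only)
print(redundancy_status(load_kw=180, module_kw=50, modules_installed=5))  # N+1 redundant
print(redundancy_status(load_kw=230, module_kw=50, modules_installed=5))  # redundancy lost
```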

It is important to remember that UPS manufacturers design, develop and manufacture power protection solutions to do exactly that – deliver reliable resilience. Other features such as industry leading efficiency, operational performance and flexibility are all additional benefits that come with investing in leading edge technology.

Specialists in the industry are urging businesses to approach UPS investment judiciously, by looking at the complete power protection landscape, environmental factors and physical infrastructure. This will deliver a solution that is exactly what a business needs not just now but in the future with a clear TCO outlook.

Guest blog written by Rob Mather, Solutions Director, Power Control. 

For more information please visit www.powercontrol.co.uk, email info@powercontrol.co.uk or contact Becky Duffield on bduffield@powercontrol.co.uk / +44 (0) 7402 113222


Alternatively please visit https://powercontrol.co.uk/product-category/ups-systems/ for specific product information or email Power Control’s solutions director direct at rather@powercontrol.co.uk

Thursday, 9 August 2018

Tokyo Data Centre Fire Kills 5 and Injures Dozens

A fire that filled the Tokyo sky with thick, black smoke late in July 2018 has tragically resulted in five fatalities and dozens of injuries. The blaze extensively damaged a building believed to be a data centre, possibly belonging to Amazon Web Services (AWS).

Fire officials were unable to confirm ownership of the building due to confidentiality restrictions, but numerous Japanese news outlets claim to have been told by industry insiders that Amazon is the owner. Construction on the incomplete building began in 2017 and was expected to finish by October 2018.

A Devastating Fire

The fire, which occurred in the Tokyo suburb of Tama, began during the early afternoon hours on 26th July. It is believed that the blaze started in the third of four basement levels. The building has a total of seven floors – four underground and three above.

Reports say that some 300 workers were on site when the fire broke out. Unfortunately, four bodies were found in the basement and a fifth on the third floor above ground. In addition to the deaths, a total of 50 workers were treated for injuries. Nearly two dozen are said to be in serious condition.

As fires go, this one was particularly devastating in that it raged for eight hours. Reports say that one-third of the building suffered damage. However, assessments are still ongoing more than a week after the blaze. As for Amazon's ownership of the building, it has still not been confirmed. Amazon has been contacted by both Japanese and American news organisations but has yet to respond.

Fire officials have still not released the exact cause of the blaze pending the outcome of their investigation. However, initial reports suggest that workers cutting steel beams in the third basement level may have ignited urethane insulation materials. One news report out of Tokyo indicated that fire investigators are considering professional negligence among steelworkers as the main cause of the fire.

Amazon in Japan

Speculation of Amazon's ownership of the damaged building is fuelled in part by the success the company has enjoyed in Japan. AWS first entered the Japanese market with a data centre built in 2011. They followed that with a second installation in Singapore. According to The Stack, AWS maintains a concentration of four ‘availability zones’ in the greater Tokyo area, rivalling their operations in northern Virginia.

From a business standpoint, AWS is doing very well in Japan. The number of customers accessing AWS services has increased some 500% over the last five years. Experts attribute the company's success to deals with Sony and a number of big names in the Japanese financial sector.

The fire in Tokyo is truly a tragedy for the dozens of families affected by it. Investigators will hopefully pinpoint the exact cause of the blaze and make recommendations as to how future incidents can be avoided. In the meantime, all eyes are on Amazon to see if they will offer any kind of official response.



Monday, 2 July 2018

AI-managed data centre thermal optimisation – it’s nearer than you think

Despite best efforts, even the best-run data centres still have cooling and thermal management issues. With cooling now representing around 30% of a data centre’s operating cost, it’s more important than ever for organisations to be focused on thermal optimisation.

For true cooling optimisation, however, data centres need to go further and get more granular. When a data room is carefully mapped with appropriate thermal data fields, a whole new level of understanding and cooling efficiency is possible. This inevitably means monitoring and reporting temperature and cooling loads more actively – ideally in real time.

With ASHRAE now suggesting as many as three temperature sensors per rack, achieving this level of sensing would typically require around 10x more sensors than are currently deployed in today’s data centres. Unfortunately, it’s still rare for data centres to sense to this level. That’s probably a key reason why, when we analysed over 70 major data centres, we found that 11% of racks weren’t actually ASHRAE thermally compliant. That’s a problem because, without comprehensive sensing, you really can’t determine which of your business-critical racks are compliant and which aren’t.
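
To illustrate what rack-level compliance checking involves, here is a minimal sketch that tests each rack’s inlet-temperature readings against the ASHRAE recommended envelope of 18–27°C. The rack names and sensor readings are synthetic, illustrative values.

```python
# ASHRAE recommended inlet temperature envelope (degrees C)
ASHRAE_MIN, ASHRAE_MAX = 18.0, 27.0

# Synthetic readings: rack -> inlet temperatures from (up to) three sensors
racks = {
    "A01": [21.5, 22.0, 23.1],
    "A02": [24.8, 26.2, 28.4],   # top-of-rack sensor running hot
    "B07": [19.0, 20.2, 21.0],
}

def compliant(readings):
    """A rack is compliant only if every sensor sits inside the envelope."""
    return all(ASHRAE_MIN <= t <= ASHRAE_MAX for t in readings)

non_compliant = [rack for rack, temps in racks.items() if not compliant(temps)]
print(f"{len(non_compliant)} of {len(racks)} racks non-compliant: {non_compliant}")
```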

Achieving entirely new levels of data centre thermal compliance:

To address this, organisations need to work out how to build a detailed, rack-level map of their data centre estate that displays all of their cooling and thermal performance in real time.  It’s only by combining this kind of granular cooling and thermal data with smart monitoring and analysis software that organisations can start to track their data centre cooling loads in real time – valuable intelligence that enables thermal optimisation decisions to be made.

Achieving this kind of true thermal optimisation requires a proven, safe process based on thousands of real-time sensors and expert spatial models that combine to remove the uncertainty from data centre cooling. Until recently the market cost of sensors was a barrier; however, the latest Internet of Things (IoT) enabled sensors make this possible for less than 20% of the cost of a single traditional cooling unit. For the first time, this level of sensor deployment is accessible.

By combining this kind of sensor installation with the real-time optimisation capabilities of the latest 3D visualisation and monitoring software, you can now not only ensure ASHRAE compliance across your entire data centre estate, but also start to unlock significant data centre cooling energy savings.

Data centres are still spending too much on cooling:

With today’s typical cooling unit utilisation rates averaging only 34%, the reality is that organisations are still spending far more than they need to on expensive data centre cooling systems. To address this, data centres need to become much more precise in their operation, and they certainly shouldn’t have to apply space, power, cooling and other inputs uniformly across their data rooms.

It’s only when data rooms are carefully mapped with all the appropriate data fields that these new levels of understanding and efficiency become possible. To do this properly we estimate that more than 1,000 sensors are required for the typical data centre, enabling the measurement of a range of previously unknown factors including energy usage, heat outputs and airflow (above and below floors) – exactly the kind of information you’ll need to evolve towards the next generation of data centre AI applications.

Delivering true cooling optimisation:

Once this real-time, rack-level data is collected and analysed by a 3D spatial model, specialist software can start to determine the quality of a location, identify what needs to be done to improve that quality, and even warn operators of specific areas that are at risk.

Having access to real-time, rack-level data provides exactly the data platform needed for the kind of software-enabled real-time decision-making and scenario-planning capabilities that data centres need if they’re to evolve towards true cooling optimisation – effectively removing the uncertainty from data centre cooling and ensuring that all of your racks remain ASHRAE thermally compliant.

A logical next step is to combine real-time sensors with intelligent software to offer in-depth dynamic simulations that can visualise thermal change within data centres.

This is an important step on the journey towards true, AI-managed, precision data centres, providing a foundation for the creation of intelligent feedback loops that analyse airflow data into ‘Zone of Influence’ modules which, when combined with standard BMS systems, enable automated zone-by-zone data centre cooling. After this comes the addition of true ‘What if?’ scenario analysis, using monitoring data to learn and predict data centre performance.
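
As a highly simplified illustration of such a feedback loop, the sketch below averages the sensor readings in each ‘Zone of Influence’ and nudges that zone’s cooling setpoint towards a target. The zone names, setpoints and target temperature are hypothetical; a real deployment would write the result back to the BMS rather than printing it.

```python
TARGET_C = 24.0     # assumed target average inlet temperature per zone (degrees C)
STEP_C = 0.5        # setpoint adjustment per control cycle

def adjust_setpoint(sensor_temps, current_setpoint):
    """Return a new cooling setpoint for a zone based on its average sensor reading."""
    avg = sum(sensor_temps) / len(sensor_temps)
    if avg > TARGET_C + 0.5:
        return current_setpoint - STEP_C   # zone too warm: cool harder
    if avg < TARGET_C - 0.5:
        return current_setpoint + STEP_C   # zone over-cooled: relax and save energy
    return current_setpoint

# Hypothetical zones, sensor readings and setpoints; a real deployment would
# write the new setpoint back to the BMS/CRAC controller rather than print it.
zones = {"zone-1": [25.1, 25.6, 24.9], "zone-2": [22.1, 22.4, 22.0]}
setpoints = {"zone-1": 22.0, "zone-2": 22.0}
for zone, temps in zones.items():
    print(zone, "->", adjust_setpoint(temps, setpoints[zone]))
```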

This software-driven thermal optimisation approach could provide a platform for the kind of real-time decision-making and scenario planning capabilities that organisations will inevitably require as they transition towards AI-managed thermal optimisation within their data centres.
Guest blog by Dr. Stu Redshaw, Chief Technology Officer, Data Centre Optimisation, EkkoSense





Thursday, 7 June 2018

'Your Data Matters' and the Era of GDPR


The government has been working with the tech sector over the last two years to prepare for the implementation of the General Data Protection Regulation (GDPR). Now that the regulations are in full force, the government is turning its attention to educating consumers about what it all means – hence the ‘Your Data Matters’ campaign.

Those familiar with the GDPR know that the new rules are the domain of the Information Commissioner's Office (ICO), a government organisation that played a vital role in developing the rules. Under the GDPR, people have more control over how their personal data is used, stored and shared by organisations they encounter online.

Unfortunately, implementation of the GDPR did not involve a whole lot of effort to educate people about how they can exercise greater control. That's what the “Your Data Matters” campaign is all about. According to the ICO, the campaign is a "collaborative public information campaign" that combines both government and private sector resources.

Understanding Your Rights under the Law


In announcing the “Your Data Matters” campaign, Information Commissioner Elizabeth Denham explained that we all leave behind a digital trail every time we go online. Whether we are transacting business with a private sector company or keeping in touch with family members and friends on social media, the digital trail is lengthened with every online interaction.

"We know that sharing our data safely and efficiently can make our lives easier, but that digital trail is valuable. It’s important that it stays safe and is only used in ways that people would expect and can control," Denham said.

According to an official ICO news bulletin, the government agency is now collaborating with a number of public and private sector organisations to produce educational materials for the “Your Data Matters” campaign. Participating organisations can distribute the materials to their customers.

The ICO has also gone social by launching a new Twitter account to go along with its existing @ICOnews account. The new account is @YourDataMatters. The public is being advised to follow the new account to keep up with all the latest information about GDPR and its associated educational campaign.

It's Now in Our Hands


We will probably look back on implementation of the GDPR and decide it has been a good thing. Hopefully we will have the same assessment of “Your Data Matters” but, right now, it is in our hands. The government has implemented new regulations designed to protect us and the data trail we leave behind. They have given us the opportunity to educate ourselves about how we can exercise our rights under the GDPR. Now we have to make the most of the opportunities given.

If there is one thing that organisations know, it's the fact that the best protection against data misuse is an army of vigilant consumers who know and exercise their rights. If the GDPR is going to work, it will require all of us to do just that.