Friday, 16 November 2018


The draft withdrawal agreement on Brexit - set out earlier this week (14 November 2018) by UK Prime Minister Theresa May - has been approved by the UK cabinet and now awaits the approval of MPs, followed by that of the other 27 European Union member states.

However, with still no trade deal in place, the strength of the UK economy is under serious threat. Businesses must remain vigilant when it comes to capital investment, and pursuing long-term strategies in this extremely volatile market is becoming untenable.

Unstable stock markets, a global economic slowdown and predictions of a dramatic drop in the pound are making it difficult to establish true investment values, especially in the case of total cost of ownership (TCO). Defining the TCO of a capital investment must take all market and environmental factors into consideration, but with very few reassurances from the government on the energy climate, it is unsurprising that considerable caution is being exercised over any type of investment.

As a prominent and highly influential power protection specialist, Power Control Ltd knows only too well about how the cloudy outlook of the country’s economic future can impact businesses. Commenting on this subject, Power Control’s managing director Martin Hanson said: “Buying behaviours towards UPS investment have changed significantly over recent years. It has become apparent that owner/operators are having to account for more complex physical environments in terms of sophisticated data storage, whilst also considering much longer-term financial impacts of their investments.

“The approach to initial spending has changed. It seems that decision makers are becoming shrewder when it comes to investing and forecasting TCO. Looking at UPS investment in particular, business owners cannot afford to be flippant. The number of power disturbances continues to rise, making mains power sources more volatile. This leads to data loss and can cost companies millions of pounds in lost revenue.

“So not only are there pressures to select the most technologically suitable solutions but the need to make the best long-term commercial decisions is becoming increasingly crucial. Despite the economic pressures, resilience must remain the top priority when it comes to selecting UPS.”

Leading UPS manufacturers have anticipated the need for resilience, greater efficiency and more flexibility and have responded with advanced technologies that achieve the highest criteria levels.

Take solid state UPS, for example: these systems have been the backbone of power protection for many years, and where once their efficiencies were poor, advances in technology mean they now boast ultra-high efficiencies combined with unfailing power protection.

It is the evolution of modular UPS that has muddied the waters further when it comes to power protection selection. In recent years the term 'modular' has been making big waves in the UPS industry, with modular systems offering a flexible and scalable approach to UPS investment.

Modular UPS systems also offer reduced operating costs and easier overall maintenance. Engineering works can be undertaken quickly, which can mean a more reliable power supply.

Additionally, the modular approach offers a smaller footprint, greater flexibility, easy manageability, inherently greater availability, and scalability throughout its operational life.

A glowing outlook for modular UPS so far, but this would not be a fair evaluation without considering resilience, a subject that is very often oversimplified to the detriment of the end user.

Modular UPS systems achieve redundancy through spare modules, so it is important that the system is prudently monitored to ensure spare capacity is available at all times: if every module is needed to carry the load, redundancy is lost and there is no backup capacity left. This simple observation leads many to question how resilient a modular solution can really be, and whether it is worth the risk.
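The arithmetic behind this concern is simple to sketch. The following is a minimal, hypothetical illustration of an N+1 check; the module rating and load figures are invented for the example and do not reflect any particular vendor's equipment or monitoring API:

```python
import math

# Hypothetical sketch of a modular UPS redundancy check. Figures are
# illustrative assumptions, not vendor specifications.

def has_redundancy(load_kw: float, module_kw: float, modules_installed: int,
                   spare_required: int = 1) -> bool:
    """Return True if the load can still be carried with `spare_required`
    modules held in reserve (classic N+X redundancy)."""
    modules_needed = math.ceil(load_kw / module_kw)  # N for this load
    return modules_installed >= modules_needed + spare_required

# Example: 5 x 50 kW modules feeding a 200 kW load -> N+1 still holds.
print(has_redundancy(200, 50, 5))   # True
# If the load grows to 240 kW, all five modules are needed: redundancy lost.
print(has_redundancy(240, 50, 5))   # False
```

The point the sketch makes is the one in the text: redundancy is a property of the load as well as the hardware, so it has to be re-checked whenever the load grows.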

It is important to remember that UPS manufacturers design, develop and manufacture power protection solutions to do exactly that – deliver reliable resilience. Other features such as industry leading efficiency, operational performance and flexibility are all additional benefits that come with investing in leading edge technology.

Specialists in the industry are urging businesses to approach UPS investment judiciously, by looking at the complete power protection landscape, environmental factors and physical infrastructure. This will deliver a solution that is exactly what a business needs not just now but in the future with a clear TCO outlook.

Guest blog written by Rob Mather, Solutions Director, Power Control. 

For more information please visit, email or contact Becky Duffield on +44 (0) 7402 113222

Alternatively please visit for specific product information or email Power Control’s solutions director direct at

Thursday, 9 August 2018

Tokyo Data Centre Fire Kills 5 and Injures Dozens

A fire that filled the Tokyo sky with thick, black smoke late in July 2018 has tragically resulted in five fatalities and dozens of injuries. The blaze extensively damaged a building believed to be a data centre, possibly belonging to Amazon Web Services (AWS).

Fire officials were unable to confirm ownership of the building due to confidentiality restrictions, but numerous Japanese news outlets are claiming they had been told by industry insiders that Amazon is the owner. Construction on the incomplete building began in 2017 and was expected to be finished by October 2018.

A Devastating Fire

The fire, which occurred in the Tokyo suburb of Tama, began during the early afternoon hours on 26th July. It is believed that the blaze started in the third of four basement levels. The building has a total of seven floors – four underground and three above.

Reports say that some 300 workers were on site when the fire broke out. Unfortunately, four bodies were found in the basement and a fifth on the third-above-ground floor. In addition to the deaths, a total of 50 workers were treated for injuries. Nearly two dozen are said to be in serious condition.

As fires go, this one was particularly devastating in that it raged for eight hours. Reports say that one-third of the building suffered damage. However, assessments are still ongoing more than a week after the blaze. As for Amazon's ownership of the building, it has still not been confirmed. Amazon has been contacted by both Japanese and American news organisations but has yet to respond.

Fire officials have still not released the exact cause of the blaze pending the outcome of their investigation. However, initial reports suggest that workers cutting steel beams in the third basement level may have ignited urethane insulation materials. One news report out of Tokyo indicated that fire investigators are considering professional negligence among steelworkers as the main cause of the fire.

Amazon in Japan

Speculation of Amazon's ownership of the damaged building is fuelled in part by the success the company has enjoyed in Japan. AWS first entered the Japanese market with a data centre built in 2011. They followed that with a second installation in Singapore. According to The Stack, AWS maintains a concentration of four ‘availability zones’ in the greater Tokyo area, rivalling their operations in northern Virginia.

From a business standpoint, AWS is doing very well in Japan. The number of customers accessing AWS services has increased some 500% over the last five years. Experts attribute the company's success to deals with Sony and a number of big names in the Japanese financial sector.

The fire in Tokyo is truly a tragedy for the dozens of families affected by it. Investigators will hopefully pinpoint the exact cause of the blaze and make recommendations as to how future incidents can be avoided. In the meantime, all eyes are on Amazon to see if they will offer any kind of official response.


Monday, 2 July 2018

AI-managed data centre thermal optimisation – it’s nearer than you think

Despite best efforts, even the best-run data centres still have cooling and thermal management issues. With cooling now representing around 30% of a data centre’s operating cost, it’s more important than ever for organisations to be focused on thermal optimisation.

For true cooling optimisation, however, data centres need to go further and get more granular. When a data room is carefully mapped with appropriate thermal data fields, a whole new level of understanding and cooling efficiency is possible. This inevitably means monitoring and reporting temperature and cooling loads more actively – ideally in real-time.
With ASHRAE now suggesting as many as three temperature sensors per rack, achieving this level of sensing would typically require around 10x more sensors than are currently deployed in today’s data centres. Unfortunately, it’s still rare for data centres to sense to this level. That’s probably a key reason why, when we analysed over 70 major data centres, we found that 11% of racks weren’t actually ASHRAE thermally compliant. That’s a problem because, without comprehensive sensing, you really can’t determine which of your business-critical racks are compliant and which aren’t.
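As a rough illustration of what rack-level compliance checking involves once that sensor data exists, the sketch below flags racks whose inlet readings fall outside a recommended envelope. The 18-27°C band and the sample readings are assumptions for illustration only; consult the current ASHRAE guidelines for the envelope that applies to your equipment class:

```python
# Illustrative sketch: flag racks whose inlet temperatures fall outside an
# assumed ASHRAE recommended envelope (18-27 C used here for illustration).

def non_compliant_racks(readings: dict[str, list[float]],
                        low: float = 18.0, high: float = 27.0) -> list[str]:
    """readings maps rack id -> inlet temperatures from its sensors
    (e.g. top/middle/bottom, per the three-sensors-per-rack suggestion)."""
    return [rack for rack, temps in readings.items()
            if any(t < low or t > high for t in temps)]

readings = {
    "A01": [21.5, 22.0, 23.1],   # compliant throughout
    "A02": [24.0, 26.5, 28.2],   # hot spot at the top of the rack
    "B01": [17.2, 19.0, 20.4],   # over-cooled at the bottom
}
print(non_compliant_racks(readings))  # ['A02', 'B01']
```

Note that with only one sensor per rack, the 'A02' and 'B01' excursions above could easily go undetected, which is the argument for denser sensing.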
Achieving entirely new levels of data centre thermal compliance:
To address this, organisations need to work out how to build a rack-level detailed map of their data centre estate that displays all their cooling and thermal performance in real-time.  It’s only by then combining this kind of granular cooling and thermal data with smart monitoring and analysis software that organisations can start to track their data centre cooling loads in real-time – a valuable intelligence to enable thermal optimisation decisions to be made.
To achieve this kind of true thermal optimisation requires a proven, safe process based on thousands of real-time sensors and expert spatial models that combine to remove the uncertainty from data centre cooling. Until recently the market cost of sensors was a barrier; however, the latest Internet of Things (IoT) enabled sensors now make this possible for less than 20% of the cost of a single traditional cooling unit. For the first time, this level of sensor deployment is accessible.
By combining this kind of sensor installation with the real-time optimisation capabilities of the latest 3D visualisation and monitoring software, you can now not only ensure ASHRAE compliance across your entire data centre estate, but also start to unlock significant data centre cooling energy savings.
Data centres are still spending too much on cooling:
With today’s typical cooling unit utilisation rates only averaging 34%, the reality is that organisations are still spending far more than they need to on expensive data centre cooling systems. To address this, data centres need to become much more precise in their operation, and they certainly shouldn’t be having to uniformly apply space, power, cooling and other inputs across their data rooms.
It’s only when data rooms are carefully mapped with all the appropriate data fields that these new levels of understanding and efficiency become possible. To do this properly we estimate that more than 1,000 sensors are required for the typical data centre, enabling the measurement of a range of previously unknown factors including energy usage, heat outputs and airflow (above and below floors) – exactly the kind of information you’ll need to evolve towards the next generation of data centre AI applications.
Delivering true cooling optimisation:
Once this real-time, rack level data is collected and analysed by a 3D spatial model, specialist software can start to determine the quality of a location, identify what needs to be done to improve that quality and even to warn operators of specific areas that are at risk.
Having access to real-time, rack-level data provides exactly the data platform needed for the kind of software-enabled real-time decision-making and scenario planning capabilities that data centres need if they’re to evolve towards true cooling optimisation – effectively removing the uncertainty from data centre cooling and ensuring that all of your racks remain ASHRAE thermally compliant.
A logical next step is to combine real-time sensors with intelligent software to offer in-depth dynamic simulations that can visualise thermal change within data centres.
This is an important step on the journey towards true, AI-managed, precision data centres, providing a foundation for the creation of intelligent feedback loops that can analyse airflow data into ‘Zone of Influence’ modules that, when combined with standard BMS systems, enable automated zone-by-zone data centre cooling. After this comes the addition of true ‘What If?’ scenario analysis, using monitoring data to learn and predict data centre performance.

This software-driven thermal optimisation approach could provide a platform for the kind of real-time decision-making and scenario planning capabilities that organisations will inevitably require as they transition towards AI-managed thermal optimisation within their data centres.
Guest blog by Dr. Stu Redshaw, Chief Technology Officer, data centre optimisation specialist EkkoSense

Thursday, 7 June 2018

'Your Data Matters' and the Era of GDPR

The government has been working with the tech sector over the last two years to prepare for the implementation of the General Data Protection Regulation (GDPR). Now that the regulations are in full force, the government is turning its attention to educating consumers about what it all means, starting with the 'Your Data Matters' campaign.

Those familiar with the GDPR know that the new rules are the domain of the Information Commissioner's Office (ICO), a government organisation that played a vital role in developing the rules. Under the GDPR, people have more control over how their personal data is used, stored and shared by organisations they encounter online.

Unfortunately, implementation of the GDPR did not involve a whole lot of effort to educate people about how they can exercise greater control. That's what the “Your Data Matters” campaign is all about. According to the ICO, the campaign is a "collaborative public information campaign" that combines both government and private sector resources.

Understanding Your Rights under the Law

In announcing the “Your Data Matters” campaign, Information Commissioner Elizabeth Denham explained that we all leave behind a digital trail every time we go online. Whether we are transacting business with a private sector company or keeping in touch with family members and friends on social media, the digital trail is lengthened with every online interaction.

"We know that sharing our data safely and efficiently can make our lives easier, but that digital trail is valuable. It’s important that it stays safe and is only used in ways that people would expect and can control," Denham said.

According to an official ICO news bulletin, the government agency is now collaborating with a number of public and private sector organisations to produce educational materials for the “Your Data Matters” campaign. Participating organisations can distribute the materials to their customers.

The ICO has also gone social by launching a new Twitter account to go along with its existing @ICOnews account. The new account is @YourDataMatters. The public is being advised to follow the new account to keep up with all the latest information about GDPR and its associated educational campaign.

It's Now in Our Hands

We will probably look back on implementation of the GDPR and decide it has been a good thing. Hopefully we will have the same assessment of “Your Data Matters” but, right now, it is in our hands. The government has implemented new regulations designed to protect us and the data trail we leave behind. They have given us the opportunity to educate ourselves about how we can exercise our rights under the GDPR. Now we have to make the most of the opportunities given.

If there is one thing that organisations know, it's the fact that the best protection against data misuse is an army of vigilant consumers who know and exercise their rights. If the GDPR is going to work, it will require all of us to do just that.

Thursday, 15 February 2018

Humidity Control + Energy Saving: is there a solution?

ASHRAE has been working for many years on guidelines that allow a wider tolerance for temperature and humidity. Consequently, the need to humidify has decreased, making the value of humidification equipment less significant in the overall HVAC system of the DC.

However, what if maintaining the humidity also reduced the cooling demand?

One of the most effective solutions involves the use of adiabatic humidifiers: adding moisture to an air stream absorbs heat from the air, increasing humidity and decreasing the temperature for very little energy consumption (c. 1 kW of electrical power for 70 kW of cooling).
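That ratio follows from the latent heat of evaporation of water (roughly 2,450 kJ/kg at typical supply-air temperatures): the electricity only drives pumps or fans, while the cooling itself comes from evaporation. A quick back-of-the-envelope sketch, with that approximation as the only assumption:

```python
# Rough arithmetic behind the 1 kW electrical / 70 kW cooling figure:
# adiabatic cooling is delivered by evaporating water, so the water flow
# needed scales with the latent heat of evaporation.

LATENT_HEAT_KJ_PER_KG = 2450  # approximate; varies slightly with temperature

def water_flow_kg_per_h(cooling_kw: float) -> float:
    """Water that must evaporate per hour to deliver `cooling_kw` of cooling."""
    return cooling_kw / LATENT_HEAT_KJ_PER_KG * 3600

print(round(water_flow_kg_per_h(70), 1))  # ~102.9 kg of water per hour
```

So 70 kW of adiabatic cooling corresponds to evaporating roughly 100 litres of water per hour, which is why water consumption (discussed below) matters alongside electrical efficiency.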

This evaporative cooling is increasingly used in new generation data centres in which the design conditions are close to the limits suggested by ASHRAE: this is made possible by careful design of air flows and good separation between the air entering the racks and the exhaust air (layout with “hot aisles and cold aisles”).

The higher operating temperature and humidity allow the use of outside air for ‘free cooling’ (e.g. when below 25°C), and when the outside air is hotter and drier, evaporative cooling can be adopted, increasing humidity up to 60% and higher while bringing the temperature down to acceptable values, simply through the evaporation of water.
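The mode selection described above can be sketched as a simple decision function. This is a deliberately simplified illustration: the 25°C free-cooling limit and 60% relative-humidity ceiling are the indicative values from the text, not figures from any control standard, and a real controller would work from wet-bulb conditions and supply-air setpoints:

```python
# Simplified sketch of the cooling-mode logic described above.
# Thresholds are illustrative assumptions, not standardised setpoints.

def select_cooling_mode(outdoor_temp_c: float, outdoor_rh_pct: float,
                        free_cooling_limit_c: float = 25.0,
                        rh_ceiling_pct: float = 60.0) -> str:
    if outdoor_temp_c <= free_cooling_limit_c:
        return "free-cooling"           # outside air used directly
    if outdoor_rh_pct < rh_ceiling_pct:
        return "evaporative-cooling"    # hot, dry air: evaporate water into it
    return "mechanical-cooling"         # hot and humid: fall back to chillers

print(select_cooling_mode(18.0, 50.0))  # free-cooling
print(select_cooling_mode(30.0, 30.0))  # evaporative-cooling
print(select_cooling_mode(32.0, 75.0))  # mechanical-cooling
```

The third branch captures the point made later in the article: in much of Europe it is reached so rarely that mechanical cooling can be sized as a backup rather than the primary system.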

There are several different adiabatic humidification technologies available, from “wetted media” to washers and spray systems: the principle underlying all of these devices is to maximise the contact surface area between air and water, so as to ensure effective evaporation and perfect absorption in the humidified air stream. The choice of the system depends on numerous factors, ranging from available space to required efficiency and the need for modulation.

In general, the solution needs to be evaluated in terms of TCO (Total Cost of Ownership) throughout the system’s working life, also taking into consideration its resilience in terms of continuous operation as well as water consumption, which in many areas may be a critical factor: indeed, many data centres now monitor WUE (Water Usage Effectiveness) for water consumption alongside the classic PUE (Power Usage Effectiveness) for energy consumption.
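Both metrics are simple ratios against IT energy, which a short sketch makes concrete. The figures below are invented purely for illustration:

```python
# Minimal sketch of the two efficiency metrics mentioned above.
# All figures are made-up illustrative values.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.
    1.0 is the theoretical ideal; the overhead is mostly cooling."""
    return total_facility_kwh / it_kwh

def wue(site_water_litres: float, it_kwh: float) -> float:
    """Water Usage Effectiveness: site water use (litres) / IT energy (kWh)."""
    return site_water_litres / it_kwh

# A hypothetical year: 1.5 GWh facility total, 1.2 GWh of IT load,
# 2.1 million litres of water for adiabatic cooling.
print(round(pue(1_500_000, 1_200_000), 2))  # 1.25
print(round(wue(2_100_000, 1_200_000), 2))  # 1.75
```

Tracking the two together is the point: evaporative cooling improves PUE by cutting mechanical cooling energy, but does so by spending water, which WUE makes visible.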

Recently, atomisation systems have become quite popular; these use a system of nozzles and high pressure pumps to create minute droplets of water, thus ensuring optimum absorption. These systems can be controlled by inverters to modulate atomised water production and respond to different load conditions. Other benefits of these systems include very low air pressure drop, no recirculation (and consequently a high level of hygiene, something that unfortunately is often neglected) and the possibility to use one pumping unit with two separate distribution systems; one for summer (evaporative cooling) and one for humidification in winter, meaning significant flexibility - even with vertical air flows.

The effectiveness of such systems depends significantly on local temperature-humidity conditions and in much of Europe both free cooling and evaporative cooling can be exploited for most of the year, to the extent where some data centres are designed to use mechanical cooling as an emergency backup system only.

Guest blog written by Enrico Boscaro, Group Marketing Manager, and William Littlewood, Business Development Manager, Carel

For more information, please contact and

Brochure available at  

Wednesday, 3 January 2018


CNet Training recently welcomed Alexander Taylor, an anthropology PhD student from the University of Cambridge, onto its Certified Data Centre Management Professional (CDCMP®) education program. Alex recently researched the practices and discourses of data centres. In this article, he outlines his research in more detail and explains how the education program contributed to his ongoing anthropological exploration of the data centre industry.

Data Centres as Anthropological Field-sites

Traditionally, anthropologists would travel to a faraway land and live among a group of people so as to learn as much about their culture and ways of life as possible. Today, however, we conduct fieldwork with people in our own culture just as much as those from others. As such, I am currently working alongside people from diverse areas of the data centre industry in order to explore how data centre practices and discourses imaginatively intersect with ideas of security, resilience, disaster and the digital future.

Data centres pervade our lives in ways that many of us probably don’t even realise and we rely on them for even the most mundane activities, from supermarket shopping to satellite navigation. These data infrastructures now underpin such an incredible range of activities and utilities across government, business and society that it is important we begin to pay attention to them.

I have therefore spent this past year navigating the linguistic and mechanical wilderness of the data centre industry: its canyons of server cabinet formations, its empty wastelands of white space, its multi-coloured rivers of cables, its valleys of conferences, expos and trade shows, its forests filled with the sound of acronyms and its skies full of twinkling server lights.

While data centres may at first appear without cultural value, just nondescript buildings full of pipes, server cabinets and cooling systems, these buildings are in fact the tips of a vast sociocultural iceberg of ways that we are imagining and configuring both the present and the future. Beneath their surface, data centres say something important about how we perceive ourselves as a culture at this moment in time and what we think it means to be a ‘digital’ society. Working with data centres, cloud computing companies and industry education specialists such as CNet Training, I am thus approaching data centres as socially expressive artefacts through which cultural consciousness (and unconsciousness) is articulated and communicated.

The Cloud Unclothed

CNet Training recently provided me with something of a backstage pass to the cloud when they allowed me to audit their CDCMP® data centre program. ‘The cloud’, as it is commonly known, is a very misleading metaphor. Its connotations of ethereality and immateriality obscure the physical reality of this infrastructure and seemingly suggest that your data is some sort of evaporation in a weird internet water cycle. The little existing academic research on data centres typically argues that the industry strives for invisibility and uses the cloud metaphor to further obscure the political reality of data storage. My ethnographic experience so far, however, seems to suggest quite the opposite; that the industry is somewhat stuck behind the marketable but misleading cloud metaphor that really only serves to confuse customers.

Consequently, it seems that a big part of many data centres’ marketing strategies is to raise awareness that the cloud is material by rendering data centres more visible. We are thus finding ourselves increasingly inundated with high-res images of data centres displaying how stable and secure they are. Data centres have in fact become something like technophilic spectacles, with websites and e-magazines constantly showcasing flashy images of these technologically-endowed spaces. The growing popularity of data centre photography – a seemingly emerging genre of photography concerned with photographing the furniture of data centres in ways that make it look exhilarating – fuels the fervour and demand for images of techno-spatial excess. Photos of science fictional data centre-scapes now saturate the industry and the internet, from Kubrickian stills of sterile, spaceship-like interiors full of reflective aisles of alienware server cabinets to titillating glamour shots of pre-action mist systems and, of course, the occasional suggestive close-up of a CRAC unit. One image in particular recurs in data centre advertising campaigns and has quickly become what people imagine when they think of a data centre: the image of an empty aisle flanked by futuristic-looking server cabinets bathed in the blue light of coruscating LEDs.

With increased visibility comes public awareness of the physical machinery that powers the cloud mirage. This new-found physicality brings with it the associations of decay, entropy and, most importantly, vulnerability that are endemic to all things physical. As counterintuitive as it may seem, vulnerability is what data centres need so that they may then sell themselves as the safest, most secure and resilient choice for clients.

Some (Loosely Connected) Social Effects of Cloud Culture

The combination of the confusing cloud metaphor with the almost impenetrable, acronym-heavy jargon and the generally inward-looking orientation of the data centre sector effectively black-boxes data centres and cloud computing from industry outsiders. As a result, the industry has ended up very middle-aged and male-dominated, with a severe lack of young people, despite being one of the fastest growing, most high-tech industries in the UK and expected to sustain extraordinary growth rates as internet usage booms with the proliferation of Internet-of-Things technologies. This also makes data centres ripe territory for conspiracy theories and media interest, which is another reason why they increasingly render themselves hyper-visible through highly publicised marketing campaigns. You often get the feeling, however, that these visual odes to transparency are in fact deployed to obscure something else, like the environmental implications of cloud computing or the fact that your data is stored on some company’s hard drives in a building somewhere you’ll never be able to access.

Furthermore, while cloud computing makes it incredibly easy for businesses to get online and access IT resources that once only larger companies could afford, the less-talked-about inverse effect of this is that the cloud also makes it incredibly difficult for businesses to not use the cloud. Consider, for a moment, the importance of this. In a world of near-compulsory online presence, the widespread availability and accessibility of IT resources makes it more work for businesses to get by without using the cloud. The cloud not only has an incredibly normative presence but comes with a strange kind of (non-weather-related) pressure, a kind of enforced conformity to be online. It wouldn’t be surprising if we begin to see resistance to this, with businesses emerging whose USP is simply that they are not cloud-based or don’t have an online presence.

And the current mass exodus into the cloud has seemingly induced a kind of ‘moral panic’ about our increasing societal dependence upon digital technology and, by extension, the resilience, sustainability and security of digital society and the underlying computer ‘grid’ that supports it. Fear of a potential digital disaster in the cloud-based future is not only reflected by cultural artifacts such as TV shows about global blackouts and books about electromagnetic pulse (EMP), but is also present in a number of practices within the data centre industry, from routine Disaster Recovery plans to the construction of EMP-proof data centres underground for the long-term bunkering of data.

Closing Acknowledgments

With the help of organisations like CNet Training I am thus studying the social and cultural dynamics of data-based digital ‘civilisation’ by analysing the growing importance of data infrastructures. Qualitative anthropological research is participatory in nature and, as such, relies upon the openness of the people, organisations and industries with whom the research is conducted. Every industry has its own vocabularies, culture, practices, structures and spheres of activity and CNet Training’s CDCMP® program acted as a vital window into the complexity of data centre lore. It provided me with a valuable insider’s way to learn the hardcore terms of data centre speak and also with the opportunity to meet people from all levels of the industry, ultimately equipping me with a detailed, in-depth overview of my field-site. Interdisciplinary and inter-industry sharing of information like this, where technical and academically-orientated perspectives and skills meet, helps not only to bridge fragmented education sectors, but to enable rewarding and enriching learning experiences. I would like to sincerely thank the CNet Training team for assisting my research.

Guest blog by Alexander Taylor, PhD Candidate with the Department of Social Anthropology at the University of Cambridge

For further information on CNet’s training programs, please visit, call: +44 (0) 1284 767100 or email

CNet Training