Thursday, 9 August 2018

Tokyo Data Centre Fire Kills 5 and Injures Dozens

A fire that filled the Tokyo sky with thick, black smoke in late July 2018 has tragically resulted in five fatalities and dozens of injuries. The blaze extensively damaged a building believed to be a data centre, possibly belonging to Amazon Web Services (AWS).

Fire officials were unable to confirm ownership of the building due to confidentiality restrictions, but numerous Japanese news outlets claim that industry insiders have told them Amazon is the owner. Construction of the still-incomplete building began in 2017, and it was expected to be finished by October 2018.

A Devastating Fire

The fire, which occurred in the Tokyo suburb of Tama, began during the early afternoon hours on 26th July. It is believed that the blaze started in the third of four basement levels. The building has a total of seven floors – four underground and three above.

Reports say that some 300 workers were on site when the fire broke out. Sadly, four bodies were found in the basement and a fifth on the third floor above ground. In addition to the deaths, a total of 50 workers were treated for injuries, and nearly two dozen are said to be in a serious condition.

As fires go, this one was particularly devastating in that it raged for eight hours. Reports say that one-third of the building suffered damage. However, assessments are still ongoing more than a week after the blaze. As for Amazon's ownership of the building, it has still not been confirmed. Amazon has been contacted by both Japanese and American news organisations but has yet to respond.

Fire officials have still not released the exact cause of the blaze pending the outcome of their investigation. However, initial reports suggest that workers cutting steel beams in the third basement level may have ignited urethane insulation materials. One news report out of Tokyo indicated that fire investigators are considering professional negligence among steelworkers as the main cause of the fire.

Amazon in Japan

Speculation about Amazon's ownership of the damaged building is fuelled in part by the success the company has enjoyed in Japan. AWS first entered the Japanese market with a data centre built in 2011. They followed that with a second installation in Singapore. According to The Stack, AWS maintains a concentration of four ‘availability zones’ in the greater Tokyo area, rivalling their operations in northern Virginia.

From a business standpoint, AWS is doing very well in Japan. The number of customers accessing AWS services has increased some 500% over the last five years. Experts attribute the company's success to deals with Sony and a number of big names in the Japanese financial sector.

The fire in Tokyo is truly a tragedy for the dozens of families affected by it. Investigators will hopefully pinpoint the exact cause of the blaze and make recommendations as to how future incidents can be avoided. In the meantime, all eyes are on Amazon to see if they will offer any kind of official response.

Monday, 2 July 2018

AI-managed data centre thermal optimisation – it’s nearer than you think

Despite best efforts, even the best-run data centres still have cooling and thermal management issues. With cooling now representing around 30% of a data centre’s operating cost, it’s more important than ever for organisations to be focused on thermal optimisation.

For true cooling optimisation, however, data centres need to go further and get more granular. When a data room is carefully mapped with appropriate thermal data fields, a whole new level of understanding and cooling efficiency is possible. This inevitably means monitoring and reporting temperature and cooling loads more actively – ideally in real-time.
With ASHRAE now suggesting as many as three temperature sensors per rack, achieving this level of sensing would typically require around ten times more sensors than are deployed in today's data centres, and it is still rare for facilities to sense at this level. That's probably a key reason why, when we analysed over 70 major data centres, we found that 11% of racks weren't actually ASHRAE thermally compliant. That's a problem because, without comprehensive sensing, you simply can't determine which of your business-critical racks are compliant and which aren't.
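As a simple illustration of what rack-level compliance checking involves, the Python sketch below flags racks whose inlet readings fall outside an assumed ASHRAE recommended envelope of 18–27°C; the rack names, sensor layout and thresholds are illustrative assumptions rather than any particular vendor's implementation.

# Illustrative sketch: flag racks whose inlet temperatures fall outside an
# assumed ASHRAE recommended envelope of 18-27 degrees C.
ASHRAE_MIN_C = 18.0
ASHRAE_MAX_C = 27.0

# Three inlet sensors per rack (bottom, middle, top), as suggested above.
rack_readings = {
    "rack-A01": [21.4, 22.1, 23.0],
    "rack-A02": [24.8, 26.5, 28.2],   # top-of-rack sensor running hot
    "rack-B07": [17.2, 18.9, 19.5],   # bottom-of-rack sensor running cold
}

def non_compliant_racks(readings, lo=ASHRAE_MIN_C, hi=ASHRAE_MAX_C):
    """Return racks with at least one inlet reading outside the envelope."""
    return {rack: temps for rack, temps in readings.items()
            if any(t < lo or t > hi for t in temps)}

for rack, temps in non_compliant_racks(rack_readings).items():
    print(f"{rack}: inlet temperatures {temps} outside {ASHRAE_MIN_C}-{ASHRAE_MAX_C}°C")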
Achieving entirely new levels of data centre thermal compliance:
To address this, organisations need to work out how to build a detailed, rack-level map of their data centre estate that displays all of their cooling and thermal performance in real-time. It's only by then combining this kind of granular cooling and thermal data with smart monitoring and analysis software that organisations can start to track their data centre cooling loads in real-time – valuable intelligence that enables informed thermal optimisation decisions.
Achieving this kind of true thermal optimisation requires a proven, safe process based on thousands of real-time sensors and expert spatial models that combine to remove the uncertainty from data centre cooling. Until recently, the market cost of sensors was a barrier; however, the latest Internet of Things (IoT) enabled sensors make this possible for less than 20% of the cost of a traditional cooling unit. For the first time, this level of sensor deployment is accessible.
By combining this kind of sensor installation with the real-time optimisation capabilities of the latest 3D visualisation and monitoring software, you can now not only ensure ASHRAE compliance across your entire data centre estate, but also start to unlock significant data centre cooling energy savings.
Data centres are still spending too much on cooling:
With today's typical cooling unit utilisation rates only averaging 34%, the reality is that organisations are still spending far more than they need to on expensive data centre cooling systems. To address this, data centres need to become much more precise in their operation, and they certainly shouldn't have to apply space, power, cooling and other inputs uniformly across their data rooms.
It's only when data rooms are carefully mapped with all the appropriate data fields that these new levels of understanding and efficiency become possible. To do this properly, we estimate that more than 1,000 sensors are required for the typical data centre, enabling the measurement of a range of previously unknown factors including energy usage, heat outputs and airflow (above and below floors) – exactly the kind of information you'll need to evolve towards the next generation of data centre AI applications.
Delivering true cooling optimisation:
Once this real-time, rack-level data is collected and analysed by a 3D spatial model, specialist software can start to determine the quality of a location, identify what needs to be done to improve that quality and even warn operators of specific areas that are at risk.
Having access to real-time, rack-level data provides exactly the data platform needed for the kind of software-enabled real-time decision-making and scenario planning capabilities that data centres need if they’re to evolve towards true cooling optimisation – effectively removing the uncertainty from data centre cooling and ensuring that all of your racks remain ASHRAE thermally compliant.
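As a rough sketch of what that kind of scoring might look like, the hypothetical example below rates each rack location by its thermal headroom against an assumed 27°C recommended maximum and flags locations whose headroom has shrunk below a warning margin; the thresholds and scoring are illustrative assumptions, not any specific product's model.

# Illustrative only: score rack locations by thermal headroom and warn when a
# location approaches the assumed 27 degree C recommended maximum.
RECOMMENDED_MAX_C = 27.0
WARN_MARGIN_C = 1.5

def location_quality(hottest_inlet_c):
    """Headroom in degrees C between the hottest inlet reading and the limit."""
    return RECOMMENDED_MAX_C - hottest_inlet_c

def at_risk_locations(hottest_inlet_by_rack):
    """Return racks whose thermal headroom has dropped below the warning margin."""
    return [rack for rack, temp in hottest_inlet_by_rack.items()
            if location_quality(temp) < WARN_MARGIN_C]

print(at_risk_locations({"rack-A02": 26.4, "rack-C11": 24.0, "rack-D05": 27.8}))
# ['rack-A02', 'rack-D05']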
A logical next step is to combine real-time sensors with intelligent software to offer in-depth dynamic simulations that can visualise thermal change within data centres.
This is an important step on the journey towards true, AI-managed, precision data centres, providing a foundation for the creation of intelligent feedback loops that can analyse airflow data into ‘Zone of Influence’ modules that, when combined with standard BMS systems, enable automated zone-by-zone data centre cooling. After this comes the addition of true ‘What If?’ scenario analysis, using monitoring data to learn and predict data centre performance.
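To make the feedback-loop idea a little more concrete, here is a deliberately simplified sketch of a zone-by-zone control step that nudges an assumed supply-air setpoint towards a target inlet temperature; the zone structure, gain and setpoint limits are invented for illustration and bear no relation to any real BMS interface.

# Simplified zone-by-zone feedback step: adjust each zone's supply-air setpoint
# towards a target rack inlet temperature, within assumed safe bounds.
TARGET_INLET_C = 24.0
SETPOINT_MIN_C, SETPOINT_MAX_C = 16.0, 22.0
GAIN = 0.5   # degrees of setpoint change per degree of inlet error

def next_setpoint(current_setpoint_c, hottest_inlet_c):
    error = hottest_inlet_c - TARGET_INLET_C        # positive means the zone is too warm
    proposed = current_setpoint_c - GAIN * error    # cool harder when too warm
    return max(SETPOINT_MIN_C, min(SETPOINT_MAX_C, proposed))

# One control step for two hypothetical zones of influence: (setpoint, hottest inlet)
zones = {"zone-1": (19.0, 26.0), "zone-2": (21.0, 22.5)}
for zone, (setpoint, inlet) in zones.items():
    print(zone, "->", round(next_setpoint(setpoint, inlet), 1), "°C")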

This software-driven thermal optimisation approach could provide a platform for the kind of real-time decision-making and scenario planning capabilities that organisations will inevitably require as they transition towards AI-managed thermal optimisation within their data centres.
Guest blog by Dr. Stu Redshaw, Chief Technology Officer, EkkoSense





Thursday, 7 June 2018

'Your Data Matters' and the Era of GDPR


The government has been working with the tech sector over the last two years to prepare for the implementation of the General Data Protection Regulation (GDPR). Now that the regulation is in full force, the government is turning its attention to educating consumers about what it all means; hence the 'Your Data Matters' campaign.

Those familiar with the GDPR know that the new rules are the domain of the Information Commissioner's Office (ICO), a government organisation that played a vital role in developing the rules. Under the GDPR, people have more control over how their personal data is used, stored and shared by organisations they encounter online.

Unfortunately, implementation of the GDPR did not involve a whole lot of effort to educate people about how they can exercise greater control. That's what the “Your Data Matters” campaign is all about. According to the ICO, the campaign is a "collaborative public information campaign" that combines both government and private sector resources.

Understanding Your Rights under the Law


In announcing the “Your Data Matters” campaign, Information Commissioner Elizabeth Denham explained that we all leave behind a digital trail every time we go online. Whether we are transacting business with a private sector company or keeping in touch with family members and friends on social media, the digital trail is lengthened with every online interaction.

"We know that sharing our data safely and efficiently can make our lives easier, but that digital trail is valuable. It’s important that it stays safe and is only used in ways that people would expect and can control," Denham said.

According to an official ICO news bulletin, the government agency is now collaborating with a number of public and private sector organisations to produce educational materials for the “Your Data Matters” campaign. Participating organisations can distribute the materials to their customers.

The ICO has also gone social by launching a new Twitter account to go along with its existing @ICOnews account. The new account is @YourDataMatters. The public is being advised to follow the new account to keep up with all the latest information about GDPR and its associated educational campaign.

It's Now in Our Hands


We will probably look back on implementation of the GDPR and decide it has been a good thing. Hopefully we will have the same assessment of “Your Data Matters” but, right now, it is in our hands. The government has implemented new regulations designed to protect us and the data trail we leave behind. They have given us the opportunity to educate ourselves about how we can exercise our rights under the GDPR. Now we have to make the most of the opportunities given.

If there is one thing that organisations know, it's that the best protection against data misuse is an army of vigilant consumers who know and exercise their rights. If the GDPR is going to work, it will require all of us to do just that.


Thursday, 15 February 2018

Humidity Control + Energy Saving: is there a solution?


ASHRAE has been working for many years on guidelines that allow a wider tolerance for temperature and humidity. Consequently, the need to humidify has decreased, making the value of humidification equipment less significant in the overall HVAC system of the data centre.

However, what if maintaining the humidity also reduced the cooling demand?

One of the most effective solutions involves the use of adiabatic humidifiers: adding moisture to an air stream absorbs heat from the air, increasing humidity and decreasing the temperature for very little energy consumption (c. 1 kW of electrical power for 70 kW of cooling).
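To put that figure into context, a back-of-the-envelope calculation shows how much water has to evaporate to deliver 70 kW of cooling, assuming a latent heat of vaporisation of roughly 2,450 kJ/kg at typical supply-air temperatures:

# Rough check of the evaporative cooling figure quoted above.
latent_heat_kj_per_kg = 2450   # approximate latent heat of vaporisation of water
cooling_kw = 70                # cooling effect quoted for roughly 1 kW of electrical input

water_kg_per_s = cooling_kw / latent_heat_kj_per_kg    # kW = kJ/s
water_litres_per_h = water_kg_per_s * 3600             # 1 kg of water is roughly 1 litre
print(f"~{water_litres_per_h:.0f} litres of water evaporated per hour")   # ~103 litres/hour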

This evaporative cooling is increasingly used in new generation data centres in which the design conditions are close to the limits suggested by ASHRAE: this is made possible by careful design of air flows and good separation between the air entering the racks and the exhaust air (layout with “hot aisles and cold aisles”).

The higher operating temperature and humidity allow the use of outside air for ‘free cooling’ (e.g. when below 25°C), and when the outside air is hotter and drier, evaporative cooling can be adopted, increasing humidity up to 60% and higher while bringing the temperature down to acceptable values, simply through the evaporation of water.
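A much-simplified sketch of that control decision is shown below, choosing between free cooling, evaporative cooling and mechanical cooling based on outside-air conditions; the 25°C threshold comes from the example above, while the humidity limit and mode names are purely illustrative assumptions.

# Simplified cooling-mode selection based on outside-air conditions.
# Thresholds are illustrative; a real controller would use full psychrometrics.
FREE_COOLING_MAX_C = 25.0    # use outside air directly below this temperature
EVAPORATIVE_MAX_RH = 60.0    # above this relative humidity, evaporation adds little cooling

def cooling_mode(outside_temp_c, outside_rh_percent):
    if outside_temp_c < FREE_COOLING_MAX_C:
        return "free cooling (outside air)"
    if outside_rh_percent < EVAPORATIVE_MAX_RH:
        return "evaporative (adiabatic) cooling"
    return "mechanical cooling"

print(cooling_mode(18.0, 45.0))   # free cooling (outside air)
print(cooling_mode(30.0, 30.0))   # evaporative (adiabatic) cooling
print(cooling_mode(32.0, 75.0))   # mechanical cooling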

There are several different adiabatic humidification technologies available, from “wetted media” to washers and spray systems: the principle underlying all of these devices is to maximise the contact surface area between air and water, so as to ensure effective evaporation and perfect absorption in the humidified air stream. The choice of the system depends on numerous factors, ranging from available space to required efficiency and the need for modulation.

In general, the solution needs to be evaluated in terms of TCO (Total Cost of Ownership) throughout the system's working life, also taking into consideration its resilience in terms of continuous operation, as well as water consumption, which in many areas may be a critical factor: indeed, alongside the classic PUE (Power Usage Effectiveness) for energy consumption, many data centres also monitor WUE (Water Usage Effectiveness) for water consumption.
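For readers less familiar with the two metrics, the sketch below shows how PUE and WUE are commonly calculated; the annual figures used are invented purely for illustration.

# Illustrative PUE and WUE calculation with invented annual figures.
it_energy_kwh = 4_000_000          # annual IT equipment energy
total_facility_kwh = 5_200_000     # annual total facility energy (IT + cooling, power losses, etc.)
site_water_litres = 6_000_000      # annual site water consumption

pue = total_facility_kwh / it_energy_kwh    # dimensionless; lower is better
wue = site_water_litres / it_energy_kwh     # litres per kWh of IT energy

print(f"PUE = {pue:.2f}, WUE = {wue:.2f} L/kWh")   # PUE = 1.30, WUE = 1.50 L/kWh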

Recently, atomisation systems have become quite popular; these use a system of nozzles and high pressure pumps to create minute droplets of water, thus ensuring optimum absorption. These systems can be controlled by inverters to modulate atomised water production and respond to different load conditions. Other benefits of these systems include very low air pressure drop, no recirculation (and consequently a high level of hygiene, something that unfortunately is often neglected) and the possibility to use one pumping unit with two separate distribution systems; one for summer (evaporative cooling) and one for humidification in winter, meaning significant flexibility - even with vertical air flows.

The effectiveness of such systems depends significantly on local temperature-humidity conditions and in much of Europe both free cooling and evaporative cooling can be exploited for most of the year, to the extent where some data centres are designed to use mechanical cooling as an emergency backup system only.

Guest blog written by Enrico Boscaro, Group Marketing Manager, and William Littlewood, Business Development Manager, Carel

For more information, please contact william.littlewood@carel.com and enrico.boscaro@carel.com

Brochure available at www.carel.com/application/datacenter  

Wednesday, 3 January 2018

CLOUDCRAFTING: THE ANTHROPOLOGY OF DATA CENTRES

CNet Training recently welcomed Alexander Taylor, an anthropology PhD student from the University of Cambridge, onto its Certified Data Centre Management Professional (CDCMP®) education program. Alex recently researched the practices and discourses of data centres. In this article, he outlines his research in more detail and explains how the education program contributed to his ongoing anthropological exploration of the data centre industry.

Data Centres as Anthropological Field-sites

Traditionally, anthropologists would travel to a faraway land and live among a group of people so as to learn as much about their culture and ways of life as possible. Today, however, we conduct fieldwork with people in our own culture just as much as those from others. As such, I am currently working alongside people from diverse areas of the data centre industry in order to explore how data centre practices and discourses imaginatively intersect with ideas of security, resilience, disaster and the digital future.

Data centres pervade our lives in ways that many of us probably don’t even realise and we rely on them for even the most mundane activities, from supermarket shopping to satellite navigation. These data infrastructures now underpin such an incredible range of activities and utilities across government, business and society that it is important we begin to pay attention to them.

I have therefore spent this past year navigating the linguistic and mechanical wilderness of the data centre industry: its canyons of server cabinet formations, its empty wastelands of white space, its multi-coloured rivers of cables, its valleys of conferences, expos and trade shows, its forests filled with the sound of acronyms and its skies full of twinkling server lights.

While data centres may at first appear without cultural value, just nondescript buildings full of pipes, server cabinets and cooling systems, these buildings are in fact the tips of a vast sociocultural iceberg of ways in which we are imagining and configuring both the present and the future. Beneath their surface, data centres say something important about how we perceive ourselves as a culture at this moment in time and what we think it means to be a ‘digital’ society. Working with data centres, cloud computing companies and industry education specialists such as CNet Training, I am thus approaching data centres as socially expressive artefacts through which cultural consciousness (and unconsciousness) is articulated and communicated.

The Cloud Unclothed

CNet Training recently provided me with something of a backstage pass to the cloud when they allowed me to audit their CDCMP® data centre program. ‘The cloud’, as it is commonly known, is a very misleading metaphor. Its connotations of ethereality and immateriality obscure the physical reality of this infrastructure and seemingly suggest that your data is some sort of evaporation in a weird internet water cycle. The little existing academic research on data centres typically argues that the industry strives for invisibility and uses the cloud metaphor to further obscure the political reality of data storage. My ethnographic experience so far, however, seems to suggest quite the opposite; that the industry is somewhat stuck behind the marketable but misleading cloud metaphor that really only serves to confuse customers.

Consequently, it seems that a big part of many data centres’ marketing strategies is to raise awareness that the cloud is material by rendering data centres more visible. We are thus finding ourselves increasingly inundated with high-res images of data centres displaying how stable and secure they are. Data centres have in fact become something like technophilic spectacles, with websites and e-magazines constantly showcasing flashy images of these technologically-endowed spaces. The growing popularity of data centre photography – a seemingly emerging genre of photography concerned with photographing the furniture of data centres in ways that make it look exhilarating – fuels the fervour and demand for images of techno-spatial excess. Photos of science fictional data centre-scapes now saturate the industry and the internet, from Kubrickian stills of sterile, spaceship-like interiors full of reflective aisles of alienware server cabinets to titillating glamour shots of pre-action mist systems and, of course, the occasional suggestive close-up of a CRAC unit. One image in particular recurs in data centre advertising campaigns and has quickly become what people imagine when they think of a data centre: the image of an empty aisle flanked by futuristic-looking server cabinets bathed in the blue light of coruscating LEDs.

With increased visibility comes public awareness of the physical machinery that powers the cloud mirage. This new-found physicality brings with it the associations of decay, entropy and, most importantly, vulnerability that are endemic to all things physical. As counterintuitive as it may seem, vulnerability is what data centres need so that they may then sell themselves as the safest, most secure and resilient choice for clients.

Some (Loosely Connected) Social Effects of Cloud Culture

The combination of the confusing cloud metaphor with the almost impenetrable, acronym-heavy jargon and the generally inward-looking orientation of the data centre sector effectively black boxes data centres and cloud computing from industry outsiders. This means that the sector has ended up very middle-aged and male-dominated, with a severe lack of young people, despite the fact that it’s one of the fastest growing, most high-tech industries in the UK and is expected to continue to sustain extraordinary growth rates as internet usage booms with the proliferation of Internet-of-Things technologies. This also makes data centres ripe territory for conspiracy theories and media interest, which is another reason why they increasingly render themselves hyper-visible through highly publicised marketing campaigns. You often get the feeling, however, that these visual odes to transparency are in actual fact deployed to obscure something else, like the environmental implications of cloud computing or the fact that your data is stored on some company’s hard drives in a building somewhere you’ll never be able to access.

Furthermore, while cloud computing makes it incredibly easy for businesses to get online and access IT resources that once only larger companies could afford, the less-talked-about inverse effect of this is that the cloud also makes it incredibly difficult for businesses to not use the cloud. Consider, for a moment, the importance of this. In a world of near-compulsory online presence, the widespread availability and accessibility of IT resources makes it more work for businesses to get by without using the cloud. The cloud not only has an incredibly normative presence but comes with a strange kind of (non-weather-related) pressure, a kind of enforced conformity to be online. It wouldn’t be surprising if we begin to see resistance to this, with businesses emerging whose USP is simply that they are not cloud-based or don’t have an online presence.

And the current mass exodus into the cloud has seemingly induced a kind of ‘moral panic’ about our increasing societal dependence upon digital technology and, by extension, the resilience, sustainability and security of digital society and the underlying computer ‘grid’ that supports it. Fear of a potential digital disaster in the cloud-based future is not only reflected by cultural artefacts such as TV shows about global blackouts and books about electromagnetic pulse (EMP), but is also present in a number of practices within the data centre industry, from routine Disaster Recovery plans to the construction of EMP-proof data centres underground for the long-term bunkering of data.

Closing Acknowledgments

With the help of organisations like CNet Training I am thus studying the social and cultural dynamics of data-based digital ‘civilisation’ by analysing the growing importance of data infrastructures. Qualitative anthropological research is participatory in nature and, as such, relies upon the openness of the people, organisations and industries with whom the research is conducted. Every industry has its own vocabularies, culture, practices, structures and spheres of activity and CNet Training’s CDCMP® program acted as a vital window into the complexity of data centre lore. It provided me with a valuable insider’s way to learn the hardcore terms of data centre speak and also with the opportunity to meet people from all levels of the industry, ultimately equipping me with a detailed, in-depth overview of my field-site. Interdisciplinary and inter-industry sharing of information like this, where technical and academically-orientated perspectives and skills meet, helps not only to bridge fragmented education sectors, but to enable rewarding and enriching learning experiences. I would like to sincerely thank the CNet Training team for assisting my research.

Guest blog by Alexander Taylor, PhD Candidate with the Department of Social Anthropology at the University of Cambridge


For further information on CNet’s training programs, please visit www.cnet-training.com, call: +44 (0) 1284 767100 or email info@cnet-training.com

CNet Training

Wednesday, 20 December 2017

HOW CAN MANUFACTURERS BEST PROTECT THEIR POWER SUPPLY?

The uptake of digital technology, the government’s upcoming Industrial Strategy and strong export demand all add up to an expanding manufacturing sector here in the UK. However, this increase in demand will no doubt lead to added pressure on UK power supply, so it becomes more important than ever to have robust power infrastructure in place.
THE IMPACT OF DOWNTIME
Downtime can come at a significant cost for manufacturers, with some statistics showing that just one unplanned event can cost in the region of £1.6 million.
What’s more, the UK is reported as the worst-performing economy in Europe when it comes to productivity, so it is even more critical to keep downtime to a minimum.
At a large-scale manufacturing plant, for example, a power shutdown or breakdown in the supply of monitoring/control information can have a disastrous effect on productivity which ultimately could impact on a business’ bottom line.
Therefore, industrial processes should be fully protected to ensure productivity remains at its best and to reduce the risks and cost implications of machinery failure.
RELIABLE POWER SUPPLY
There are a number of measures that manufacturers can take to ensure continuous power – an uninterruptible power supply (commonly referred to as UPS) being one of them. A UPS device will not only protect against power outages, but also provide instant emergency power should the mains power fail.
The UPS will run for a few vital minutes to allow a safe shutdown, ensuring that all data is backed up and that the generator has fired up properly and is providing power. But when you consider that 45% of blackouts typically occur due to voltage disturbances, the UPS is also a vital piece of equipment for correcting power problems.
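To illustrate what 'a few vital minutes' can mean in practice, here is a very rough autonomy estimate; the battery capacity, load and efficiency figures are invented for illustration, and a real installation should always be sized from the manufacturer's runtime data.

# Rough UPS autonomy estimate. All figures are illustrative assumptions, and the
# simple linear model ignores battery discharge-rate and ageing effects.
usable_battery_wh = 1200      # usable battery energy
inverter_efficiency = 0.92    # assumed inverter efficiency
protected_load_w = 10000      # load the UPS must support

runtime_minutes = usable_battery_wh * inverter_efficiency / protected_load_w * 60
print(f"Estimated autonomy: {runtime_minutes:.1f} minutes")   # roughly 6.6 minutes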
Manufacturing machinery is vulnerable to numerous electrical anomalies – from voltage sags and spikes to harmonic distortion and other interruptions. In this situation, a UPS can really come into its own – not only to protect against power outages, but also to operate as an effective power conditioning unit.
By smoothing out sags, surges and brownouts to provide a clean and stable power supply, the UPS prevents damage to sensitive and expensive equipment.
In the pharmaceutical industry, for example, or in glass and semiconductor manufacturing, a small dip in the voltage while producing a batch of very expensive product can cause an imperfection in the finished item, making it unusable, and could even result in the batch being discarded altogether.
Even in steel or brick production, if there is a micro break in the power that causes the furnace controllers to shut down, the process has to be stopped. The material being processed will be scrapped and the whole process started again, which can take days and be very costly.
The UPS can also be deployed solely as a power conditioner without batteries, which is useful in environments above 40°C, the highest temperature at which a battery can safely be kept.
An example of this is ‘cleaning’ power to prevent light flicker in offices next to heavy industry – cranes moving cargo at docks, for instance. In this situation, a UPS can act as a power conditioner on the power supply to the offices, preventing any flickering.
PROTECTING THE INDUSTRY
As we enter this exciting period of growth and see greater uptake of digital technologies, it is wise for those working in the industrial sector to take a step back and make sure their processes and equipment are as protected as they can be.
Manufacturers can do this by having a solid power protection solution in place in the form of a UPS device. This will not only give you peace of mind if machinery does fail, but will give the added reassurance that instances of downtime will be reduced, paving the way for a stronger manufacturing future.
Guest blog by Leo Craig, general manager of Riello UPS.  For more information, please email l.craig@riello-ups.co.uk or call 0800 269394



Thursday, 14 December 2017

Vertiv Anticipates Advent of Gen 4 Data Centre in Look Ahead to 2018 Trends

The next-generation data centre will exist beyond walls, seamlessly integrating core facilities with a more intelligent, mission-critical edge of network. These Gen 4 data centres are emerging and will become the model for IT networks of the 2020s. The advent of this edge-dependent data centre is one of five 2018 data centre trends identified by a global panel of experts from Vertiv, formerly Emerson Network Power.

“Rising data volumes, fuelled largely by connected devices, have caused businesses to reevaluate their IT infrastructures to meet increasing consumer demands,” said Giordano Albertazzi, president of Vertiv in Europe, Middle East and Africa. “Although there are a number of directions companies can take to support this rise, many IT leaders are opting to move their facilities closer to the end-user – or to the edge. Whatever approach businesses take, speed and consistency of service delivered throughout this phase will become the most attractive offering for consumers.”

Previous Vertiv forecasts identified trends tied to the cloud, integrated systems, infrastructure security and more. Below are five trends expected to impact the data centre ecosystem in 2018:

  1. Emergence of the Gen 4 Data Centre: Whether traditional IT closets or 1,500 square-foot micro-data centres, organisations increasingly are relying on the edge. The Gen 4 data centre holistically and harmoniously integrates edge and core, elevating these new architectures beyond simple distributed networks.

This is happening with innovative architectures delivering near real-time capacity in scalable, economical modules that leverage optimised thermal solutions, high-density power supplies, lithium-ion batteries, and advanced power distribution units. Advanced monitoring and management technologies pull it all together, allowing hundreds or even thousands of distributed IT nodes to operate in concert to reduce latency and up-front costs, increase utilisation rates, remove complexity, and allow organisations to add network-connected IT capacity when and where they need it.

  2. Cloud Providers Go Colo: Cloud adoption is happening so fast that in many cases cloud providers can’t keep up with capacity demands. In reality, some would rather not try. They would prefer to focus on service delivery and other priorities over new data centre builds, and will turn to colocation providers to meet their capacity demands.

With their focus on efficiency and scalability, colos can meet demand quickly while driving costs downward. The proliferation of colocation facilities also allows cloud providers to choose colo partners in locations that match end-user demand, where they can operate as edge facilities. Colos are responding by provisioning portions of their data centres for cloud services or providing entire build-to-suit facilities.

  3. Reconfiguring the Data Centre’s Middle Class: It’s no secret that the greatest areas of growth in the data centre market are in hyperscale facilities – typically cloud or colocation providers – and at the edge of the network. With the growth in colo and cloud resources, traditional data centre operators now have the opportunity to reimagine and reconfigure their facilities and resources that remain critical to local operations.

Organisations with multiple data centres will continue to consolidate their internal IT resources, likely transitioning what they can to the cloud or colos while downsizing and leveraging rapid deployment configurations that can scale quickly. These new facilities will be smaller, but more efficient and secure, with high availability – consistent with the mission-critical nature of the data these organisations seek to protect.

In parts of the world where cloud and colo adoption is slower, hybrid cloud architectures are the expected next step, marrying more secure owned IT resources with a private or public cloud in the interest of lowering costs and managing risk.

  4. High-Density (Finally) Arrives: The data centre community has been predicting a spike in rack power densities for a decade, but those increases have been incremental at best. That’s changing. While densities under 10 kW per rack remain the norm, deployments at 15 kW are not uncommon in hyperscale facilities – and some are inching toward 25 kW.

Why now? The introduction and widespread adoption of hyper-converged computing systems is the chief driver. Colos, of course, put a premium on space in their facilities, and high rack densities can mean higher revenues. And the energy-saving advances in server and chip technologies can only delay the inevitability of high density for so long. There are reasons to believe, however, that a mainstream move toward higher densities may look more like a slow march than a sprint. Significantly higher densities can fundamentally change a data centre’s form factor – from the power infrastructure to the way organisations cool higher density environments (a quick airflow sketch after this list gives a sense of the scale involved). High-density is coming, but likely later in 2018 and beyond.

  5. The World Reacts to the Edge: As more and more businesses shift computing to the edge of their networks, critical evaluation of the facilities housing these edge resources and the security and ownership of the data contained there is needed. This includes the physical and mechanical design, construction and security of edge facilities as well as complicated questions related to data ownership. Governments and regulatory bodies around the world increasingly will be challenged to consider and act on these issues.

Moving data around the world to the cloud or a core facility and back for analysis is too slow and cumbersome, so more and more data clusters and analytical capabilities sit on the edge – an edge that resides in different cities, states or countries than the home business. Who owns that data, and what are they allowed to do with it? Debate is ongoing, but 2018 will see those discussions advance toward action and answers.
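Returning to the density figures mentioned in the fourth trend above, a quick airflow estimate (using standard air properties and an assumed 12°C temperature rise across the rack) gives a feel for why higher densities change cooling design; the figures are illustrative, and real designs should be validated with proper airflow modelling.

# Quick airflow estimate for a given rack power and air temperature rise.
# Assumes standard air properties; illustrative only, not a design calculation.
RHO_AIR = 1.2    # air density, kg/m^3
CP_AIR = 1005    # specific heat of air, J/(kg*K)

def airflow_m3_per_h(rack_power_w, delta_t_k=12.0):
    """Volumetric airflow needed to remove rack_power_w at a delta_t_k temperature rise."""
    return rack_power_w / (RHO_AIR * CP_AIR * delta_t_k) * 3600

for kw in (5, 15, 25):
    print(f"{kw} kW rack -> ~{airflow_m3_per_h(kw * 1000):,.0f} m³/h of air")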

About Vertiv:

Vertiv designs, builds and services critical infrastructure that enables vital applications for data centres, communication networks and commercial and industrial facilities. Formerly Emerson Network Power, Vertiv supports today’s growing mobile and cloud computing markets with a portfolio of power, thermal and infrastructure management solutions including the Chloride®, Liebert®, NetSure™ and Trellis™ brands. Sales in fiscal 2016 were $4.4 billion.

Guest blog by Vertiv.  For more information, please visit VertivCo.com or contact Hannah Sharland on +44 (0) 2380 649832 or email Hannah.Sharland@vertivco.com