The world is increasingly dependent on critical data infrastructure, driving demand for more data centre capacity as well as faster networks to move data around. Traffic volumes are forecast to reach roughly 25 zettabytes by 2025, four times the levels of 2016. More data means higher demand for electricity to power and cool the infrastructure that manages the flow of information. Indeed, forecasts indicate that within five years, one fifth of the world’s electricity could be consumed just in powering our data centres. So what does this mean for environmental impact, given that data centres and associated connectivity already account for around 3% of all carbon emissions? New approaches are needed, and they are needed fast.
For a start, it’s time to rethink where we locate infrastructure, argues Mattias Fridstrom, vice president and chief evangelist at Telia Carrier.
“We saw an opportunity up here in the Nordics,” he says. “We know that data centres need cold air. They need renewable power – from hydro, sun, wind. We’ve got all that, and we also have no earthquakes and a fairly stable political climate. We saw Sweden, Denmark and Norway all remove tax on data centres to get them to locate up here. That’s probably the most positive impact that carriers and data centres can have – moving facilities to where they should really be. We’re seeing a movement towards the north of Europe by people like Google, Microsoft and Amazon.”
Network operators, he admits, have their part to play: “In the carrier world we need to make sure our networks are going to these less convenient places,” says Fridstrom. “You can’t put a data centre where there’s no connectivity.”
How data centres are built matters too. Alessandro Bruschini is infrastructure manager with Italian data centre operator Aruba S.p.A., whose most recently opened facility, IT3, runs entirely on green energy produced on-site.
“As around 40% of the total energy data centres consume goes into cooling IT equipment, alternative modes of cooling, such as natural cold-water cooling, can help them reduce one of their main energy expenses,” he claims. “Through on-site power generation, such as photovoltaic or hydraulic power, data centres should be able to reduce not only their carbon footprint but also their energy bill.”
Gavin Murray, regional director of data centre engineering and operations, EMEA at data centre operator Rackspace, agrees that newer cooling technologies will have a radical impact: “One approach is liquid hardware cooling, which involves chilled water entering the cabinets in the data centre cooling design,” he says. “This reduces the distance between the cooling system and the data, decreasing the need for data centre air conditioning and making the cooling process more energy efficient. Another approach is outside air-cooling technology. This highly efficient method uses outdoor air to cool data centres, considerably reducing the total energy consumption, and it has the potential to provide a Power Usage Effectiveness (PUE) of 1.15, compared to the average data centre rating of 1.7.”
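To put those PUE figures in perspective, PUE is the ratio of total facility energy to the energy consumed by the IT equipment itself, so a lower value means less overhead spent on cooling and power distribution. The short Python sketch below works through the comparison; the 1,000 kWh IT load is an illustrative figure, not one quoted by Rackspace.

```python
def facility_energy(it_energy_kwh: float, pue: float) -> float:
    """Total facility energy implied by a given PUE.

    PUE is total facility energy divided by IT equipment energy,
    so total facility energy = IT energy * PUE.
    """
    return it_energy_kwh * pue


# Illustrative comparison for 1,000 kWh of IT load, using the PUE figures
# quoted above: 1.7 for the average data centre, 1.15 for outside air cooling.
it_load_kwh = 1_000.0
average = facility_energy(it_load_kwh, 1.7)      # 1,700 kWh
air_cooled = facility_energy(it_load_kwh, 1.15)  # 1,150 kWh

saving = average - air_cooled
print(f"Energy saved: {saving:.0f} kWh ({saving / average:.0%} of the total)")
# Energy saved: 550 kWh (32% of the total)
```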
Ian Whitfield is CEO of ENGIE Impact, a specialist in helping companies with sustainability transformation. He believes that data centre operators have a growing range of tech advances at their disposal: “Newer IT equipment, for example, has a higher operating temperature, so less energy is required to mechanically cool it,” he says. “Google has dropped its energy usage by 50% since 2014 through advanced temperature management practices and, while the average data centre provider can’t invest to the extent Google has, it sets a great industry standard to work towards.”
Less power-hungry computing is clearly a piece of the jigsaw. For Kevin Deierling, vice president of marketing with Mellanox, a vendor of Ethernet and InfiniBand intelligent interconnect products, the solution lies in deploying better technology inside the data centre: “One key way to combat global warming is by making data centres more efficient, because if you can use fewer computers to do more work, then the carbon impact will be smaller,” he explains. “The network that connects all the computers together in the data centre plays a large role in the efficiency of the data centre. Using high performance, high bandwidth interconnect actually makes the data centre compute more efficient. After all, from a data centre energy consumption perspective the best computer is the one that doesn’t need to be installed, because the other computers are running more efficiently.”
Eltjo Hofstee, managing director with hosting company Leaseweb UK, agrees that better technology generally makes for a better environment: “A solution to achieving sustainability is virtualisation,” he suggests. “This enables IT teams to fully utilise the capacity of a physical server, which means in most cases you can run the same environment on fewer physical servers without a decrease in performance. Newer servers also use less energy, which is why Leaseweb is investing significantly in newer models as a way to reduce overall energy consumption.”
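The arithmetic behind that consolidation argument is straightforward. The sketch below estimates how many virtualised hosts could replace a fleet of lightly loaded physical servers, and how much rated power drops out as a result; the utilisation and wattage figures are illustrative assumptions, not Leaseweb’s numbers.

```python
import math


def consolidation_estimate(workloads: int,
                           avg_utilisation: float,
                           target_utilisation: float,
                           watts_per_server: float) -> tuple[int, float]:
    """Rough estimate of how many virtualised hosts replace a fleet of
    lightly loaded physical servers, and the rated power removed.

    All inputs are illustrative assumptions, not Leaseweb figures.
    """
    hosts_needed = math.ceil(workloads * avg_utilisation / target_utilisation)
    watts_saved = (workloads - hosts_needed) * watts_per_server
    return hosts_needed, watts_saved


# e.g. 100 physical servers averaging 15% utilisation, consolidated onto
# hosts run at 60% utilisation, each server rated at 350 W.
hosts, watts = consolidation_estimate(100, 0.15, 0.60, 350.0)
print(f"{hosts} hosts needed, {watts:,.0f} W of rated load removed")
# 25 hosts needed, 26,250 W of rated load removed
```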
It is an inconvenient truth, however, that some technology designed to promote efficiency is counterintuitively accompanied by a higher carbon overhead. Tate Cantrell, chief technology officer with Icelandic data centre operator Verne Global, points out that companies are increasingly turning to AI and machine learning applications to drive innovation: “Behind the scenes, this is ramping up demand for cloud optimised data centre facilities,” he fears. “Yet the environmental impact of this trend is all too often left out of the picture. The machine learning training processes behind AI applications require an enormous amount of energy in order to function, and if the power-hungry machine learning applications are housed in fossil-fuel-powered facilities that do not take a forward-thinking approach to environmental sustainability, a company’s green credentials are quickly voided.”
Goonhilly Earth Station in Cornwall has a liquid immersion cooling system to mitigate the power demands of high performance computing (HPC), and an onsite array of solar panels that can support the data centre’s full power requirement of 500 kW, with local wind power to be added to the mix shortly.
Chris Roberts, head of data centre and cloud, says the use case for HPC, AI and machine learning is often altruistic, including researching ways of reducing the impact of climate change and improving crop yield: “The irony is that it takes high performance computing with a high carbon footprint to undertake this intense processing,” he says. “While some organisations pay lip service to the issue with carbon offsetting, this is not enough. Fortunately new ways of reducing carbon emissions are emerging.”
Owen Pettiford, data expert at data migration specialist Syniti, believes that one such way lies in simply managing data better: “It is clear that data processing needs optimising,” he says. “Computing power is needed to collect, prepare, analyse and store data, so when data comes in large volumes and is poorly managed, it is extremely costly for the environment. Care must be taken to ensure data is of pristine quality in the first instance. Businesses should formulate a solid data governance strategy, conduct an audit into the relevance of their data and, ideally, employ data advocates to ensure data cleansing stays at the top of employees’ agendas.”
Something has to give, simply because customers want it that way. Tesh Durvasula, president, Europe, with CyrusOne, a real estate trust that invests in carrier-neutral data centres, says customers are putting a lot of pressure on data centre providers to be more efficient and provide services that are sustainable and cost-effective in the long run: “Even as we work to get power efficiency under control, the data centre industry must continue to develop initiatives and technologies to reduce our impact on local communities,” he says.

The power industry has its part to play too. Vincent de Rul, director of energy solutions at EDF Energy, says data centre managers should look at alternative ways to reduce their energy consumption by understanding exactly when and where energy is being used: “By using real-time monitoring to understand live energy usage, managers can identify simple steps which can help them make significant energy savings, in turn reducing their carbon emissions,” he says.
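As a rough illustration of how a monitored reduction in consumption translates into money and carbon, the sketch below applies an electricity tariff and grid carbon intensity to an annual saving; both factors, and the 200,000 kWh figure, are illustrative assumptions, not EDF Energy’s numbers.

```python
def annual_savings(kwh_saved: float,
                   price_per_kwh: float = 0.15,
                   kg_co2_per_kwh: float = 0.23) -> tuple[float, float]:
    """Convert an annual energy saving into cost and CO2 savings.

    The tariff (GBP per kWh) and grid carbon intensity (kg CO2 per kWh)
    are illustrative assumptions, not EDF Energy figures.
    """
    return kwh_saved * price_per_kwh, kwh_saved * kg_co2_per_kwh


# e.g. a facility that trims 200,000 kWh a year after fixing idle loads
cost_gbp, co2_kg = annual_savings(200_000)
print(f"£{cost_gbp:,.0f} and {co2_kg / 1000:,.1f} tonnes of CO2 per year")
# £30,000 and 46.0 tonnes of CO2 per year
```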
In a recent study analysing over 4,000 businesses, he says, EDF Energy found that these firms could unlock over £45 million in savings per year by making simple energy changes: “But there are not only monetary savings to be had,” he concludes. “These energy savings would also have a significant environmental impact, and could save over 147,671 tonnes of CO2 per year. To put this into context, it’s equivalent to the amount of CO2 offset by more than 3.6 million square metres of woodland, or the environmental cost of 80,256 flights between London and Sydney.”

An equally long distance lies ahead for the IT and connectivity sectors if they are to pull their weight ecologically. But there is clearly scant time for that ground to be covered.