Sustainability-in-Tech : Data Centres May Have Their Water Restricted

The recent announcement by Thames Water that it may restrict water flows to London data centres and charge them more at times of peak demand may be the shape of things to come.

Thames Water Ltd – Considering Restrictions 

The announcement follows an investigation last year into how much water was being used by London and M4 corridor datacentres. At the time, Thames Water’s Strategic Development Manager, John Hernon, said that he was looking to work closely “with those consultants planning for new datacentres in the Slough area”, adding, “Our guidance has already resulted in a significant reduction in the amount of water requested by these new centres due to guidance on additional storage and cooling procedures”. 

What’s The Issue? 

The main issue is that datacentres require vast amounts of water for cooling, and the water they currently use is drinking water drawn from supply pipes. This can put pressure on supply and infrastructure during hot weather, when demand for water surges. It should also be noted that datacentres require substantial amounts of electricity.


To address these issues, Thames Water Ltd has announced that it has held discussions with at least one datacentre operator in London about physically restricting water flow at peak times. John Hernon has been reported as saying that this could involve introducing flow restrictions on pipes.

What Are The Alternatives To Physically Restricting Water? 

Although Thames Water has discussed restricting water flow to the datacentres, the company has said it would prefer to take a collaborative approach with datacentres, encouraging them to look at water reusage and recycling options on-site. This could involve reusing final effluent from the company’s sewage treatment works or intercepting surface water before it reaches treatment plants.

Thames Water first raised the issue of datacentres using raw water in August 2022. At the time, a drought had led the company to introduce a hosepipe ban affecting 15 million customers in the southeast (including London and the Thames Valley area), bringing the issue of how much drinking water datacentres were using into sharp focus.

In Other Countries 

Although water is the traditional cooling medium for data centres, other countries and regions use different cooling methods depending on factors like climate, local regulations, and available resources. For example, common alternatives include air cooling and specialised solutions like chilled water systems. Recent experiments in data centre cooling have also included immersing servers in liquid / engineered fluid, i.e. immersion cooling (which Microsoft now uses at its datacentre in Quincy, WA in the US), and underwater datacentres such as Microsoft’s Project Natick. Other recent and innovative data centre cooling ideas have included a decentralised model whereby homeowners are incentivised to have business servers attached to their home water tanks for cooling.


Even though it would make sense for datacentres to get their large water requirements from somewhere other than the drinking water supply, some have criticised Thames Water for scapegoating the datacentre industry when the water company doesn’t appear to be doing much about losing 600 million litres of water a day (nearly a quarter of its daily supplies) through leaks.

What Does This Mean For Your Business? 

Pressures such as climate change, the growth of the digital society, cloud computing, the IoT, and the newer pressure of widescale generative AI use have all fed the demand for more datacentres and exacerbated the cooling challenges they face. Although some innovative alternatives are being tried, datacentres still have huge water and power requirements and, as the Thames Water example shows, the fact that they tap into drinking water can be a big problem in droughts or at times of peak general demand. Businesses require reliable computing and access to their cloud resources, as well as water (and power), illustrating the importance to the wider economy and society of finding a solution that enables data centres to function reliably without negatively impacting other infrastructure, businesses, the economy or, indeed, climate targets.

Using raw water may be one alternative that could help (as could fixing leaks, as some critics argue). Other methods, such as immersion cooling or underwater datacentres, look promising but may take some time to scale up. AI may also help improve the management of datacentre cooling.

There are, in fact, many methods that could ultimately all help tackle the problem, such as optimising airflow management, implementing intelligent monitoring systems, utilising computational fluid dynamics simulations, and exploring innovative architectural designs. All these methods could help by enhancing airflow efficiency, preventing hotspots, improving heat dissipation, proactively adjusting cooling parameters, informing cooling infrastructure design, and dynamically adapting to workload demands to meet the modern cooling challenges faced by data centres.
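To give a flavour of what “intelligent monitoring” and “proactively adjusting cooling parameters” can mean in practice, here is a minimal sketch of a cooling control loop. The temperature limits, setpoint step and function name are illustrative assumptions, not drawn from any particular vendor’s system:

```python
# Sketch of a proactive cooling control loop: read rack inlet air
# temperatures each cycle and nudge the chilled-water setpoint down
# when any rack approaches its limit, or up (to save energy) when
# all racks are running cool. All thresholds are illustrative.

TARGET_MAX_C = 27.0   # upper limit for rack inlet air temperature
SAFE_MIN_C = 22.0     # below this on every rack, cooling can be relaxed
STEP_C = 0.5          # setpoint adjustment per control cycle

def adjust_setpoint(current_setpoint: float, inlet_temps_c: list[float]) -> float:
    """Return the new chilled-water setpoint given rack inlet temperatures."""
    hottest = max(inlet_temps_c)
    if hottest > TARGET_MAX_C:
        return current_setpoint - STEP_C   # cool harder to prevent a hotspot
    if hottest < SAFE_MIN_C:
        return current_setpoint + STEP_C   # relax cooling to save energy
    return current_setpoint                # within band: hold steady

print(adjust_setpoint(18.0, [24.5, 26.0, 27.8]))  # hotspot forming -> 17.5
print(adjust_setpoint(18.0, [20.1, 21.0, 21.9]))  # all racks cool -> 18.5
```

Real systems would of course smooth readings over time and coordinate many cooling units, but the core idea, adjusting cooling in response to monitored conditions rather than running it flat out, is the same.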

For Thames Water, working collaboratively with datacentres rather than simply imposing water restrictions sounds a sensible way forward, because datacentre cooling is a challenge that now affects all of us and needs intelligent solutions that work and can be put into practice soon.

Tech Insight : The Impact of Generative AI On Datacentres

Generative AI tools like ChatGPT, and the rapid, revolutionary growth of AI more broadly, are changing the face of most industries and generating dire warnings about the future, but what about the effects on datacentres?

Data Centres And Their Importance 

Data centres are the specialised facilities that house a large number of computer servers and networking equipment, serving as centralised locations where businesses and organisations store, manage, process, and distribute their digital data. These facilities are designed to provide a secure, controlled environment for storing and managing vast amounts of data in the cloud.

In our digital, cloud-computing business world, data centres therefore play a crucial role in supporting various industries and services that rely on large-scale data processing and storage, and are utilised by organisations ranging from small businesses to multinational corporations, cloud service providers, internet companies, government agencies, and research institutions.

The Impacts of Generative AI

There are a number of ways that generative AI is impacting data centres. These include:

– The need for more data centres. Generative AI applications require significant computational resources, including servers, GPUs, and data storage devices. As the adoption of generative AI grows, data centres will need to invest in and expand their infrastructure to accommodate the increased demand for processing power and storage capacity, and this will change the data centre landscape. For example, greater investment in (and greater numbers of) data centres will be needed. It’s been noted that AI platforms like ChatGPT, with their massive data-crunching requirements, couldn’t continue to operate without using Microsoft’s (soon-to-be-updated) Azure cloud platform. This has led to Microsoft building a new 750K SF hyperscale data centre campus near Quincy, WA, to house three 250K SF server farms on land costing $9.2M. Presumably, with more data centres there will also need to be greater efforts and investment to reduce and offset their carbon consumption.

– Greater power consumption and more cooling needed. Generative AI models are computationally intensive and consume substantial amounts of power. Data centres also have backup power sources to ensure a smooth supply, such as uninterruptible power supplies (UPS) and generators. With more use of generative AI, data centres will need to ensure they have sufficient power supply and cooling infrastructure to support the energy demands of generative AI applications. This could mean that data centres will now need to improve power supplies to cope with the demands of generative AI by conducting power capacity planning, upgrading infrastructure, implementing redundancy and backup systems, optimising power distribution efficiency, integrating renewable energy sources, implementing power monitoring and management systems, and collaborating with power suppliers. These measures could enhance power capacity, reliability, efficiency, and sustainability. More data centres may also need to be built with their own power plants (like Microsoft did in Dublin in 2017).
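The “power capacity planning” mentioned above can be boiled down to a simple headroom check: does projected load still fit on the facility’s supply even with one backup unit out of action? The sketch below illustrates this for an N+1 UPS arrangement; the function name and all figures are illustrative assumptions:

```python
# Sketch of a power capacity planning check: can a facility absorb a
# projected generative-AI load while keeping N+1 UPS redundancy, i.e.
# still carry the full load with one UPS unit failed or in maintenance?
# All figures below are illustrative assumptions.

def has_power_headroom(ups_units: int, ups_kw_each: float,
                       current_load_kw: float, projected_ai_kw: float) -> bool:
    """True if the projected total load fits with one UPS unit failed (N+1)."""
    usable_kw = (ups_units - 1) * ups_kw_each  # capacity surviving one failure
    return current_load_kw + projected_ai_kw <= usable_kw

# Facility with 4 x 500 kW UPS units and a 900 kW current load:
print(has_power_headroom(4, 500.0, 900.0, 400.0))  # fits under N+1 -> True
print(has_power_headroom(4, 500.0, 900.0, 700.0))  # exceeds capacity -> False
```

A real exercise would also model generator capacity, power distribution losses and growth over time, but the same fail-safe arithmetic sits underneath it.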

In terms of the greater need for cooling, i.e. to improve cooling capacity for generative AI in data centres, strategies include optimising airflow management, adopting advanced cooling technologies like liquid cooling, implementing intelligent monitoring systems, utilising computational fluid dynamics simulations, exploring innovative architectural designs, and leveraging AI algorithms for cooling control optimisation. These measures could all enhance airflow efficiency, prevent hotspots, improve heat dissipation, proactively adjust cooling parameters, inform cooling infrastructure design, and dynamically adapt to workload demands to meet the cooling challenges posed by generative AI.

– The need for scalability and flexibility. Generative AI models often require distributed computing and parallel processing to handle the complexity of training and inference tasks. Data centres therefore need to provide scalable and flexible infrastructure that can efficiently handle the workload and accommodate the growth of generative AI applications. Data centres will, therefore, need to be able to support generative AI applications through means such as:

– Virtualisation for dynamic resource allocation.
– High-performance Computing (HPC) clusters for computational power.
– Distributed storage systems for large datasets.
– Enhanced network infrastructure for increased data transfer.
– Edge computing for reduced latency and real-time processing.
– Containerisation platforms for flexible deployment and resource management.
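The dynamic resource allocation in the list above can be sketched as a simple target-utilisation autoscaling rule, similar in spirit to what container orchestration platforms apply. The utilisation target, replica bounds and function name are illustrative assumptions:

```python
import math

# Sketch of a target-utilisation autoscaler: choose a replica count that
# would bring average utilisation per replica back towards a target,
# clamped to sensible bounds. Target and bounds are illustrative.

TARGET_UTIL = 0.6   # desired average utilisation per replica
MIN_REPLICAS = 1
MAX_REPLICAS = 20

def desired_replicas(current_replicas: int, avg_util: float) -> int:
    """Replica count that would bring average utilisation back to target."""
    raw = math.ceil(current_replicas * avg_util / TARGET_UTIL)
    return max(MIN_REPLICAS, min(MAX_REPLICAS, raw))

print(desired_replicas(4, 0.9))   # overloaded -> scale out to 6 replicas
print(desired_replicas(10, 0.2))  # underused -> scale in to 4 replicas
```

The same proportional rule works whether the “replicas” are containers, VMs or GPU workers; production schedulers add cooldowns and rate limits on top to avoid thrashing.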

– Data storage and retrieval. Generative AI models require extensive amounts of training data, which must be stored and accessed efficiently. Data centres now need to be able to optimise their data storage and retrieval systems to handle large datasets and enable high-throughput training of AI models.

– Security and privacy. Generative AI introduces new security and privacy challenges. Data centres must now be able to ensure the protection of sensitive data used in training and inferencing processes. Additionally, they need to address potential vulnerabilities associated with generative AI, such as the generation of realistic but malicious content or the potential for data leakage. Generative AI also poses cybersecurity challenges, as it can be used to create vast quantities of believable phishing emails or generate code with security vulnerabilities. Rather than relying on manual verification alone, increased dependency on skilled workers and smart software is likely to be necessary to address these security risks effectively.

– Customisation and integration. Generative AI models often require customisation and integration into existing workflows and applications. This means that data centres need to provide the necessary tools and support for organisations to effectively integrate generative AI into their systems and leverage its capabilities.

– Skillset requirements. Managing and maintaining generative AI infrastructure requires specialised skills, and data centres will need to invest in training their personnel and/or attracting professionals with expertise in AI technologies to effectively operate and optimise the infrastructure supporting generative AI.

– Optimisation for AI workloads. The rise of generative AI also means that data centres need to find ways to optimise their operations and infrastructure to cater to the specific requirements of AI workloads. This includes considerations for power efficiency, cooling systems, network bandwidth, and storage architectures that are tailored to the demands of generative AI applications.

– Uncertain infrastructure requirements. The power consumption and hardware requirements of increasing generative AI use are yet to be fully understood, which means the impact on software and hardware remains uncertain and the scale of infrastructure needed to support generative AI is still not clear. The implications of this for data centres include, for example:

– The specific power and hardware requirements of generative AI applications are not fully understood, which makes it challenging for data centres to accurately plan and allocate resources.
– The impact of generative AI on software and hardware is still unclear, which makes it difficult for data centres to determine the necessary upgrades or modifications to support these applications.
– Without a clear understanding of the demands of generative AI, data centres cannot accurately estimate the scale of infrastructure required, potentially leading to under-provisioning or over-provisioning of resources.

– The need for flexibility and adaptability. Data centres must now be prepared to adjust their infrastructure dynamically to accommodate the evolving requirements of generative AI applications as more information becomes available.

AI Helping AI 

Ironically, data centres could use AI itself to help optimise their operations and infrastructure. For example, through:

– Predictive maintenance. AI analysing sensor data to detect equipment failures, minimising downtime.

– Energy efficiency. AI optimising power usage, cooling, and workload placement, reducing energy waste.

– Workload optimisation. AI maximising performance by analysing workload patterns and allocating resources efficiently.

– Anomaly detection. AI monitoring system metrics, identifying abnormal patterns and flagging security or performance issues.

– Capacity planning. AI analysing data to predict resource demands, optimising infrastructure expansion.

– Dynamic resource allocation. AI dynamically scaling computing resources, storage, and network capacity based on workload demands.
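As a concrete illustration of the anomaly-detection idea above, a minimal statistical check over a stream of data centre metrics might look like the following. The window of readings, the threshold and the function name are illustrative assumptions (a 2.5-sigma threshold is used because, over a short window, no single outlier can push its z-score much higher):

```python
import statistics

# Sketch of statistical anomaly detection on a data centre metric stream
# (e.g. server power draw or inlet temperature): flag readings that sit
# more than THRESHOLD standard deviations from the window mean.
# Window and threshold are illustrative assumptions.

THRESHOLD = 2.5  # z-score above which a reading is flagged

def find_anomalies(readings: list[float]) -> list[int]:
    """Return indices of readings whose z-score exceeds THRESHOLD."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # perfectly flat signal: nothing to flag
    return [i for i, x in enumerate(readings)
            if abs(x - mean) / stdev > THRESHOLD]

# Steady power draw (kW) with one sudden spike at index 7:
power_kw = [4.9, 5.0, 5.1, 5.0, 4.8, 5.2, 5.0, 9.5, 5.1, 4.9]
print(find_anomalies(power_kw))  # -> [7]
```

Production systems layer learned models and seasonality handling on top, but the underlying pattern, comparing live metrics against an expected baseline, is the same.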

What Does This Mean For Your Business? 

Overall, while generative AI offers opportunities for increased efficiency and productivity for businesses, it also poses several challenges related to infrastructure, trust, security, and compliance. In our digital society and cloud-based business world, data centres play a crucial role in supporting industries, businesses, services and, as such, whole economies, so how quickly and effectively data centres adapt to the challenges posed by AI (or not) could potentially affect all businesses going forward.

As a data centre operator or a business relying on data centres for smooth operations, the impact of generative AI on data centres presents both opportunities and challenges. On the one hand, the increased demand for processing power and storage capacity necessitates investments in infrastructure expansion and upgrades, providing potential business opportunities for data centre operators. It may lead to the establishment of more data centres and the need for greater efforts to reduce their carbon footprint.

However, this growth in generative AI also brings challenges that need to be addressed. Data centres must ensure sufficient power supply and cooling infrastructure to support the energy demands and heat dissipation requirements of generative AI applications. This may require capacity planning, infrastructure upgrades, integration of renewable energy sources, and the adoption of advanced cooling technologies. It also presents huge challenges in terms of trying to provide the necessary capacity in a way that minimises carbon emissions and meets environmental targets.

Additionally, with the rise of generative AI, data centres now need to consider scalability, flexibility, security, and privacy implications. They must provide the necessary infrastructure and tools for businesses to integrate generative AI into their workflows and applications securely. Skillset requirements also come into play, as personnel need to be trained in AI technologies to effectively operate and optimise the data centre infrastructure.

Overall, understanding and addressing the implications of generative AI on data centres is crucial for both data centre operators and businesses relying on these facilities. By adapting to the evolving demands of generative AI and investing in optimised infrastructure and pursuing innovation, data centre operators can provide reliable and efficient services to businesses, ensuring seamless operations and unlocking the potential of generative AI for various industries.