Hyperscale: The Future of Data Centres?
Modern data centres are evolving from the cumbersome, inefficient data storage facilities of the past into new and innovative solutions as we enter the era of Big Data.
In the data centre industry we all know that change is a big part of the package – with all its joys and frustrations. What we consider to be the future of data centres now will be as different in 10 years' time as data centres were to us 10 years ago – such is the evolutionary nature of new technologies and demands on data.
Even though they represent less than 10% of all data centres currently in operation, hyperscale facilities are now attracting the lion’s share of investors’ time and money. It is also where we’re seeing the most exciting innovations. When you think of Industry 4.0 (the fourth Industrial Revolution, focusing on interconnectivity, machine learning and real-time data) – it’s undoubtedly hyperscaling that forms the backbone of this new era.
And this trend is evident when you look at the numbers. In 2020 there were more than 540 hyperscale data centres – at the end of 2021 there were 728, with numbers expected to total over 1,000 by the end of 2024.
Both traditional and hyperscale data centre owners need to ensure they can embrace the impact of big change, overcome the challenges and eliminate the uncertainties that inevitably follow. Data accuracy and a reliable asset management system are key. So how do we see a new era of data centres playing out?
Out of the Pandemic and Into the Dawn of Big Data
The post-pandemic period has been a major watershed for spending on data infrastructure. A seismic shift in working patterns and an ever-increasing demand for data has meant that organisations have accelerated the shift from private server solutions to cloud computing.
Worldwide demand for data is escalating like never before. The growth of 5G, public cloud storage and an uptick in the use of AI in business mean that data companies of every kind are facing a surge in demand.
With the average internet user spending 40% of their day online, there is now a mind-boggling volume of data being consumed. Such is the hunger for data that reports predict global data usage will exceed 175 zettabytes by 2025 – a number that almost defies logic (and let's not start on yottabyte potential).
There is no avoiding the skyrocketing demand for everything from digital infrastructure to streaming services, and nothing to suggest that these trends won't continue to build. It's vital that future data centres remain flexible and agile, maintaining the ability to scale operations as needed while future-proofing the management of data and assets with reliable, accurate and intelligent solutions.
As a reaction to the era of Big Data, even the physical design of data centres is transforming into something more flexible and modular to accommodate various levels of complexity – and to future-proof capabilities. Data companies need to provide a simple and safe IT service, bringing together an interconnected global footprint with seamless and reliable networks.
It goes without saying that corporate accountability cannot be forgotten in the midst of this epic growth. As demand rises rapidly, so does the need for sustainable data centres. Data centres need to keep pace with these data demands, while also considering the environmental impacts of their existence – taking into account everything from carbon emissions to water usage.
So is Hyperscaling the Answer?
For the uninitiated, a quick overview of hyperscaling:
In hyperscale computing, servers are simply networked together using only a few basic conventions. The end result? Easier communication between servers, resulting in more capacity to process data. Hyperscale facilities have the ability to scale appropriately as demand increases on the system, switching servers on and off continuously as load dictates.
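As a rough illustration of the load-driven scaling described above, the core idea boils down to deciding how many servers to keep switched on for the current demand. This is a toy sketch, not any real provider's autoscaler – the function name, capacity figure and utilisation target are all invented for the example:

```python
import math

def servers_needed(load: float,
                   capacity_per_server: float = 100.0,
                   target_utilisation: float = 0.7,
                   min_servers: int = 1) -> int:
    """Return how many servers to keep switched on so that each one
    runs near the target utilisation. All figures are illustrative."""
    if load <= 0:
        return min_servers
    needed = math.ceil(load / (capacity_per_server * target_utilisation))
    return max(min_servers, needed)
```

As load rises the facility powers on more servers, and as it falls servers are switched off again – the elasticity that hyperscale designs are built around.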
Hyperscale data centres support robust, business-critical applications and are employed by some of the biggest data-producing companies such as (surprise, surprise) Amazon, Facebook and Google. Companies such as these employing hyperscale strategies need substantial investment to achieve the proper infrastructure build, and hyperscale data centre architects are seeing increasing demands on their skills.
Hyperscale data centres are, as their name suggests, BIG. They benefit from both the advantages of bespoke engineering and economies of scale, and are therefore able to outperform regular enterprise data centres. The volume of data, storage capacity and computing power a hyperscale centre can process is also significantly higher than an enterprise data centre.
So, are hyperscale data centres the crucial next piece of the puzzle for a new era of IT operations?
- Hyperscale data centres have a huge advantage over traditional architecture by nature of being extraordinarily agile. They have the ability to scale by adding more machines, or when needed, the power to expand out to an edge network.
- Hyperscale facilities are able to maximise cooling efficiency, lowering costs and empowering data centre managers to maintain and optimise temperatures throughout the centre.
- AI capabilities and remote server maintenance mean fewer staff are needed for day-to-day operations. More advanced technologies mean fewer tech experts are needed to efficiently manage the facility – processes can be automated or regulated remotely. (Read our blog on AI in Data Centres – The Rise of Smart Assets for more on this subject)
- Servers are worked more evenly – workloads are balanced and distributed across servers, avoiding overheating risks and reducing heat production.
- Flexible data facilities. If you are piggybacking on a larger organisation’s hyperscale facility (such as Amazon’s AWS), you’ll only pay for the resources you use. Additionally, most providers don’t require long-term contracts.
- Sheer size! Hyperscale data centres typically take up around 10,000 square feet and utilise over 5,000 servers. A smaller business (than, say, Google!) is unlikely to need that number of servers, or to have the physical space to stage such a build for themselves. Instead, paying for space in a hyperscale facility would be the way forward.
- Although hyperscale data centres ARE more efficient and emission-conscious than traditional data facilities – they are undeniably just massive… and anything big is going to use a lot of resources. The size of the physical space and the computing power within place immense demands on power for operating, processing and cooling, despite lower PUE (Power Usage Effectiveness) advantages.
- Location, location, location! One of the most important considerations for a hyperscale data centre is where it is sited. Again, size plays a massive role here when finding a suitable position that allows the facility to provide an appropriate level of service. Rural areas may be less expensive, but run the risk of creating noticeable processing delays and issues around power supply. Facilities closer to urban areas face planning issues and increased real estate costs.
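For readers unfamiliar with PUE: it is simply total facility power divided by the power delivered to the IT equipment, so a value of 1.0 would mean every watt goes to computing. A quick sketch, using entirely made-up illustrative figures:

```python
def pue(total_facility_power_kw: float, it_power_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power."""
    return total_facility_power_kw / it_power_kw

# Invented example figures: 1,500 kW drawn by the facility overall,
# 1,200 kW of it reaching the servers themselves.
print(pue(1500, 1200))  # 1.25 – the rest goes to cooling, lighting and losses
```

The closer that ratio sits to 1.0, the less power the facility spends on anything other than computing – which is why lower PUE is treated as a hyperscale advantage.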
It’s undeniable that everything – from power supplies to cooling processes – is changing in the face of data evolutions. Data centres need to be more reactive, more embracing of new trends (such as the public cloud) and more self-aware in the face of the crucial need for more sustainable practices across the globe.
Hyperscale data centres are certainly the sirens of the IT seas. However, there are still challenges to overcome. Companies are finding it tough to effectively and accurately manage real-time records and maintain full control of their assets. Intelligent DCIM software, like Spire™ or our 3DCIM solution, which can monitor, measure and provide accurate, real-time results across the entire infrastructure, is invaluable.
Whether or not hyperscale data centres are the right fit for your organisation, the learnings that come from the evolution of technology, scaling and future proofing will have ramifications for our industry for years to come – until the next era of data, that is. Domegemegrottebyte, anyone?