As we come to the end of the first quarter of 2019, we take a look at the data centre trends in store for the rest of the year and how they will affect your IT Asset Disposition (ITAD) strategy.
As you well know, there are few technologies as fast moving as those that power our IT assets. From phones to PCs, from laptops to smart devices, the pressure on businesses to compete by having the latest tools of the trade is ongoing.
Increasingly, our devices pull live data, which is putting huge pressure on the traditional data centre. Where towns were once rated on their road infrastructure or their electricity supply, the ‘quality’ of data supplied by data centres is now a major concern for companies.
Furthermore, not all data is delivered equally. Some data such as records and documents can sit on low-performance servers, whereas GPS, video streaming and devices using Artificial Intelligence (AI) – such as voice recognition – require fast access to data.
Against this backdrop, let’s look at how this year’s trends will influence your ITAD strategy.
The rise of Artificial Intelligence (AI)
Popular culture often portrays AI as humanoid technology which may, at worst, run amok or, at the least, take your job. In reality, AI spans everything from speech recognition and problem solving to perception and the ability to move objects.
Regardless of its purpose, its reliance on data and data processing continues to grow. According to Gartner, global AI-derived business value will reach nearly $3.9 trillion by 2022.
AI is almost a self-fulfilling prophecy: the more data available, the better your AI will be. Serving this market will therefore require considerable – and constant – investment in computing power, along with a realistic ITAD strategy.
Legacy servers will not be able to meet the demands of AI, and will have to be disposed of. With AI, there’s nowhere to hide: if it takes ten minutes for Alexa to answer a question, you know you have a problem.
Breakthroughs in chip cooling
What was once considered the holy grail of chip cooling – the use of liquid – is quickly becoming a viable solution, and a necessary one too.
Last year Google announced that it was shifting to liquid cooling for its latest AI hardware due to the heat generated by its Tensor Processing Units (TPUs).
Liquid cooling does not necessarily mean immersing computer parts in liquid. In Google’s case, liquid coolant is circulated in tubes around the motherboards, which allows specific components to be singled out and cooled, as opposed to cooling entire rooms of hardware with different temperature requirements.
The benefits are clear. You get better server performance, improved efficiency in high-density data centres, and reduced cooling costs. And lastly, your servers do not overheat as the relentless demand for data continues.
The continued demise of the on-premises data centre
According to Gartner, by 2025 80% of companies will have migrated away from on-premises data centres in favour of colocation, hosting and the cloud. For many, the cloud means greater flexibility (the pay-as-you-go model), and you can easily scale upwards or downwards depending on your business needs or cycle.
Crucially, it’s cheaper. Companies no longer have to invest in their own buildings, equipment, staff and energy, while it is in the data centre’s interest to keep clients happy by providing the latest technology to ensure security, compliance and data speeds.
The new infrastructure
Recently, data centres have been snapped up by institutional investors who would typically have invested in traditional infrastructure. For example, Stonepeak Infrastructure Partners – which invests in power, transportation and water – has invested in Cologix, a colocation provider.
Last year EdgeCore announced a $2bn rollout of data centres in six North American markets, backed by the sovereign wealth fund of the government of Singapore.
Big Data is now being backed by big money, and it is a trend set to continue. That investors now place data centres in the same asset class as water or energy shows how serious our data needs have become.
Edge computing and the end of latency
When it comes to coining a phrase, the wordsmith behind ‘edge computing’ got it right. By placing compute and storage systems close to the data sources they serve – end users and their devices – edge computing addresses the very real problem of latency.
Not long ago, only financial firms trading stocks and shares were concerned with latency, which saw servers placed as close to trading floors as possible. When fortunes can hinge on geographical distance, microseconds – even nanoseconds – make a difference.
While the average end user may not be concerned with nanoseconds, they do care that their devices stream smoothly and their data keeps flowing. Edge computing means that smaller data centres have started to appear in towns and cities, improving data performance – a trend which shows no sign of abating.
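To see why physical distance matters at all, here is a back-of-envelope sketch of propagation delay. It assumes signals travel at roughly two-thirds the speed of light in optical fibre; real-world latency is higher once routing, switching and processing are added, so treat the numbers as a best case.

```python
# Back-of-envelope propagation delay over optical fibre.
# Assumption: light travels at ~67% of c in glass fibre; real latency
# is higher due to routing, switching and queuing along the path.

SPEED_OF_LIGHT_KM_S = 299_792  # km/s in a vacuum
FIBRE_FACTOR = 0.67            # approximate refractive-index slowdown

def one_way_delay_ms(distance_km: float) -> float:
    """Best-case one-way propagation delay in milliseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR) * 1000

# A server 1,000 km away adds roughly 5 ms each way before any
# processing; an edge node 10 km away adds a fraction of a millisecond.
for km in (10, 100, 1000):
    print(f"{km:>5} km: {one_way_delay_ms(km):.3f} ms one way")
```

Crude as it is, the sketch shows why moving a data centre from a distant region into the same town as its users shaves whole milliseconds off every round trip – exactly the gain edge computing is after.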
In summary, this year’s trends will require companies to have flexible and realistic ITAD strategies in place which can respond to the intense pressure which data is putting on legacy equipment. Under-performing equipment will have to be removed from service as quickly as possible in order to keep your data centre competitive.
As breakthroughs happen in chip-cooling technology, there will be a race to upgrade equipment to take advantage of the obvious benefits in performance and financial savings. Likewise, your ITAD strategy will have to be agile enough to respond accordingly.
Our obsession with data is ongoing. According to Forbes, 2.5 quintillion bytes of data are created each day at our current pace. And if you are wondering what a quintillion is – this will save you a Google search – it’s a one followed by 18 zeros.
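For a sense of scale, that daily figure can be turned into a per-second rate with a little arithmetic (a back-of-envelope sketch, taking the Forbes figure at face value):

```python
# Convert "2.5 quintillion bytes per day" into a per-second rate.
# A quintillion is 10**18, so the daily total is 2.5 exabytes.

BYTES_PER_DAY = 2.5 * 10**18
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

bytes_per_second = BYTES_PER_DAY / SECONDS_PER_DAY
terabytes_per_second = bytes_per_second / 10**12

print(f"{terabytes_per_second:,.0f} TB of new data every second")
```

That works out at roughly 29 terabytes of new data every second – which goes some way to explaining the pressure on data centres described above.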