The number of connected devices in use today is staggering. Around a third of the UK’s population – almost 20 million people – claim to use at least five connected devices each, which equates to around 100 million devices in the UK alone. Extrapolate this across a globe of more than 7.6 billion people, whose devices are generally switched on – sensing, analysing and transmitting data 24/7, all year round. Keeping these connected devices running requires billions of machines which, in constant communication with each other, put extraordinary pressure on any network.
A combination of advances in networking technology, such as the mainstream adoption of 5G over the next five to 10 years, and analyst predictions that the market for connected devices could reach $7 trillion by 2020, is only set to increase this pressure – particularly on data centres, which will need to adapt to handle the additional demand. Machines can process information almost the moment it is received, especially in data centres, where decisions are made instantaneously. In the past, data centres largely served as storage units for data; now, however, they compute, analyse and process information, and must do so in real time. Given these requirements, it’s perhaps unsurprising that IDC recently predicted 2018 would see the ‘modernisation’ of data centres, with ‘heavy use’ made of ‘predictive analytics to increase accuracy and reduce downtime’.
Wireless networks and wired assets
With test rollouts set to take place this year, 5G is definitely on its way, and when it arrives it’s set to change everything; network operators and data centre managers alike should start preparing now. The sheer volume of connected devices communicating with each other and with their users will require a much greater deployment of fibre, for example, although before we reach this point a lot of work needs to be carried out behind the scenes.
Although 5G is a wireless technology, a lot of ‘wired’ assets are needed to effectively deliver fibre backhaul to the core and the edge. Enabling 5G will require the densification of existing cell sites and the addition of new ones to increase the amount of available capacity. A range of different powering solutions will come to market, each offering operators a cost-efficient means of powering the many devices residing at the network’s edge.
The deployment of vast amounts of fibre may only be a best-case solution, however, and it may not always be the most feasible option. Deploying high-density fibre right from the start would be the most efficient means of enabling fast machine-to-machine communication, and this investment could then be future-proofed through the use of a modular, high-speed platform capable of supporting multiple generations of equipment.
Acknowledging concerns
There’s no avoiding the fact that, with the number of connected devices continuing to grow, and with it the volume of machine-to-machine communication, problems will arise. After all, machines are only ever as good as the programming and algorithms that control them, and this will always leave them vulnerable to manipulation by humans and, in some cases, by other machines. These vulnerabilities have led to increasing anxiety around data privacy. It’s fair to say that, with more devices in the world than people, we face a growing risk of falling victim to hackers and data thieves.
What’s more, with rising concerns that machines may take over jobs previously carried out by humans, there will need to be a change in mindset before this explosion in machine-to-machine technology is wholly accepted.
These concerns aside, however, network operators and data centre managers should be taking steps now to prepare the ground for the inevitable rise of the machines in an increasingly connected world.