There are still many unanswered questions about 5G, but it is clear that it will leave a lasting mark on the world. With a transmission speed of up to 10 gigabits per second, it is roughly 100 times faster than 4G networks, and it also brings greater bandwidth and latency of under 1 millisecond. The impact on mobile Internet, driverless cars, drones, smart homes, smart cities, smart grids, and many other technologies will be profound. But what about data centers, which have been around for years? How will these essential facilities change once 5G is widely adopted? First, we need to understand the limitations of current infrastructure and what they mean for the rollout of 5G.

What's holding back the rollout of 5G?

Fault tolerance is one reason 5G seems to be at a standstill for now. Systems such as smart P2P grids, telemedicine, and remote robotic surgery have stringent fault-tolerance requirements: surgeons have already performed operations from 30 miles outside the operating room using robotic arms over a 5G network. Needless to say, such systems cannot fail mid-use, and the same is true of communication between autonomous vehicles and smart city infrastructure.
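To see why connecting to multiple base stations at once helps meet these fault-tolerance requirements, consider a toy reliability model (my illustration, not from the article): if a device holds several independent links to nearby base stations, the probability that all of them drop at the same moment shrinks exponentially with the number of links. The 1% per-link failure probability below is an arbitrary assumption.

```python
# Toy model: availability of a device holding n independent radio links,
# each of which fails with probability p. All links must fail at once
# for the device to lose connectivity.

def availability(p_link_failure: float, n_links: int) -> float:
    """Probability that at least one of n independent links is up."""
    return 1 - p_link_failure ** n_links

for n in (1, 2, 3):
    print(f"{n} link(s): {availability(0.01, n):.6f} available")
```

With an assumed 1% failure rate per link, a single link gives 99% availability, while three simultaneous links give 99.9999% — the intuition behind connecting to many base stations at once.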
4G is relatively "primitive" in principle: a device connects to one piece of infrastructure at a time, such as a cell tower, then hands off to the next tower, and so on. With 5G, our devices must communicate with multiple base stations and other infrastructure simultaneously if they are to achieve "zero fault tolerance" — an approach also called "spherical coverage". Many of the splashy 5G demonstrations to date have involved connections between handheld or IoT devices and local routers or base stations. But much of the rest of the Internet's back end — our servers and data centers — is not yet fast enough, or low-latency enough, to handle 5G connections. Simply put, the core of the problem is that data processing and server facilities need to move closer to the edge of the network.

How data centers must change to accommodate 5G

4G networks are slow enough that most people do not notice the delays caused by data packets traveling hundreds or thousands of miles. In the 5G era, data centers around the world will need to be far more geographically dispersed than they are today; only then can the high speed and low latency promised by 5G actually be realized. There are no simple shortcuts.

"Micro data centers", also called "containerized" data centers, are one way forward. To picture what this might look like in practice, imagine far more cell towers than exist today, each with a micro data center attached. Building out this infrastructure will take us part of the way, allowing 5G-powered IoT devices to be deployed across a fairly large geographic area without noticeable latency. But larger, industrial-scale data processing tasks are a slightly different story.
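A back-of-the-envelope calculation (my illustration, not from the article) shows why distance alone breaks 5G's roughly 1-millisecond latency budget: even ignoring all queuing and processing time, light in fiber covers only about 200,000 km/s, so round trips to a distant core data center cost milliseconds, while a nearby micro data center stays well under the budget. The fiber speed and distances below are assumptions for illustration.

```python
# Best-case round-trip propagation delay over fiber, ignoring all
# queuing, routing, and processing time. Assumes signals travel at
# roughly 2/3 the speed of light (~200,000 km/s), a common rule of thumb.

FIBER_SPEED_KM_PER_S = 200_000
KM_PER_MILE = 1.609

def round_trip_ms(miles: float) -> float:
    """Minimum round-trip propagation delay over fiber, in milliseconds."""
    km = miles * KM_PER_MILE
    return 2 * km / FIBER_SPEED_KM_PER_S * 1000

print(f"1000-mile core data center: {round_trip_ms(1000):.1f} ms round trip")
print(f"  30-mile edge data center: {round_trip_ms(30):.2f} ms round trip")
```

A 1,000-mile round trip costs roughly 16 ms in propagation delay alone — far over a 1 ms budget — while a 30-mile edge facility comes in under 0.5 ms, which is why 5G pushes compute toward the edge.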
Businesses that rely on large-scale data transfer and processing can build new data centers with relative ease, but smaller companies may be left behind, or may turn to managed services to carry their traffic while new infrastructure is built out. For companies large and small that rely on accumulating, analyzing, and distributing data, the goal is the same: move processing equipment closer to where the data is generated — near the end user. Under the existing computing model, services and devices must send data to the cloud, then on to "core" data processing infrastructure, and back again. That model is not fast enough for 5G, nor for the capabilities and emerging technologies it will help us unlock.

What does 5G really mean?

A "general-purpose technology" (GPT) is a technology that can impact, transform, and improve an entire economy. Once we accept what 5G truly requires and build shared and proprietary infrastructure accordingly, it will join the ranks of earlier GPTs such as the steam engine, interchangeable parts, the automobile, and the Internet itself. Despite the obstacles to its development, we are witnessing a sea change in how people interact with data services. By 2025, humanity is projected to have 75 billion connected devices, and with 5G those devices will transmit more wireless data, faster, than at any time in history. Getting there will require a foundation laid by both the public and private sectors — which means rethinking the size and placement of data centers, as well as new business models for sharing data transmission and processing capacity.