The growing number of "connected" devices is generating an enormous amount of data, and this will only continue as Internet of Things (IoT) technologies and use cases expand in the coming years. According to research firm Gartner, there will be as many as 20 billion connected devices by 2020, generating billions of bytes of data per user. These devices are not just smartphones and laptops, but also connected cars, vending machines, smart wearables, surgical medical robots, and more.
The large volume of data generated by these many types of devices must be pushed to a centralized cloud for retention (data management), analysis, and decision-making, and the analyzed results are then transmitted back to the device. This round trip consumes significant network and cloud infrastructure resources, adding latency and bandwidth problems that affect mission-critical IoT uses. For example, a self-driving connected car generates a large amount of data every hour; that data must be uploaded to the cloud, analyzed, and instructions sent back to the car. Network latency or resource congestion may delay the response to the car, which in serious cases can cause traffic accidents.

IoT Edge Computing

This is where edge computing comes in. Edge computing architecture optimizes cloud computing systems by performing data processing and analysis at the edge of the network, closer to the data source. With this approach, data can be collected and processed near the device itself rather than being sent to a cloud or data center, which keeps most traffic local and reduces both latency and bandwidth consumption.
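To make the idea of local processing concrete, here is a minimal sketch of an edge-side loop that aggregates raw sensor readings on the device and sends only a compact summary to the cloud instead of every sample. The function names (read_sensor, send_to_cloud) and the aggregation window are hypothetical placeholders, not part of any specific platform.

```python
import random
import statistics
import time

WINDOW_SIZE = 60  # number of raw samples aggregated locally before one cloud upload


def read_sensor() -> float:
    """Hypothetical stand-in for reading a value from a local sensor."""
    return 20.0 + random.random() * 5.0


def send_to_cloud(summary: dict) -> None:
    """Hypothetical stand-in for an upload to a cloud endpoint (e.g. MQTT or HTTPS)."""
    print("uploading summary:", summary)


def edge_loop() -> None:
    buffer = []
    while True:
        buffer.append(read_sensor())
        if len(buffer) >= WINDOW_SIZE:
            # Process at the edge: reduce WINDOW_SIZE raw readings to one small
            # summary, so only a fraction of the data ever crosses the network.
            send_to_cloud({
                "mean": statistics.mean(buffer),
                "max": max(buffer),
                "min": min(buffer),
                "samples": len(buffer),
            })
            buffer.clear()
        time.sleep(1.0)


if __name__ == "__main__":
    edge_loop()
```

In this pattern the cloud still receives everything it needs for long-term retention and analysis, but the per-sample round trips described above disappear.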
The advent of edge computing does not replace the need for traditional data centers or cloud computing infrastructure. Instead, it coexists with the cloud, as the cloud's computing power is distributed out to the endpoints.

Machine Learning at the Network Edge

Machine learning (ML) is a complementary technology to edge computing. In machine learning, the generated data is fed to an ML system to produce an analytical decision model. In IoT and edge computing scenarios, machine learning can be implemented in two ways.
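One common pattern, offered here as an illustration rather than something the article spells out, is to train the decision model centrally in the cloud and then push it down so that inference runs locally on the edge device. The sketch below shows that split with a toy logistic model whose weights are assumed to have been fitted in the cloud; all names and values are illustrative.

```python
import math

# Weights assumed to have been trained in the cloud on historical sensor data
# and then downloaded to the edge device; the values here are illustrative only.
CLOUD_TRAINED_WEIGHTS = {"temperature": 0.8, "vibration": 1.5, "bias": -3.0}


def predict_failure_risk(temperature: float, vibration: float) -> float:
    """Run inference locally at the edge using the cloud-trained logistic model."""
    z = (CLOUD_TRAINED_WEIGHTS["temperature"] * temperature
         + CLOUD_TRAINED_WEIGHTS["vibration"] * vibration
         + CLOUD_TRAINED_WEIGHTS["bias"])
    return 1.0 / (1.0 + math.exp(-z))


def decide_locally(temperature: float, vibration: float) -> str:
    """Make the decision on-device, without a round trip to the cloud."""
    risk = predict_failure_risk(temperature, vibration)
    return "shut down actuator" if risk > 0.9 else "continue"


if __name__ == "__main__":
    print(decide_locally(temperature=2.1, vibration=1.4))
```

Training stays where the data history and compute live, while the latency-sensitive decision is made next to the device that needs it.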
Edge Computing and the Internet of Things

Edge computing, together with machine learning technology, lays the foundation for agile future communications in IoT. The upcoming 5G telecommunication networks will provide a more advanced network for IoT use cases. In addition to high-speed, low-latency data transmission, 5G will also provide a telecommunication network based on mobile edge computing (MEC), enabling automatic implementation and deployment of edge services and resources. In this revolution, IoT device manufacturers and software application developers will be more eager to take advantage of edge computing and analytics, and we will see more intelligent IoT use cases and an increase in intelligent edge devices.

Original link: http://www.futuriom.com/articles/news/what-is-edge-computing-for-iot/2018/08