
What is edge computing?

  Cloud computing has been all the rage for the past ten years or so, as more and more people have come to understand it, but edge computing could be about to disrupt this. Edge computing is a relatively new term that refers to data being processed at the edge of a network instead of in a big central cloud. In the simplest terms, edge computing moves some portion of storage and compute resources out of the central data center and closer to the source of the data itself. Rather than transmitting raw data to a central data center for processing and analysis, that work is performed where the data is actually generated. In other words, edge computing is a distributed computing paradigm that brings computation and data storage closer to the location where they are needed, improving response times and saving bandwidth.
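  To make the contrast concrete, here is a minimal Python sketch using hypothetical names: the cloud-centric device ships every raw reading upstream, while the edge-centric device processes the readings where they are generated and transmits only the small derived result.

    from statistics import mean

    def cloud_centric(readings, send_to_cloud):
        # Every raw sample crosses the network; analysis happens centrally.
        for r in readings:
            send_to_cloud({"raw_reading": r})

    def edge_centric(readings, send_to_cloud):
        # Compute and analysis happen where the data is generated;
        # only the small, derived result crosses the network.
        summary = {"count": len(readings), "avg": mean(readings), "max": max(readings)}
        send_to_cloud(summary)

    if __name__ == "__main__":
        sensor_readings = [21.3, 21.4, 22.1, 21.9, 23.0]    # e.g. temperature samples
        edge_centric(sensor_readings, send_to_cloud=print)  # one small payload instead of five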

How does edge computing work?

  While a single device producing data can transmit it across a network quite easily, problems arise when the number of devices transmitting data at the same time grows. Not only does quality suffer due to latency, but the cost in bandwidth can be tremendous. Edge-computing hardware and services help solve this problem by acting as a local source of processing and storage for many of these systems. An edge gateway, for example, can process data from an edge device and then send only the relevant data back through the cloud, reducing bandwidth needs. Or it can send data back to the edge device when a real-time response is needed. These edge devices can include many different things; edge gateways themselves are considered edge devices within an edge-computing infrastructure.
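  The gateway pattern described above might look something like the following Python sketch. The device names, temperature threshold, and transport callbacks are assumptions for illustration; the point is that the gateway processes readings locally, replies to the device directly when a real-time response is needed, and forwards only the relevant events to the cloud.

    from dataclasses import dataclass

    @dataclass
    class Reading:
        device_id: str
        temperature_c: float

    ALERT_THRESHOLD_C = 80.0  # assumed limit, for illustration only

    def handle_reading(reading, send_to_cloud, reply_to_device):
        """Process one reading at the gateway instead of in the cloud."""
        if reading.temperature_c >= ALERT_THRESHOLD_C:
            # Real-time path: act locally, without a round trip to the cloud.
            reply_to_device(reading.device_id, command="shut_down")
            # Only the relevant event, not the raw stream, goes upstream.
            send_to_cloud({"device": reading.device_id,
                           "event": "overheat",
                           "value": reading.temperature_c})
        # Normal readings are dropped (or batched) at the edge, saving bandwidth.

    if __name__ == "__main__":
        readings = [Reading("pump-1", 42.0), Reading("pump-2", 85.5)]
        for r in readings:
            handle_reading(r,
                           send_to_cloud=lambda msg: print("to cloud:", msg),
                           reply_to_device=lambda dev, command: print(f"to {dev}: {command}"))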

Why was edge computing developed?

  Edge computing was developed in response to the exponential growth of IoT devices, which connect to the internet either to receive information from the cloud or to deliver data back to it. Many of these devices generate enormous amounts of data during the course of their operations.