What is cloud computing?
Cloud computing is the provision of IT infrastructure as a service, including, for instance, storage, databases, servers, or software. Users typically pay only for the services they actually use.
What are the advantages of cloud computing?
One advantage of cloud computing is cost: no expenses are incurred for setting up and operating local data centers, purchasing hardware and software, or managing the infrastructure.
Another positive effect of cloud computing is global scaling, i.e. capacity on demand: the appropriate amount of IT resources is provided at precisely the moment it is needed.
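The pay-per-use idea behind needs-based scaling can be illustrated with a minimal sketch. The function name, the capacity per instance, and the request numbers are all hypothetical; this is only meant to show that provisioned capacity (and thus cost) tracks actual demand:

```python
# Hypothetical sketch of needs-based scaling: the number of running
# instances follows current demand instead of being fixed up front.
def instances_needed(requests_per_sec, capacity_per_instance=100):
    # Ceiling division: enough instances to cover the load, at least one.
    return max(1, -(-requests_per_sec // capacity_per_instance))

print(instances_needed(50))    # 1
print(instances_needed(1050))  # 11
```

With a cloud provider, this decision is made by the platform's autoscaler rather than by the user's own code; the sketch only captures the underlying rule.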
Performance is a further benefit: most cloud computing services run in a global network of secure data centers that are regularly upgraded to the latest generation of computing hardware.
Cloud computing also offers a high level of security. Numerous cloud providers supply technologies and controls that ensure this protection and help safeguard stored data against possible threats. Finally, productivity speaks for cloud computing: in contrast to local data centers, no substantial setup and administration effort arises.
What disadvantages does cloud computing entail?
One disadvantage of cloud computing is that users need a stable and fast Internet connection for undisturbed, efficient use of the cloud.
In addition, data protection must be mentioned. Hackers could, for example, read sensitive enterprise data or misuse it to damage a company. Particular attention must therefore be paid to whether the cloud provider offers secure data storage.
Another disadvantage of cloud computing is the dependency on the cloud provider. If a provider raises its prices, for instance, users must either pay the higher price or switch to a new provider, a switch that costs considerable time and money.
This dependency also leads to a loss of know-how, which is another disadvantage of cloud computing. By relying on the cloud, an enterprise gives up in-house IT expertise; if cloud services fail, a company without its own IT department would have considerable difficulty compensating for the outage.
What is meant by edge computing?
Edge computing is decentralized data processing that takes place at the "edge" of the network: data is not collected, analyzed, or retrieved by a central server or in the cloud, but at local hubs.
Where is edge computing used?
Edge computing can be used, for instance, in areas where large datasets must be processed in real time with minimal latency. Examples include the Internet of Things, autonomous driving, and the management of energy grids. Edge computing is also used in Industry 4.0, e.g. for automated processes.
Excursion: What does “Internet of Things” mean?
The term “Internet of Things” (IoT) describes the networking of “intelligent” (everyday) items via the Internet. By means of the IoT, these devices can communicate with each other or even accept commands. Furthermore, tasks can be carried out without external intervention and applications can be automated.
What are the advantages of edge computing?
In contrast to cloud computing, edge computing does not require transferring sensitive customer or enterprise data to the cloud. Furthermore, networked IoT devices keep working even during an Internet outage.
Another advantage is data capture and aggregation. Edge computing captures data close to where it originates, whereas classic cloud architectures send the data to a central data center for evaluation. Edge computing can still upload data to the cloud, but only when it is to be archived or cannot be evaluated locally.
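This "evaluate locally, upload only when necessary" decision can be sketched in a few lines. The function, the local evaluation limit, and the example readings are hypothetical placeholders, not part of any real edge platform:

```python
from statistics import mean

# Hypothetical edge-node sketch: sensor readings are aggregated locally,
# and raw data is "uploaded" only when the node cannot evaluate it itself.
def process_readings(readings, local_limit=100.0):
    summary = {"count": len(readings), "mean": mean(readings)}
    # Readings beyond the local limit stand in for data the node
    # cannot evaluate on its own and must forward to the cloud.
    needs_cloud = any(r > local_limit for r in readings)
    return readings if needs_cloud else summary

print(process_readings([20.5, 21.0, 19.8]))  # compact local summary
print(process_readings([50.0, 150.0]))       # raw data forwarded
```

The point of the pattern is that in the common case only a compact summary leaves the device, which reduces both bandwidth use and exposure of raw data.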
Edge computing also offers local data storage. Transferring large datasets from a core data center in real time is usually not possible in the cloud, so the respective data is retained decentrally at the network's edge, i.e. locally.
Furthermore, edge computing enables AI-based monitoring: the decentralized computing units of an edge environment receive and evaluate data, providing continuous monitoring. Combined with a machine-learning algorithm, this allows a system's status to be monitored in real time.
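A minimal sketch of such edge-side monitoring, with a rolling average standing in for a trained ML model. The class name, window size, and threshold are assumptions chosen for illustration:

```python
from collections import deque

# Hypothetical monitoring sketch: an edge node keeps a rolling window of
# recent sensor values and flags readings that deviate strongly from the
# recent average (a simple stand-in for a machine-learning model).
class EdgeMonitor:
    def __init__(self, window=5, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def check(self, value):
        # Anomalous if it deviates from the recent average by more
        # than the threshold (never on the very first reading).
        anomaly = bool(self.window) and \
            abs(value - sum(self.window) / len(self.window)) > self.threshold
        self.window.append(value)
        return anomaly

monitor = EdgeMonitor()
for v in [20.0, 20.5, 19.8, 35.0]:
    print(v, monitor.check(v))  # only 35.0 is flagged
```

Because the check runs on the node itself, an anomaly can trigger a local reaction immediately, without a round trip to the cloud.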
One last noteworthy advantage is M2M communication. M2M means “machine to machine” and stands for automated information exchange between terminals.
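The idea of M2M communication, one machine reacting to another's message without human intervention, can be sketched with two threads exchanging data over an in-process channel (a stand-in for a real network link; all names here are hypothetical):

```python
import queue
import threading

# Hypothetical M2M sketch: a "sensor" machine reports a value and a
# "controller" machine reacts to it automatically, with a queue
# standing in for the network link between the two terminals.
channel = queue.Queue()

def sensor():
    channel.put({"temp": 21.5})          # machine 1: reports a reading

def controller(actions):
    msg = channel.get()                   # machine 2: receives and reacts
    actions.append("heater_on" if msg["temp"] < 22 else "idle")

actions = []
t1 = threading.Thread(target=sensor)
t2 = threading.Thread(target=controller, args=(actions,))
t1.start(); t2.start(); t1.join(); t2.join()
print(actions)  # ['heater_on']
```

In practice such exchanges run over protocols like MQTT rather than an in-process queue, but the automated request-react loop is the same.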
What are the disadvantages of edge computing?
A major disadvantage is security. Because edge computing is used for IoT devices, hackers could gain access to transmitted data and manipulate it, which might lead to wrong decisions. In autonomous driving this could be highly dangerous, for instance if the transmitted distance data of a car is altered.
Moreover, important information about the user might be lost when using edge computing. The local device decides which data is classified as "disposable" and which is not; as a consequence, "disposable" data is never forwarded to the data center.
Another disadvantage is the increased cost of implementing the edge infrastructure, caused among other things by its complexity. In addition, IoT devices require more local hardware to work properly.
What are the differences between cloud computing and edge computing?
- Place of processing — cloud computing: centralized in the cloud/on servers; edge computing: decentralized at the network's edge
- Type of data — cloud computing: non-time-critical data; edge computing: time-critical data
- Latency — cloud computing: high (long distance between user and application); edge computing: low (short distance between user and application)
- Suitable locations — cloud computing: remote locations; edge computing: locations with restricted or no connection to a central location
Using cloud computing and/or edge computing is worthwhile for advanced and large enterprises in particular, as the disadvantages are outweighed by the advantages described above.
Overall, edge computing carries more risk, as hacker attacks can cause serious damage. Nevertheless, edge computing will be a very important tool in the future, especially for the Internet of Things.