EDGE COMPUTING: RESEARCH AND OUTLOOK

A Thesis

Presented to the

Faculty of

California State Polytechnic University, Pomona

In Partial Fulfillment

Of the Requirements for the Degree

Master of Science

In

Computer Science

By

Sunit Bhopal

2020

SIGNATURE PAGE

THESIS: EDGE COMPUTING: RESEARCH AND OUTLOOK

AUTHOR: Sunit Bhopal

DATE SUBMITTED: Fall 2020

Department of Computer Science.

Professor Yu Sun ______ Thesis Committee Chair, Computer Science

Professor Lan Yang ______ Computer Science

Professor Gilbert Young ______ Computer Science


ACKNOWLEDGEMENTS

I would first like to thank my thesis advisor, Professor Yu Sun, whose valuable guidance helped me whenever I ran into a trouble spot or had a question about my research or writing.

Professor Yu Sun consistently allowed this paper to be my own work but steered me in the right direction whenever he thought I needed it.

I would also like to thank my committee members, Professor Lan Yang and Professor Gilbert Young, for their participation. Without their passionate participation and input, this thesis could not have been successfully conducted.

ABSTRACT

In recent years, the Edge computing paradigm has gained considerable popularity in academic and industrial circles. It serves as a key enabler for many future technologies, such as 5G, the Internet of Things (IoT), and vehicle-to-vehicle communications, by connecting facilities and services to the end users. The Edge computing paradigm provides low latency, mobility, and location awareness support to delay-sensitive applications. Significant research has been carried out in the area of Edge computing, which is reviewed here in terms of the latest developments, such as multi-access edge computing, cloudlets, and fog computing, providing researchers with more insight into existing solutions and future applications. This article is meant to serve as a comprehensive survey of recent advancements in Edge computing, highlighting the core applications. It also discusses the importance of Edge computing in real-life scenarios where response time constitutes the fundamental requirement for many applications. The article concludes by identifying the requirements and discussing open research challenges in Edge computing.


TABLE OF CONTENTS

ACKNOWLEDGEMENTS iii

ABSTRACT iv

LIST OF TABLES viii

LIST OF FIGURES ix

CHAPTER 1 1

1. Introduction to Edge Computing 1

CHAPTER 2 5

2. Origin and Background 5

CHAPTER 3 8

3. Importance of Edge Computing 8

CHAPTER 4 10

4. Architecture of Edge Computing 10

4.1. IBM’s implementation of an edge computing architecture 13

CHAPTER 5 15

5. IoT (Internet of Things) 15

5.1. How big is the Internet of Things? 16

5.2. What is the Industrial Internet of Things? 16

5.3. What are the benefits of the Internet of Things for consumers? 18

5.4. Issues or Challenges of Internet of things 18

5.5. IoT evolution: Where does the Internet of Things go next? 21

CHAPTER 6 22


6. Benefits of Edge computing 22

6.1. Low Latency 22

6.2. Security 23

6.3. Scalability 24

6.4. Versatility 25

6.5. Reliability 26

CHAPTER 7 28

7. Cloud computing 28

7.1. Service Models of Cloud Computing 28

7.2. Deployment Models of Cloud Computing 29

7.3. Benefits of Using Cloud Computing 29

CHAPTER 8 31

8. Edge Computing vs Cloud Computing 31

CHAPTER 9 32

9. Requirements for Edge Computing 32

9.1. User Requirements 37

CHAPTER 10 43

10. Edge Computing Technologies 43

10.1. Multi-Access Edge Computing 43

10.2. Fog Computing 44

10.3. Cloudlets 46

CHAPTER 11 48

11. Edge Computing Use Cases 48


11.1. Device Management 48

11.2. Security 49

11.3. Priority Messaging 51

11.4. Data Aggregation 53

11.5. Cloud Enablement 54

11.6. IoT Image and Audio Processing 56

CHAPTER 12 58

12. Industries using Edge Computing 58

12.1. Amazon Web Services 58

12.2. IBM 62

12.3. Computers 66

12.4. Network 67

CHAPTER 13 68

13. Challenges 68

13.1. Proliferation of devices, platforms and protocols 68

13.2. Open vs. Proprietary Systems 68

13.3. Time-Critical Performance 68

13.4. Hardware Constraints 69

13.5. Open edge ecosystems 69

CHAPTER 14 70

14. Conclusion 70

REFERENCES 71


LIST OF TABLES

TABLE 1. COMPANIES USING AMAZON WEB SERVICES – I ………………………. 61


LIST OF FIGURES

FIGURE 1. BASIC EDGE COMPUTING ………………………………………………. 4

FIGURE 2. ORIGIN OF EDGE COMPUTING …………………………………………. 7

FIGURE 3. EDGE COMPUTING LAYOUTS ………………………………………….. 13

FIGURE 4. IBM’S IMPLEMENTATION OF AN EDGE COMPUTING ARCHITECTURE … 14

FIGURE 5. BENEFITS OF LOW LATENCY …………………………………………… 33


CHAPTER 1

1. Introduction to Edge Computing

The notion of network-based computing dates to the 1960s, but many believe the first use of “cloud computing” in its modern context occurred on August 9, 2006, when then Google CEO Eric Schmidt introduced the term at an industry conference. The cloud enabled information to be stored and processed on remote servers, which meant our devices could offer services beyond their technical capabilities. Using the cloud, a device with only a few gigabytes of memory can effectively host an infinite amount of data. As time has gone by, though, the cloud has started to impede certain technologies, especially the Internet of Things (IoT) [1]. The scope of IoT is so vast that cloud computing alone cannot serve as its means of data processing: the data sent by IoT devices over a Wi-Fi or cellular network can slow down the entire network, and without access to the central cloud, IoT devices are useless whenever they lack an internet connection. This is where edge computing comes in.

According to Wikipedia, “Edge computing is a paradigm in which processing and computation are performed mainly on classified device nodes known as smart devices or edge devices, as opposed to being processed in a centralized cloud environment or data centers” [2]. It helps provide resources, data analysis, and artificial intelligence to data collection sources and cyber-physical systems like smart sensors and actuators. It brings the services and utilities of cloud computing closer to the end user and is characterized by fast processing and quick application response time. This doesn’t completely eliminate the need for a cloud, but it can reduce the amount of data that needs to be sent to the cloud. Edge computing allows for cloud-like functionality on our own devices or at the network “edge,” a term used to describe the point where a device or network communicates with the internet. That could be a device’s processor, an ISP, or a local edge server. Instead of sending data to a remote server, data is processed as close to the device as possible, or even on the device itself.

For example, say you have an autonomous car with a rearview camera used for accident prevention. Relying on a cloud computing system to process the image data and return results to the onboard systems for action would be impractical, since a slow or intermittent data connection would result in poor performance. This setup would also use a lot of data transferring large video files back and forth from the cloud, and it would strain the cloud server to process data from several cameras at once and send back critical results almost instantaneously. But if the car’s computing system can perform most of the processing itself and send information to the cloud only when truly necessary, the result is faster and more reliable performance, lower data transfer costs, and less strain on cloud servers.
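The edge-first pattern in this example can be sketched in a few lines of Python. This is purely illustrative: the function names, the latency figures, and the confidence threshold are invented for the sketch, not taken from any real automotive system.

```python
# Hypothetical sketch of the edge-first pattern described above: process
# camera frames locally and contact the cloud only when truly necessary.
# All names and numbers (latencies, thresholds) are illustrative.

EDGE_LATENCY_MS = 5      # assumed on-board processing time
CLOUD_LATENCY_MS = 150   # assumed round trip to a remote datacenter

def classify_frame(frame):
    """Stand-in for an on-board vision model: flag frames with nearby obstacles."""
    return "obstacle" if frame.get("obstacle_distance_m", 100) < 10 else "clear"

def handle_frame(frame, cloud_uploads):
    """Act locally; upload only frames the edge cannot confidently resolve."""
    result = classify_frame(frame)
    latency = EDGE_LATENCY_MS
    if result == "obstacle" and frame.get("confidence", 1.0) < 0.6:
        # Rare ambiguous case: defer to the cloud, paying the extra latency.
        cloud_uploads.append(frame)
        latency += CLOUD_LATENCY_MS
    return result, latency

uploads = []
frames = [
    {"obstacle_distance_m": 4, "confidence": 0.9},
    {"obstacle_distance_m": 50, "confidence": 0.95},
    {"obstacle_distance_m": 6, "confidence": 0.3},
]
results = [handle_frame(f, uploads) for f in frames]
# Only the one ambiguous frame travels to the cloud; the rest are
# resolved on board in a few milliseconds.
```

The point of the sketch is the asymmetry: the common case pays only the local latency, and the cloud sees a small fraction of the raw data.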

Further, these are some of the key components that form the edge ecosystem:

Cloud

This can be a public or private cloud, which serves as a repository for container-based workloads such as applications and machine learning models. These clouds also host and run the applications used to orchestrate and manage the different edge nodes. Workloads on the edge, both local and device workloads, interact with workloads on these clouds. The cloud can also be a source and destination for any data required by the other nodes.


Edge Device

An edge device is a special-purpose piece of equipment that also has compute capacity integrated into it. Interesting work can be performed on edge devices, such as an assembly machine on a factory floor, an ATM, an intelligent camera, or an automobile. Often driven by economic considerations, an edge device typically has limited compute resources. It is common to find edge devices that have ARM or x86 class CPUs with 1 or 2 cores, 128 MB of memory, and perhaps 1 GB of local persistent storage. Although edge devices can be more powerful, they are currently the exception rather than the norm.

Edge node: An edge node is a generic way of referring to any edge device, edge server, or edge gateway on which edge computing can be performed.

Edge cluster/server: An edge cluster/server is a general-purpose IT computer located in a remote operations facility such as a factory, retail store, hotel, distribution center, or bank. An edge cluster/server is typically constructed with an industrial PC or racked computer form factor. It is common to find edge servers with 8, 16, or more cores of compute capacity, 16 GB of memory, and hundreds of GBs of local storage. An edge cluster/server is typically used to run enterprise application workloads and shared services.

Edge gateway: An edge gateway is typically an edge cluster/server which, in addition to being able to host enterprise application workloads and shared services, also has services that perform network functions such as protocol translation, network termination, tunneling, firewall protection, or wireless connection. Although some edge devices can serve as a limited gateway or host network functions, edge gateways are more often separate from edge devices.

Edge Devices or IoT Devices

These form the network of physical objects embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the internet. This is the basic idea behind edge computing. In this paper, we will explore the concepts and benefits of edge computing in greater detail and share a variety of insights about its future.

Figure 1 (Basic Edge Computing): edge devices (IoT) feed an edge layer that performs real-time data processing at the source/on premises, while the cloud provides processing, business logic, and storage.


CHAPTER 2

2. Origin and Background

The concept of edge computing reaches back to the late 1990s, when a company named Akamai introduced content delivery networks (CDNs) to accelerate and ramp up web performance. A CDN uses nodes at the edge, close to users, to prefetch and cache web content. These edge nodes can also perform some content customization, such as adding location-relevant advertising [3]. CDNs are especially valuable for video content, because the bandwidth savings from caching can be substantial. Edge computing generalizes and extends the CDN concept by leveraging cloud computing infrastructure.
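The caching mechanism behind the CDN idea can be illustrated with a minimal sketch (this is not Akamai's actual implementation; the class, URLs, and payload are invented for the example): an edge node answers repeat requests from its local cache so that only the first request travels to the origin server.

```python
# Minimal sketch of CDN-style edge caching: repeat requests are served
# locally instead of travelling back to the origin server.

class EdgeNode:
    def __init__(self, origin):
        self.origin = origin          # stand-in for the origin server: URL -> content
        self.cache = {}
        self.origin_fetches = 0

    def get(self, url):
        if url not in self.cache:     # cache miss: fetch once from the origin
            self.cache[url] = self.origin[url]
            self.origin_fetches += 1
        return self.cache[url]        # cache hit: served from the edge

origin = {"/video.mp4": b"<large video payload>"}
edge = EdgeNode(origin)

for _ in range(1000):                 # 1000 nearby users request the same video
    edge.get("/video.mp4")
# Only the first request reaches the origin; the other 999 are served
# from the edge, which is where the bandwidth savings come from.
```

For a large video object, serving 999 of 1000 requests from the edge is exactly the effect the paragraph above describes.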

In 1997, Brian Noble and his colleagues first demonstrated edge computing’s potential value to mobile computing. They showed how speech recognition could be implemented with acceptable performance on a resource-limited mobile device by offloading computation to a nearby server.

In the year 2000, centralized computing occurred in data centers, and Akamai implemented the first iteration of edge computing to battle the "world wide wait" [4]. Akamai’s platform provided situational performance by assembling and delivering content based on business logic for each customer, using a computing paradigm Akamai called advanced metadata.

Advanced metadata is an implementation of edge computing that uses XML to describe business logic for a given customer.

In 2001, Akamai worked with other industry leaders to develop a standard edge computing implementation for developers called Edge Side Includes (ESI). ESI helped customers scale, increase performance, and save money by moving fine-grained business logic that would have been done locally to the edge, reducing the amount of data that their infrastructure needed to process in a datacenter. Fast forward 20 years to today, and this capability is commonly called serverless computing, since the customer can abstract scaling business logic from scaling infrastructure.

In 2002, Akamai incorporated the use of Java and .NET technology at the edge and started to use the term "edge computing" to describe its approach. By leveraging Akamai's Intelligent Edge as a programmable edge, customers could maximize quality and reliability for their consumers, evaluate and optimize personalized web and mobile experiences, and maximize data autonomy and infrastructure and app security at a global scale.

During the mid-2000s, when the concept of edge computing was still new, the cloud became the obvious infrastructure choice for many companies; Apple’s Siri and Google’s speech-recognition services, for example, both offload computation to the cloud. Unfortunately, there were latency issues due to the large average separation between a mobile device and its cloud datacenter.

At this point it was obvious that cloud computing was not the perfect choice for applications that required tight control of latency.

These observations about end-to-end latency and cloud computing were first articulated in a 2009 article by Mahadev Satyanarayanan, Paramvir Bahl, Ramón Cáceres, and Nigel Davies that laid the conceptual foundation for edge computing [5]. The article suggested a two-level architecture in which the first level was the cloud infrastructure and the second level consisted of cloudlets: small-scale cloud datacenters located at the edge of the Internet, closer to the devices. Moreover, a cloudlet stores only soft state, such as cached copies of data.


In 2012, Cisco introduced the term fog computing for dispersed cloud infrastructures. The aim was to promote IoT scalability, i.e., to handle a huge number of IoT devices and big data volumes for real-time low-latency applications.

This leads us to where we are today: the edge computing era, where there is a corollary back to the client/server days. With the rise of IoT, the increase in the number of businesses with multiple sites, and the fact that more and more data is being generated outside the datacenter, computing systems need to be deployed at the edge.

Figure 2 (Origin of Edge Computing)

1990s: Content delivery networks. Edge computing can be said to have started with Akamai’s content delivery network (CDN), which delivered cached images and videos from a network of distributed servers close to end users. CDNs made websites run faster.

Early 2000s: Peer-to-peer overlay networks. Peer-to-peer (P2P) overlay networks let participating members (peers) find other members. P2P members could distribute content and balance workloads. The internet’s domain name system (DNS) is an example of a P2P overlay network.

Mid-2000s: Public clouds. The first public cloud was the Elastic Compute Cloud, with which Amazon rented out computing and storage resources to individuals and small companies to run their own applications.

Early 2010s: Fog computing. Cisco introduced fog computing, a distributed cloud brought closer to a data source, such as smart utility meters or IoT devices. Fog provided low-latency network connections with local compute to limit the volume of data that needed to be transferred to a cloud.

Today: Edge computing. Edge computing is a topology, an idea for how network connectivity is arranged. It recognizes the need for decentralized computing resources close to data sources to support many users and IoT applications.

CHAPTER 3

3. Importance of Edge Computing

Edge computing is important because it creates new and improved ways for industrial and enterprise-level businesses to maximize operational efficiency, improve performance and safety, automate core business processes, and ensure "always on" availability. It is a leading method of achieving the digital transformation of how you do business. It powers the next industrial revolution, transforming manufacturing and services; optimizes data capture and analysis at the edge to create actionable business intelligence; and promotes an agile business ecosystem that is more efficient, performs faster, saves costs, and is easier to manage and maintain. Increasing computing power at the edge is the foundation needed to establish autonomous systems, enabling companies to increase efficiency and productivity while enabling personnel to focus on higher-value activities within the operation.

Further, as IoT adoption across industries continues to increase, placing AI and analytical applications powered by edge computing will drive a huge impact on cost and other parameters.

In collaboration with Oxford Economics, the IBM Institute for Business Value conducted a survey of 1,500 executives across C-suite roles with direct knowledge of their organizations’ strategies, investments, and operations concerning edge computing [6]. The survey found:

91% of organizations will implement edge computing.

84% believe edge computing applications will have a positive impact on their operational responsiveness.


75% say they will invest in AI in the next three years to create new business models at the edge, combining intelligent workflows, automation, and edge device interconnectivity.

54% will use edge computing applications for energy efficiency management.

Most edge disruptors expect ROI of over 20% in the next three years.

Edge computing can play a key role in these strategies, in different ways for different industries. In summary, those companies say edge computing can help:

Drive operational responsiveness: Edge-induced responsiveness can lead to significant business benefits, reducing operating costs, automating workflows, and accelerating decision making.

Increase energy efficiency: Edge can help organizations manage energy efficiency and reduce power consumption. As more data is processed at the edge, less moves to and from the cloud, decreasing data latency and energy consumption.

Drive business model innovation: Edge computing enables new business models that will capture untapped value from machine data.

Executives across industries are building strategies to generate faster insights and actions, maintain continuous operations, and personalize customer experiences.

With enhanced interconnectivity enabling improved edge access to more core applications, and with new IoT and industry-specific business use cases, edge infrastructure is poised to be one of the main growth engines in the server and storage market for the next decade and beyond.


CHAPTER 4

4. Architecture of Edge Computing

Edge computing has a geographical connotation. It's computing that is done at or near the source of the data instead of relying on the “larger” clouds to do all the work. Edge computing places enterprise applications closer to where the data is created and where actions need to be taken. Who or what is producing all that data? Billions of devices. The devices are outfitted with sensors, and the sensors gather data and send it across for analysis. No matter the size, the devices are equipped with different kinds of sensors: sensors that drive actuators, sensors that capture and send information by way of audio and video feeds, and sensors that relay raw data that requires analysis and immediate action.

Compared to IoT devices, edge devices encompass a broader range of device types and functions. Edge computing analyzes the data at the device source [7]. The primary layers of edge computing architecture are as follows:

Edge devices: An edge device is a piece of hardware that controls data flow at the boundary between two networks. These are actual devices running on premises at the edge, such as cameras, sensors, and other physical devices that gather data or interact with edge data. Simple edge devices gather or transmit data, or both; more complex edge devices have the processing power to do additional activities. In either case, it is important to be able to deploy and manage the applications on these edge devices. Devices can be small, such as smart thermostats, smart doorbells, home cameras or cameras on automobiles, and augmented or virtual reality glasses. Devices can also be large, such as industrial robots, automobiles, smart buildings, and oil platforms. The new 5G (fifth-generation) cellular network technology now facilitates much of this communication. The edge and IoT devices are equipped to run analytics, apply AI rules, and even store some data locally to support operations at the edge. The devices can handle analysis and real-time inferencing without involvement of the edge server or the enterprise region.

Local edge: The systems running on premises or at the edge of the network. The edge network layer and edge clusters/servers can be separate physical or virtual servers existing in various physical locations, or they can be combined in a hyperconverged system. Edge servers are used to deploy apps to the devices and are in constant communication with them, using agents installed on each device. Thousands of edge servers maintain a pulse on millions of devices. If something more than inferencing is needed, data from the devices is sent to the edge server for further analysis.

There are two primary sublayers to this architecture layer. Both the system components required to manage the applications in these architecture layers and the applications on the device edge reside here.

Application layer: Applications whose footprint is too large to run at the device edge run here. Example applications include complex video analytics and IoT processing.


Network layer: Physical network devices will generally not be deployed here due to the complexity of managing them; the entire network layer is mostly virtualized or containerized. Examples include routers, switches, or any other network components required to run the local edge.

Cloud: This architecture layer is generically referred to as the cloud, but it can run on premises or in the public cloud. This architecture layer is the source for workloads: applications that handle the processing not possible at the other edge nodes, plus the management layers. Workloads include application and network workloads to be deployed to the different edge nodes by using the appropriate orchestration layers. It should also be noted that network function is a key set of capabilities that should be incorporated into any edge strategy, and thus into the edge architecture. The adoption of tools should also take into consideration the need to handle application and network workloads in tandem. Further, the cloud can be categorized into two types:

Edge network or micro datacenter: New networking technologies have resulted in the edge network or micro datacenter, which can be viewed as a local cloud for devices to communicate with. The edge network reduces the distance that data from the devices must travel, and thus decreases latency and bandwidth issues, especially with the advent of 5G. This region also offers more analytical capabilities and more storage for models.

Enterprise hybrid multicloud: This region offers the classic enterprise-level model storage and management, device management, and especially enterprise-level analytics and dashboards. This region can be hosted in the cloud or in an on-premises data center.

Figure 3. Edge computing layouts showing different levels of architecture. Adapted from IBM (2020).

4.1. IBM’s implementation of an edge computing architecture

In 2019, IBM partnered with telecommunications companies and other technology participants to build a Business Operation System solution. The focus of the project was to allow communications service providers (CSPs) to manage and deliver multiple high-value products and services, so that they could be delivered to market more quickly and efficiently, including capabilities around 5G.

Edge computing was part of the overall architecture, as it was necessary to provide key services at the edge [8]. The following steps illustrate the implementation, with further extensions having since been made:

A B2B customer goes to a portal and orders a service around video analytics using drones.

The appropriate containers are deployed to the different edge nodes. These containers include visual analytics applications and a network layer to manage the underlying network functionality required for the new service.

The service is provisioned, and drones start capturing the video.


Initial video processing is done by the drones and the device edge.

When an item of interest is detected, it is sent to the local edge for further processing.
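The last two steps of this flow, cheap filtering at the device edge with escalation of only "items of interest" to the local edge, can be sketched as follows. The field names, the scoring, and the 0.8 threshold are invented for illustration and are not IBM's actual code.

```python
# Hypothetical sketch of the drone pipeline above: the device edge scores
# every captured frame and forwards only items of interest to the local
# edge, which runs the heavier analytics. All names/thresholds are invented.

INTEREST_THRESHOLD = 0.8

def device_edge_filter(frames):
    """Initial processing on the drone: cheap scoring, aggressive filtering."""
    return [f for f in frames if f["score"] >= INTEREST_THRESHOLD]

def local_edge_process(frames):
    """Further processing at the local edge for the few forwarded frames."""
    return [{"frame_id": f["id"], "analysis": "detailed"} for f in frames]

captured = [{"id": i, "score": i / 10} for i in range(10)]  # scores 0.0 to 0.9
forwarded = device_edge_filter(captured)      # only the high-scoring frames
report = local_edge_process(forwarded)
# Ten frames are captured on the drone, but only two cross the network
# to the local edge, which is the bandwidth-saving point of the design.
```

The split between the two functions is the architectural point: the drone does the plentiful cheap work, and the local edge does the rare expensive work.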

Figure 4. IBM’s implementation of an edge computing architecture. Adapted from LF Edge (2020).


CHAPTER 5

5. IoT (Internet of Things)

The Internet of things (IoT) describes the network of physical objects—“things”—that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the Internet.

The idea of adding sensors and intelligence to basic objects was discussed throughout the 1980s and 1990s (and there are arguably some much earlier ancestors), but apart from some early projects, including an internet-connected vending machine, progress was slow simply because the technology wasn't ready. Chips were too big and bulky, and there was no way for objects to communicate effectively [9].

Processors that were cheap and power-frugal enough to be all but disposable were needed before it finally became cost-effective to connect up billions of devices. The adoption of RFID tags, low-power chips that can communicate wirelessly, solved some of this issue, along with the increasing availability of broadband internet and cellular and wireless networking. The adoption of IPv6, which, among other things, should provide enough IP addresses for every device the world (or indeed this galaxy) is ever likely to need, was also a necessary step for the IoT to scale.

Kevin Ashton coined the phrase 'Internet of Things' in 1999, although it took at least another decade for the technology to catch up with the vision.

In IoT, with the help of edge computing, intelligence moves to the edge. When massive amounts of data are generated in highly sensor-intensive or data-intensive, end-to-end environments, that data originates at the edge, because IoT data sensing happens at the edge.

Moreover, with real-time information and the increase in unstructured data, of which sensor and IoT data are a part, traditional approaches no longer meet the requirements. In the various scenarios where speed and high-speed data are the main components, such as device management, power management, analytics, and real-time needs, edge computing helps process IoT data.

5.1. How big is the Internet of Things?

Big and getting bigger: there are already more connected things than people in the world. Tech analyst company IDC predicts that in total there will be 41.6 billion connected IoT devices, or "things," by 2025. It also suggests industrial and automotive equipment represent the largest opportunity for connected "things," but it also sees strong adoption of smart home and wearable devices in the near term [9].

Utilities will be the highest users of IoT, thanks to the continuing rollout of smart meters. Security devices, in the form of intruder detection and web cameras, will be the second biggest use of IoT devices. Building automation, like connected lighting, will be the fastest-growing sector, followed by automotive (connected cars) and healthcare (monitoring of chronic conditions).

5.2. What is the Industrial Internet of Things?

The Industrial Internet of Things (IIoT), the fourth industrial revolution, and Industry 4.0 are all names given to the use of IoT technology in a business setting. The concept is the same as for consumer IoT devices in the home, but in this case the aim is to use a combination of sensors, wireless networks, big data, AI, and analytics to measure and optimize industrial processes.

The IoT generates vast amounts of data: from sensors attached to machine parts or environment sensors, or the words we shout at our smart speakers. That means the IoT is a significant driver of big-data analytics projects because it allows companies to create vast data sets and analyze them. Giving a manufacturer vast amounts of data about how its components behave in real-world situations can help them to make improvements much more rapidly, while data culled from sensors around a city could help planners make traffic flow more efficiently.

That data will come in many different forms: voice requests, video, temperature, or other sensor readings, all of which can be mined for insight. The IoT metadata category is a growing source of data to be managed and leveraged. "Metadata is a prime candidate to be fed into NoSQL databases like MongoDB to bring structure to unstructured content, or fed into cognitive systems to bring new levels of understanding, intelligence, and order to outwardly random environments" [9].

In particular, the IoT will deliver large amounts of real-time data. Cisco calculates that machine-to-machine connections that support IoT applications will account for more than half of the total 27.1 billion devices and connections, and for 5% of global IP traffic, by 2021.

If introduced across an entire supply chain, rather than just individual companies, the impact could be even greater, with just-in-time delivery of materials and the management of production from start to finish. Increasing workforce productivity and cost savings are two potential aims, but the IIoT can also create new revenue streams for businesses: rather than just selling a standalone product, an engine for example, manufacturers can also sell predictive maintenance of the engine.

5.3. What are the benefits of the Internet of Things for consumers?

The IoT promises to make our environment, our homes, offices, and vehicles, smarter and more measurable. Smart speakers like Amazon's Echo and Google Home make it easier to play music, set timers, or get information. Home security systems make it easier to monitor what's going on inside and outside, or to see and talk to visitors. Meanwhile, smart thermostats can help us heat our homes before we arrive back, and smart lightbulbs can make it look like we're home even when we're out [9].

Looking beyond the home, sensors can help us to understand how noisy or polluted our environment might be. Self-driving cars and smart cities could change how we build and manage our public spaces.

5.4. Issues or Challenges of Internet of things

Security: Security is one of the biggest issues with the IoT. These sensors are in many cases collecting extremely sensitive data, what you say and do in your own home, for example. Keeping that data secure is vital to consumer trust, but so far the IoT's security track record has been extremely poor. Too many IoT devices give little thought to the basics of security, like encrypting data in transit and at rest. Flaws in software, even old and well-used code, are discovered on a regular basis, but many IoT devices lack the capability to be patched, which means they are permanently at risk. Hackers are now actively targeting IoT devices such as routers and webcams because their inherent lack of security makes them easy to compromise and roll up into giant botnets.


All of this applies in business as well, but the stakes are even higher. Connecting industrial machinery to IoT networks increases the potential risk of hackers discovering and attacking these devices. Industrial espionage and destructive attacks on critical infrastructure are both potential risks. That means businesses will need to make sure that these networks are isolated and protected, with data encryption and the security of sensors, gateways, and other components a necessity. The current state of IoT technology makes that harder to ensure, however, as does a lack of consistent IoT security planning across organizations. That's very worrying considering the documented willingness of hackers to tamper with industrial systems that have been connected to the internet but left unprotected [9].
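As a concrete illustration of "encrypting data in transit", the sketch below shows how a gateway-side client could build a strict TLS context using Python's standard ssl module before forwarding sensor data upstream. This is a minimal sketch, not any vendor's actual implementation; the function name is our own, and real deployments would also need certificate provisioning on the devices themselves.

```python
import ssl

def make_gateway_tls_context() -> ssl.SSLContext:
    """Build a client-side TLS context an IoT gateway could use when
    forwarding sensor data upstream: certificates are verified,
    hostname checking is enforced, and legacy protocols are refused."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # no SSLv3/TLS 1.0/1.1
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx

ctx = make_gateway_tls_context()
print(ctx.check_hostname, ctx.verify_mode == ssl.CERT_REQUIRED)
```

Many compromised IoT devices fail at exactly this step, either skipping encryption entirely or disabling certificate verification for convenience.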

US intelligence community briefings have warned that the country's adversaries already have the ability to threaten its critical infrastructure as well "as the broader ecosystem of connected consumer and industrial devices known as the Internet of Things". US intelligence has also warned that connected thermostats, cameras, and cookers could all be used either to spy on citizens of another country, or to cause havoc if they were hacked.

The IoT bridges the gap between the digital world and the physical world, which means that hacking into devices can have dangerous real-world consequences. Hacking into the sensors controlling the temperature in a power station could trick the operators into making a catastrophic decision.

Privacy: With all those sensors collecting data on everything you do, the IoT is a potentially vast privacy and security headache. Take the smart home: it can tell when you wake up (when the smart coffee machine is activated) and how well you brush your teeth (thanks to your smart toothbrush), what radio station you listen to (thanks to your smart speaker), what type of food you eat (thanks to your smart oven or fridge), what your children think (thanks to their smart toys), and who visits you and passes by your house (thanks to your smart doorbell). While companies will make money from selling you the device in the first place, their IoT business model probably involves selling at least some of that data, too. What happens to that data is a vitally important privacy matter. Not all smart home companies build their business model around harvesting and selling your data, but some do.

And it's worth remembering that IoT data can be combined with other bits of data to create a surprisingly detailed picture of you. It's surprisingly easy to find out a lot about a person from a few different sensor readings. In one project, a researcher found that by analyzing data charting just a home's energy consumption, carbon monoxide and carbon dioxide levels, temperature, and humidity throughout the day, they could work out what someone was having for dinner [9]. Similarly, badly installed IoT products could easily open up corporate networks to attack by hackers, or simply leak data. It might seem like a trivial threat, but imagine if the smart locks at your office refused to open one morning, or the smart weather station in the CEO's office was used by hackers to create a backdoor into your network.

Scalability: An IoT device will likely contain one or more sensors which it uses to collect data. Just what those sensors are collecting will depend on the individual device and its task. Sensors inside machinery might measure temperature or pressure; a security camera might have a proximity sensor along with sound and video; a home weather station will probably be packing a humidity sensor. All this sensor data -- and much, much more -- will have to be sent somewhere. That means IoT devices will need to transmit data, and will do it via Wi-Fi, 4G, 5G, and more. Tech analyst IDC calculates that within five years IoT gadgets will be creating 79.4 zettabytes of data. Some of this IoT data will be "small and bursty", says IDC -- a quick update like a temperature reading from a sensor or a reading from a smart meter. Other devices might create huge amounts of data traffic, like a video surveillance camera using computer vision. IDC said the amount of data created by IoT devices will grow rapidly in the next few years. Most of the data is being generated by video surveillance, it said, but other industrial and medical uses will generate more data over time [9]. It said drones will also be a big driver of data creation using cameras. Looking further out, self-driving cars will also generate vast amounts of rich sensor data, including audio and video, as well as more specialized automotive sensor data.

5.5. IoT evolution: Where does the Internet of Things go next?

As the price of sensors and communications continues to drop, it becomes cost-effective to add more devices to the IoT -- even if in some cases there's little obvious benefit to consumers. Deployments are at an early stage: most companies that are engaging with the IoT are at the trial stage right now, largely because the necessary technologies -- sensor technology, 5G, and machine-learning-powered analytics -- are still themselves at a reasonably early stage of development. There are many competing platforms and standards, and many different vendors, from device makers to software companies to network operators, want a slice of the pie. It's still not clear which of those will win out. But without standards, and with security an ongoing issue, we are likely to see some more big IoT security mishaps in the next few years.


CHAPTER 6

6. Benefits of Edge Computing

6.1. Low Latency

For many companies, speed is absolutely vital to their core business. The financial sector’s reliance upon high-frequency trading algorithms, for instance, means that a slowdown of mere milliseconds can have expensive consequences. In the healthcare industry, losing a fraction of a second can even be a matter of life or death. And for businesses that provide data-driven services to customers, lagging speeds can frustrate customers and cause long term damage to a brand. Speed is no longer just a competitive advantage—it is a best practice[10].

The most important benefit of edge computing is its ability to increase network performance by reducing latency. Since edge computing devices process data locally or in nearby edge data centers, the information they collect doesn’t have to travel nearly as far as it would under a traditional cloud architecture.

It's easy to forget that data doesn't travel instantaneously; it's bound by the same laws of physics as everything else in the known universe. Current commercial fiber-optic technology allows data to travel at up to 2/3 the speed of light, moving from New York to San Francisco in about 21 milliseconds. While that sounds fast, it fails to consider the sheer amount of data being transmitted. With the world expected to generate up to 44 zettabytes (one zettabyte equals a trillion gigabytes) of data by the end of 2020, digital traffic jams are almost guaranteed.

There's also the problem of the "last mile" bottleneck, in which data must be routed through local network connections before reaching its final destination. Depending upon the quality of these connections, the "last mile" can add anywhere between 10 and 65 milliseconds of latency.
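The 21-millisecond figure above can be sanity-checked with a back-of-envelope calculation: propagation delay is simply distance divided by signal speed in the fiber. The great-circle distance used below (about 4,130 km for New York to San Francisco) is an assumption for illustration, and real routes add routing, queuing, and last-mile delays on top.

```python
# One-way propagation delay over fiber, ignoring routing and queuing.
SPEED_OF_LIGHT_M_S = 299_792_458
FIBER_FRACTION = 2 / 3          # signal in fiber travels at roughly 2/3 c
NY_TO_SF_M = 4_130_000          # assumed great-circle distance, ~4,130 km

def propagation_delay_ms(distance_m: float) -> float:
    """Minimum one-way delay in milliseconds imposed by physics alone."""
    return distance_m / (SPEED_OF_LIGHT_M_S * FIBER_FRACTION) * 1000

print(f"{propagation_delay_ms(NY_TO_SF_M):.1f} ms")  # roughly 21 ms one-way
```

This is why moving a server from a coast-to-coast data center to a nearby edge node, a few tens of kilometers away, can cut the physics-imposed floor on latency from tens of milliseconds to a fraction of one.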

By processing data closer to the source and reducing the physical distance it must travel, edge computing can greatly reduce latency. The end result is higher speeds for end users, with latency measured in microseconds rather than milliseconds. Considering that even a single moment of latency or downtime can cost companies thousands of dollars, the speed advantages of edge computing cannot be overlooked.

6.2. Security

While the proliferation of edge computing devices does increase the overall attack surface for networks, it also provides some important security advantages. Traditional cloud computing architecture is inherently centralized, which makes it especially vulnerable to distributed denial of service (DDoS) attacks and power outages. Edge computing distributes processing, storage, and applications across a wide range of devices and data centers, which makes it difficult for any single disruption to take down the network.

One major concern about IoT edge computing devices is that they could be used as a point of entry for cyberattacks, allowing malware or other intrusions to infect a network from a single weak point. While this is a genuine risk, the distributed nature of edge computing architecture makes it easier to implement security protocols that can seal off compromised portions without shutting down the entire network[10].

Since more data is being processed on local devices rather than transmitted back to a central data center, edge computing also reduces the amount of data actually at risk at any one time. There's less data to be intercepted during transit, and even if a device is compromised, it will only contain the data it has collected locally rather than the trove of data that could be exposed by a compromised server.

Even if an edge computing architecture incorporates specialized edge data centers, these often provide additional security measures to guard against crippling DDoS attacks and other cyberthreats. A quality edge data center should offer a variety of tools clients can use to secure and monitor their networks in real time.

6.3. Scalability

As companies grow, they cannot always anticipate their IT infrastructure needs, and building a dedicated data center is an expensive proposition. In addition to the substantial up-front construction costs and ongoing maintenance, there’s also the question of tomorrow’s needs. Traditional private facilities place an artificial constraint on growth, locking companies into forecasts of their future computing needs. If business growth exceeds expectations, they may not be able to capitalize on opportunities due to insufficient computing resources.

Fortunately, the development of cloud-based technology and edge computing have made it easier than ever for businesses to scale their operations. Increasingly, computing, storage, and analytics capabilities are being bundled into devices with smaller footprints that can be situated nearer to end users. Edge systems allow companies to leverage these devices to expand their edge network’s reach and capabilities[10].

Expanding data collection and analysis no longer requires companies to establish centralized, private data centers, which can be expensive to build, maintain, and replace when it's time to grow again. By combining colocation services with regional edge computing data centers, organizations can expand their edge network reach quickly and cost-effectively. The flexibility of not having to rely upon a centralized infrastructure allows them to adapt quickly to evolving markets and scale their data and computing needs more efficiently.

Edge computing offers a far less expensive route to scalability, allowing companies to expand their computing capacity through a combination of IoT devices and edge data centers. The use of processing-capable edge computing devices also eases growth costs because each new device added doesn’t impose substantial bandwidth demands on the core of a network.

6.4. Versatility

The scalability of edge computing also makes it incredibly versatile. By partnering with local edge data centers, companies can easily target desirable markets without having to invest in expensive infrastructure expansion. Edge data centers allow them to service end users efficiently with little physical distance or latency. This is especially valuable for content providers looking to deliver uninterrupted streaming services. They also do not constrain companies with a heavy footprint, allowing them to nimbly shift to other markets should economic conditions change[10].

Edge computing also empowers IoT devices to gather unprecedented amounts of actionable data. Rather than waiting for people to log in with devices and interact with centralized cloud servers, edge computing devices are always on, always connected, and always generating data for future analysis. The unstructured information gathered by edge networks can either be processed locally to deliver quick services or delivered back to the core of the network, where powerful analytics and machine learning programs will dissect it to identify trends and notable data points. Armed with this information, companies can make better decisions and meet the true needs of the market more efficiently.

By incorporating new IoT devices into their edge network architecture, companies can offer new and better services to their customers without completely overhauling their IT infrastructure. Purpose-designed devices provide an exciting range of possibilities to organizations that value innovation as a means of driving growth. This is a huge benefit for industries looking to expand network reach into regions with limited connectivity (such as the healthcare, agricultural, and manufacturing sectors).

6.5. Reliability

Given the security advantages provided by edge computing, it should not come as a surprise that it offers better reliability as well. With IoT edge computing devices and edge data centers positioned closer to end users, there is less chance of a network problem in a distant location affecting local customers. Even in the event of a nearby data center outage, IoT edge computing devices will continue to operate effectively on their own because they handle vital processing functions natively[10].

By processing data closer to the source and prioritizing traffic, edge computing reduces the amount of data flowing to and from the primary network, leading to lower latency and faster overall speed. Physical distance is critical to performance as well. By locating edge systems in data centers geographically closer to end users and distributing processing accordingly, companies can greatly reduce the distance data must travel before services can be delivered. These edge networks ensure a faster, seamless experience for their customers, who expect to have access to their content and applications on demand anywhere at any time.


With so many edge computing devices and edge data centers connected to the network, it becomes much more difficult for any single failure to shut down service entirely. Data can be rerouted through multiple pathways to ensure users retain access to the products and information they need. Effectively incorporating IoT edge computing devices and edge data centers into a comprehensive edge architecture can therefore provide unparalleled reliability.


CHAPTER 7

7. Cloud computing

Cloud computing refers to the use of various services, such as software development platforms, storage, servers, and other software, through internet connectivity. Cloud computing vendors share three common characteristics, listed below [11]:

- Services are scalable.
- A user must pay for the services used, which can include memory, processing time, and bandwidth.
- Cloud vendors manage the backend of the application.
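The pay-for-what-you-use characteristic can be sketched as a simple metered bill over the three resources named above. This is purely illustrative: the function name and all rates are invented for the example and do not correspond to any real provider's pricing.

```python
def metered_bill(gb_hours_memory: float, cpu_hours: float,
                 gb_bandwidth: float,
                 rate_memory: float = 0.005,   # hypothetical $/GB-hour
                 rate_cpu: float = 0.04,       # hypothetical $/CPU-hour
                 rate_bw: float = 0.09) -> float:
    """Toy pay-as-you-go bill: charge only for memory, processing
    time, and bandwidth actually consumed (rates are invented)."""
    return (gb_hours_memory * rate_memory
            + cpu_hours * rate_cpu
            + gb_bandwidth * rate_bw)

# A month of light usage under the hypothetical rates above.
print(round(metered_bill(1000, 200, 50), 2))
```

The point is the shape of the model, not the numbers: no up-front capital expense, and the bill scales to zero when usage does.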

7.1. Service Models of Cloud Computing

Cloud computing services can be deployed in terms of business models, which can differ depending on specific requirements. Some of the conventional service models employed are described in brief below.

- Platform as a Service or PaaS: PaaS allows consumers to purchase access to platforms, allowing them to deploy their own software and applications on the cloud. The consumer does not manage the operating systems or the network access, which can create some constraints on the nature of the applications that can be deployed. Amazon Web Services, Rackspace, and Azure are examples.
- Software as a Service or SaaS: In SaaS, consumers purchase the ability to access or use an application or service that is hosted in the cloud.
- Infrastructure as a Service or IaaS: Here, consumers can control and manage the operating systems, applications, network connectivity, and storage, without controlling the cloud themselves.


7.2. Deployment Models of Cloud Computing

Just like the service models, cloud computing deployment models also depend on requirements. There are four main deployment models, each of which has its characteristics[11].

- Community Cloud: Community Cloud infrastructures allow a cloud to be shared among several organizations with shared interests and similar requirements. This limits capital expenditure, as costs are shared among the many organizations using the cloud. These operations may be conducted with a third party on the premises or 100% in-house.
- Private Cloud: Private Clouds are deployed, maintained, and operated solely for specific organizations.
- Public Cloud: Public Clouds can be used by the public on a commercial basis but are owned by a cloud service provider. A consumer can thus develop and deploy a service without the substantial financial resources required by other deployment options.
- Hybrid Cloud: This type of cloud infrastructure consists of several different types of clouds that have the capability to allow data and applications to move from one cloud to another. Hybrid Clouds can be a combination of private and public clouds as well.

7.3. Benefits of Using Cloud Computing

Despite the many challenges faced by Cloud Computing, there are many benefits of the cloud as well[11].


Scalability

Cloud Computing allows companies to start with a small cloud deployment and expand rapidly and efficiently. Scaling back can also be done quickly if the situation demands it. It also allows companies to add extra resources when needed, enabling them to satisfy growing customer demands.

Reliability

Services using multiple redundant sites support business continuity and disaster recovery.

Maintenance

The Cloud service providers themselves conduct system maintenance.

Mobile Accessibility

Cloud computing also supports Mobile accessibility to a higher degree.

Cost Saving

By using Cloud computing, companies can significantly reduce both their capital and operational expenditures when it comes to expanding their computing capabilities.


CHAPTER 8

8. Edge Computing vs Cloud Computing

The biggest advantage of edge computing over cloud computing is the reduction in the work a client must do to reach the server and transfer data. Edge computing distributes data processing across different locations, so data is delivered to the nearest node and processed at the edge.

Another key difference between edge and cloud is that the cloud offers a centrally managed platform: the whole system, whether for data retrieval or any other process, is initiated through a Centrally Managed System (CMS). In edge computing, only the initial processing is done by the edge network, while the rest is carried out by the CMS, which makes that initial processing faster and more efficient.
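The "nearest node" idea can be sketched as simply picking the edge node with the lowest measured round-trip time from the client. The node names and RTT values below are hypothetical; a real system would measure these continuously and also weigh load and capacity.

```python
def nearest_node(rtt_ms: dict) -> str:
    """Pick the edge node with the lowest measured round-trip time.
    `rtt_ms` maps node name -> measured RTT in milliseconds."""
    return min(rtt_ms, key=rtt_ms.get)

# Hypothetical RTT measurements from one client, in milliseconds.
measured = {"edge-la": 7.2, "edge-sf": 3.1, "cloud-us-east": 68.0}
print(nearest_node(measured))  # the nearby edge node wins over the distant cloud
```

Even this toy version captures the contrast with a purely centralized design, where every request would pay the 68 ms path regardless of where the client sits.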

When it comes to data retrieval in Cloud Computing, files or applications are accessed directly from the server. In edge computing, processing on the "edge" network is carried out by Internet of Things (IoT) devices.

Edge Computing is used for real-time monitoring and analysis, whereas Cloud Computing is essentially used for back-end data access, which might not be efficient enough to provide real-time monitoring and analysis.

Edge Computing is regarded as ideal for operations with extreme latency concerns, whereas Cloud Computing is more suitable for projects and organizations that deal with massive volumes of data.

Edge Computing has a more robust security plan, including advanced authentication methods and proactive handling of attacks, compared to cloud computing.


CHAPTER 9

9. Requirements for Edge Computing

Generic design considerations to take into account when building an edge deployment to support IoT services that need low latency, local data processing, and local data storage include [12]:

- Use of operating systems that support real-time applications and management techniques, such as support for containerization.
- The ability to run image processing through the use of GPU acceleration.
- Compute and storage resource optimization.
- Communications frameworks to allow edge systems to co-operate on control and data processing.
- Clear security frameworks and access controls.

Beyond the technical, there is a range of IoT requirements which may be best achieved by the use of an edge deployment. Localized decision making naturally benefits some IoT requirements, which can be grouped into three broad categories:

- Process management and assistance.
- Equipment and environmental monitoring.
- Real-time data analytics and AI.


Figure 5 Benefits of Low Latency

In a manufacturing environment, for example, an application may traverse these different areas by tracking components through a production process to the final product, in order to understand whether the production line is at full capacity or could handle more whilst maintaining product quality. In a smart city, an application may be able to predict air quality based on transport usage patterns, or divert traffic away from congested areas. Within each of these categories there are specific requirements that operators need to address to ensure that developers and customers are able to benefit from these complex applications at the edge.

Process Management and Assistance: Process management at the edge is a new way of handling traditional business process management whilst introducing new possibilities created by dedicated edge process management. In order to take full advantage of this capability at the edge, new and existing processes need to be designed to be adaptable and automated, so that they intelligently use available edge resources and maximize operational efficiencies whilst taking advantage of unique edge attributes such as low network latency. There are a few major requirements that need to be considered when deploying process management at the edge. Whilst maintaining data security, transparency of process needs to be established so that activities hosted at the edge are visible to members of the ecosystem, enabling them to best design and control related services. Dynamic process operation at the edge is required to ensure that operations remain agile in nature. Processes will need to be adapted and improved as the environment and data sources change around them. Modelling of processes at the edge will also require a different approach, as to benefit from low latency at the edge, autonomy and resilience need to be baked in from the outset. What this means in reality is that a great deal of choreography is required between different processes and services to ensure that the service entity as a whole is not undermined. In turn, this means that testing and QA become important attributes in edge delivery. When other ecosystem players are factored in, process design becomes a critical strength. Data sources, storage, and device ontology will need to be clear to all players to ensure that wider systems and applications can be seamlessly integrated down to a granular level [12].

Equipment and Environmental Monitoring: The key to the success of any edge-based service is the quality and timeliness of the data available to it. Edge computing is able to radically transform the quality and quantity of data that can be collected from local devices, even from older machinery with proprietary protocols. Today's new generation of machines are more likely than not to be connected to the internet for data collection and control. The data that has been available for processing has typically been siloed, and wider insights have been harder to achieve. Additionally, data has been limited in scope and quantity, as the cost of collection has been high on many networks, combined with the fact that only limited bandwidth has traditionally been available for this type of communication. Edge computing turns this paradigm on its head by collecting, storing, and processing data closer to its source. There are several key requirements for equipment and environment monitoring that for the first time can be realized by edge computing [12].

- Low latency: Sensor data can be collected in near real-time by an edge server. The closer to the edge the server is located, the lower the latency. For services such as drone control or image recognition, edge servers can be located in very close proximity to the device, meeting customers' low-latency requirements whilst offering the same level of control as centralized services.
- Network independence: IoT services do not care about the delivery mechanism of data. Customers just need the data to be made available by the most effective means possible. In many cases this will be mobile networks, but in some scenarios, such as smart buildings or homes, Wi-Fi or local mesh networking may be the most effective mechanism for collecting data to ensure latency and other collection requirements can be met.

- Security: Data security and data privacy are requirements that must be met with the same rigor at the edge as they are at the core. However, data security at the edge has different challenges to the core, not least that data is spread across many more locations at the same time, and customers need control of security at each of these points. Additionally, physical security requirements are more prevalent at the edge, as servers may be located outside of highly secure operator data centers. Both customers and cloud providers will need security assurances backed up by tools such as constant monitoring of edge nodes, reliable access logs, and appropriate authentication. A further challenge is the integration of equipment running proprietary or legacy protocols with other data sets from modern equipment. A secure edge can offer a point of protocol translation, allowing data from multiple sources to be combined and analyzed at the edge.
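The collect-store-process-close-to-source pattern described above can be sketched as an edge node that keeps raw readings local and forwards only a compact summary to the core, saving bandwidth while preserving the signal. The function name and the sample values are illustrative, not drawn from any real deployment.

```python
from statistics import mean

def summarize_batch(readings: list[float]) -> dict:
    """Process a batch of raw sensor readings locally at the edge and
    return only the compact summary that needs to travel to the core."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": round(mean(readings), 2),
    }

# Hypothetical temperature samples collected on the device.
raw = [21.0, 21.4, 22.1, 35.9, 21.2]
summary = summarize_batch(raw)
print(summary)
```

Five raw readings shrink to four numbers; at scale, this is the difference between shipping every sample over a constrained uplink and shipping an occasional digest.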

- Real-time analytics and AI: Both real-time analytics and AI rely on access to large volumes of data and large amounts of processing power to deliver rich, real-time insights and decisions, which is only possible at the edge. Developers, cloud providers, and end customers all rely on the availability of these resources to ensure that service automation and application performance targets can be met. Requirements for analytics and data at the edge can be quite specific, as low-latency requirements mean that decisions must be made rapidly. This in turn means that pertinent data needs to be readily accessible. In order to facilitate decision making in the required timescales, local operating environments at the edge need to be optimized, and flexibility in architecture is required. Edge servers may need to be located in close proximity to devices and sensors, or hardware acceleration used to identify data of interest. Some data, particularly from cameras, needs high bandwidth on demand to ensure image processing can take place with the required low latency. Customers require not just proven AI techniques, frameworks, and mechanisms, but algorithms that are improved and refined over time to get more accurate outputs. This means that algorithms and other tools must be deployed within a framework that allows intelligent collaboration across edge devices and nodes where required, so that they can be 'trained' [12]. Parameters can then be shared across an ecosystem and applications enhanced to enable learning activities. Where analytics and AI are used, both are data- and processing-'hungry': edge devices supporting them will need substantial RAM, storage, and in many cases acceleration devices for machine learning, such as GPUs.
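A minimal sketch of the kind of fast, local decision this describes: flag a reading as anomalous when it deviates from the recent window by more than k standard deviations. The class name, window size, and threshold are our own illustrative choices; production systems would use far more sophisticated models, but the shape -- state kept on the node, decision made without a round trip to the cloud -- is the same.

```python
from collections import deque
from statistics import mean, pstdev

class EdgeAnomalyDetector:
    """Keep a short window of recent readings on the edge node and flag
    values that drift more than `k` standard deviations from the mean."""
    def __init__(self, window: int = 20, k: float = 3.0):
        self.values = deque(maxlen=window)
        self.k = k

    def observe(self, x: float) -> bool:
        """Return True if `x` looks anomalous relative to the window."""
        anomalous = False
        if len(self.values) >= 5:                 # need a few samples first
            mu, sigma = mean(self.values), pstdev(self.values)
            anomalous = sigma > 0 and abs(x - mu) > self.k * sigma
        self.values.append(x)
        return anomalous

det = EdgeAnomalyDetector()
stream = [20.1, 20.3, 19.9, 20.0, 20.2, 20.1, 45.0]   # last value is a spike
flags = [det.observe(v) for v in stream]
print(flags)
```

Running this on the sample stream flags only the final spike, and the decision is made entirely from local state, exactly the property low-latency edge analytics depends on.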

9.1. User Requirements

There are three groups of users to which these requirements need to be adapted. Although each will have slightly different requirements, they still fall into broadly similar groupings, which gives the operator opportunities to develop new edge propositions for each group.

Cloud Providers: Cloud providers will find edge/cloud orchestration for IoT challenging, as they will need to engage in a more distributed ecosystem that supports distributed data and distributed processing for the large volumes of IoT data generated. The coordination required to extend cloud processing across different sites, and ensuring that data at the edge can be accessed effectively, will be key to ensuring that the cloud operator is able to offer a seamless experience to their own customers.

Decentralized Data Storage and Access: Deployment of applications and services at the edge requires a certain amount of decentralization of data and data processing. This presents challenges to the cloud provider, who must make it easier for their customers to enable full or partial distribution of data and processing to the edge. This will require new frameworks to allow for detailed mapping of data sources and locations, ensuring that centralized queries are able to locate the correct resources at the edge. Access to data held at the edge is also important, both in order to catalogue and archive it, and to provide better context for complex queries where place and time are important factors in determining the correct actions.

Coordination between the Edge and Core: With real-time decision making occurring at the edge, it is likely that only archive data need be stored in the cloud, and so any queries that require data from a contextual time close to an event will require access to data at the edge. This mix of archive and current data to present the correct context for cloud queries will require a level of coordination between the cloud and edge that may not be present today.

Consistency and Flexibility: Large centralized systems are not good at the type of dynamic changes needed at the edge to support AI, automation, and real-time decision making. Therefore, cloud providers need to work closely with edge infrastructure providers and operators to obtain consistent service levels, in order to provide a dependable interface between edge and cloud [12]. The challenge is to ensure that the flexibility required by applications and developers is matched to the consistency required by cloud providers. Application providers and IoT devices need flexibility in their deployments, and edge providers will need flexible approaches to how data is accessed, shared, and stored to meet the requirements of different use cases.

Cloud Agents: Most cloud providers today make use of edge agents to ensure that edge resources are allocated appropriately and that the core and edge remain coordinated. In order to make use of any edge infrastructure, cloud providers will need support for these agents to be offered natively at the edge. This means that the edge operating environment needs to be designed with support for these agents in mind.


Customers: In reality, end-customers may have little say over the shape of the edge unless they have very specific requirements that dictate dedicated and bespoke edge resources to support their operations. Most applications will utilize shared infrastructure that is available across a range of use cases. There are several core requirements that need to be met regardless of the class of infrastructure deployed at the edge [12].

Data Quality:- Data stored at the edge is subject to the same needs as data stored elsewhere – it must be secure and relevant to the customer applications operating at the edge. Data quality at the edge is a key requirement to ensure that customer operations can run in demanding environments. To maintain data quality at the edge, applications must ensure that data is authenticated, replicated as required, and assigned to the correct classes and types of data category.
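To make the idea concrete, the following is a minimal sketch of edge-side data classification and validation. The category names and the `Reading` record are hypothetical; a real deployment would add cryptographic authentication on top of this structural check.

```python
from dataclasses import dataclass

# Hypothetical data categories an edge application might assign.
CATEGORIES = {"telemetry", "alert", "audit"}

@dataclass
class Reading:
    device_id: str
    category: str
    value: float

def validate(reading: Reading) -> bool:
    """Accept a reading only if it names a device and falls
    into a known data category; reject everything else."""
    return bool(reading.device_id) and reading.category in CATEGORIES

readings = [
    Reading("sensor-1", "telemetry", 21.5),
    Reading("", "telemetry", 19.0),        # missing device id -> rejected
    Reading("sensor-2", "unknown", 3.3),   # unknown category  -> rejected
]
accepted = [r for r in readings if validate(r)]
```

Only correctly labelled, attributable readings survive the filter, which is the essence of the data-quality requirement above.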

Data Security:- End-customers require data at the edge to be kept to the same security requirements as when it is stored and used elsewhere. This presents challenges due to the larger attack vector and scope at the edge. Security policies need to be adapted to take this into account, but otherwise data authentication and user access are just as important at the edge as on the device or at the core. Additionally, the physical security of edge infrastructure needs to be considered, as it is likely to be held in less secure environments than equipment hosted in dedicated data centers.

Low Latency:- Low latency is one of the key benefits of edge deployments, and customers will require applications to benefit from it. Any edge deployment needs to ensure that these benefits are not lost through poor development practice or inadequate processing resources at the edge. Maintaining data quality and security at the edge whilst enabling low latency is a challenge that operators need to address whilst building their edge architectures.

Data Access:- Ownership of data at the edge is always retained by the customer or data generator. They therefore need a level of control over the data to ensure that the customer can access and handle it as required. This presents challenges when data is distributed by the very nature of the edge, so edge services need to allow scope for customers to understand how their data is held and how they can allow third parties to access it. Customers also want key edge performance statistics available to them: reporting on criteria such as system availability, performance trends, and resource utilization will be needed to highlight performance issues and consistency of service. A range of edge enablement tools is also likely to be required so that the customer can administer, for example, the network configuration or audit the system for up-to-date patches. Customers will also likely have existing tools and reporting platforms, and integration with these will need to be considered.

Developers / DevOps:- Developers and DevOps teams have their own set of requirements for the edge.

Development Support:- Developing edge applications will likely necessitate a fundamental shift in development mindset towards tooling and programming languages designed to span multiple systems, rather than a traditional programming model concerned solely with creating logic that runs as a single process on a single machine and distributes work via a shared-memory model. New programming paradigms like Unison, efforts to extend the Erlang process model to planetary scale like Lasp, and the EU LightKone project are current examples of attempts to shift the edge compute programming model in this direction.

Mapping:- Developing an application for the edge can be unwieldy compared to developing for centralized systems. Developers require access to a 'map' of the edge which allows them to understand key edge parameters such as edge server locations (including the location of data storage and data processing capabilities), device types and device IDs, network bandwidth availability, and message pathways[12]. Holding all of this information allows the developer to create applications that use appropriate edge resources and produce outputs not constrained by the environment that processes them.

API Access:- A large number of edge APIs are available to developers, depending partly on the reference architecture deployed at the edge and the platforms being used to support edge functionality. Support for these standard APIs is beneficial, as it speeds up development and reduces the prospect of errors. However, customer requirements may necessitate bespoke APIs to manage data and devices at the edge, particularly where complex, data-intensive tasks are being deployed, such as in industrial IoT or medical use cases.

Network Access – MEC:- Network connectivity is essential at the edge, but unlike traditional centralized systems, network access can come from a variety of different bearers, each with different characteristics and bandwidth availability on both uplink and downlink. Developers need access to this information along with device support for each type of connection, as bandwidth, and thus message delivery times and formats, will change accordingly. Making this information available means that applications will be able to correlate different service classes into a single experience.


CHAPTER 10

10. Edge Computing Technologies

There are plenty of reasons why organizations might begin to employ edge computing technologies in their network infrastructure, from real-time AI data analysis and enhanced application performance to much lower operating costs and reduced downtime.

We will look at a few edge computing technologies used in the IoT, such as multi-access or mobile edge computing (MEC), fog computing, and cloudlets. We will see what they are and how they work in order to better understand their role in enabling next-generation network infrastructures.

10.1. Multi-Access Edge Computing

MEC enables IT and cloud computing capabilities at the RAN edge, in close proximity to end users, offering an ultra-low latency environment with high bandwidth and real-time access to radio and network analytics. MEC provides the potential for developing a wide range of new applications and services, bringing innovation and driving new business. In particular, contextual information, specific content, and proximity and location awareness can create business value opportunities, offering a customized mobile broadband experience. Service providers may also benefit from MEC by collecting more information regarding customers in terms of content, location, and interests, in order to differentiate their portfolio, introduce new services, or simply use such information for commercial reasons.

MEC offers an open radio network edge platform, facilitating multi-service and multi-tenancy by allowing authorized third parties to make use of its storage and processing capabilities, introducing new businesses on demand and in a flexible manner. MNOs can then provide cloud services and additionally monetize the broadband experience by providing third parties with insight into RAN and network conditions to facilitate service enhancements. MEC can also benefit from the ubiquitous coverage of cellular networks to become the key enabler for supporting M2M and IoT services, which have become sufficiently mature to shape vertical segments and services including energy utilities, automotive, and smart city services[13].

As network demands increase significantly with the development of more IoT- and 5G-enabled technologies and devices, multi-access edge computing allows operators to deal with this excess traffic and resource demand more intelligently. It also helps lay the foundations for future intelligent, next-generation networks.

Multi-access or mobile edge computing can also provide enhanced support for location, augmented reality, and Internet of Things services, giving those industries both a head start and time to adapt to new technologies before 5G networks roll out.

For example, Lanner's Edge Server HTCA-6200 is a Hybrid TCA platform with a high-throughput network appliance for mission-critical edge computing applications.

10.2. Fog Computing

Fog computing, alternatively known as fog networking or "fogging", is a term originally introduced by Cisco to describe a decentralized computing infrastructure: a cloud computing architecture moved away from centralized cloud datacenters, in which a large number of geographically widespread edge nodes form part of a distributed, collaborating cloud.


The OpenFog Consortium defines fog computing as a system-level horizontal architecture that distributes resources and services of computing, storage, control, and networking anywhere along the continuum from the cloud to things. Fog computing provides tools for distributing, orchestrating, managing, and securing resources and services across networks and between devices that reside at the edge. The goal of fog computing is both to extend cloud computing and services to the edge of a network and to reduce the data transported to the cloud for processing, analysis, and/or storage. Data captured from IoT sensors and other devices is usually sent to the cloud to be analyzed and processed; however, these devices can often be too far away geographically for responses to arrive in a useful amount of time. Fog computing enables short-term analysis and processing at the edge of a network so as to reduce the amount of data being sent back to the cloud[14].
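The data-reduction idea can be sketched in a few lines: a fog node keeps routine telemetry local and forwards only out-of-range readings to the cloud. The thresholds and readings below are illustrative, not drawn from any particular deployment.

```python
# Illustrative operating band for a sensor; readings inside it are
# considered routine and are handled locally at the fog node.
LOW, HIGH = 10.0, 30.0

def filter_for_cloud(readings):
    """Return only the readings the cloud actually needs to see."""
    return [r for r in readings if r < LOW or r > HIGH]

local_batch = [21.0, 22.5, 35.2, 9.1, 24.0]
to_cloud = filter_for_cloud(local_batch)   # anomalies only
```

Here two of five readings travel upstream, a 60% reduction in backhaul traffic for this batch while still surfacing every anomaly.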

From a functional point of view, a fog node has several functions, including networking, computing, accelerating, storing, and control. Fog nodes can communicate with each other through wired or wireless transmission, and they have some general computing capabilities. In particular, fog nodes engaged in enhanced analytics need accelerator modules such as graphics processing units, field programmable gate arrays, and digital signal processors to provide supplementary computational throughput. Many types of storage are required in fog nodes to meet the reliability and data integrity demands of the system and scenario. Generally, there is a rich set of sensors and actuators at the edge of the network in an application scenario; these sensors and actuators are connected to the fog node via a multitude of interfaces, such as PCIe, USB, and


Ethernet. Fog nodes can operate in a mesh to provide load balancing, resilience, fault tolerance, data sharing, and minimization of cloud communication.

An open architecture based on fog computing enables interoperability in IoT, 5G, AI, tactile internet, virtual reality, and other complex data- and network-intensive applications. IoT applications generate unprecedented amounts of data that can be useful in many ways.

A fog computing-based face identification and resolution framework has been explored to address some security and privacy issues. In addition, because the fog is localized, new services that require mobile networks supporting high data rates and low latency, such as virtual reality, become possible.

Fog computing provides business value for applications that require real-time decision making, low latency, and improved security, and that are network constrained.

10.3. Cloudlets

Computer scientist Mahadev Satyanarayanan introduced the concept of the cloudlet as the middle layer in a three-tier architecture consisting of an end device, an edge cloud platform (the cloudlet), and a centralized datacenter. The objective of the cloudlet is to extend remote datacenter cloud services into close proximity to end users. A cloudlet is viewed as a trusted, resource-rich node with stable Internet connectivity, offering computing as well as access and storage resources to nearby mobile devices. It was proposed as an edge cloud node that can reside in community places, e.g., coffee shops and shopping malls, and highly populated areas, e.g., train stations and exhibition halls. Mobile devices merely act as thin clients, i.e., lightweight devices that depend heavily on remote access to a cloud server for offloading resource-intensive tasks in order to increase execution speed and save energy. Cloudlets are instantiated based on a soft-state implementation that relies on Virtual Machines (VMs). A cloudlet acts as a micro datacenter in a box, offering end users access over Wi-Fi for deploying and managing their own VMs. Using a VM-based approach, cloudlets offer two distinct mechanisms. The first is based on VM migration, which suspends and transfers the existing processor, disk, and memory state of the end device to the destination cloudlet, where execution resumes from that exact state[13]. The second approach delivers a small VM overlay image instead of the full state: the cloudlet stands ready with a VM base (OS), the overlay image is integrated on top, and execution continues seamlessly from that point.

There are some key differences between cloudlets and cloud computing:-

Compared to cloud computing, a cloudlet needs to be much more agile in its provisioning, because its association with mobile devices is highly dynamic, with considerable churn due to user mobility.

To support user mobility, a VM handoff technology is needed to seamlessly migrate offloaded services from the first cloudlet to a second cloudlet as a user moves away from the currently associated cloudlet[14].

Because cloudlets are small datacenters distributed geographically, a mobile device first has to discover, select, and associate with the appropriate cloudlet among multiple candidates before it starts provisioning.
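The discover-select-associate step can be sketched as picking the candidate with the lowest measured round-trip latency. The cloudlet names and latency figures below are placeholders for what real discovery probes would return.

```python
# Hypothetical discovery result: cloudlet name -> probed RTT in ms.
candidates = {
    "cloudlet-cafe": 12.0,
    "cloudlet-station": 35.0,
    "cloudlet-mall": 8.5,
}

def select_cloudlet(latencies):
    """Associate with the candidate advertising the lowest latency."""
    return min(latencies, key=latencies.get)

chosen = select_cloudlet(candidates)
```

A production selector would also weigh load, trust, and available resources, but latency-first selection captures the proximity rationale behind cloudlets.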


CHAPTER 11

11. Edge Computing Use Cases

11.1. Device Management

Many different types of device attributes and configurations can be controlled at the edge. This is called device management, and it extends the functionality of IoT platforms. Below are four attributes that device management at the edge will need to support[15]:-

Edge node or gateway management – Device management platforms can be used to manage the operator's edge infrastructure as well as the IoT devices.

Distributed updates – The edge gateway distributes firmware updates locally, with distribution managed by the edge node rather than the queuing system typically used when a firmware update is distributed centrally.

Device configuration updates – Devices at the edge will need to be configured locally as services change; the edge should be able to manage this remotely.

Diagnostics of connected devices – Troubleshooting and analytics can be done at the edge to identify specific problems with devices in the field through machine learning or pattern recognition.
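The distributed-updates idea can be sketched as a gateway fanning a new firmware version out to only those local devices that are behind it. The device names and tuple-based version scheme are illustrative.

```python
# Sketch: an edge gateway pushes a firmware version to its local fleet
# instead of each device pulling from a central server.
def distribute_update(devices, new_version):
    """Update out-of-date devices in place; return the names pushed to."""
    updated = []
    for name, version in devices.items():
        if version < new_version:        # simple tuple comparison, e.g. (1, 2) < (1, 4)
            devices[name] = new_version
            updated.append(name)
    return updated

fleet = {"cam-1": (1, 2), "cam-2": (1, 4), "sensor-7": (1, 1)}
pushed = distribute_update(fleet, (1, 4))
```

Only the two stale devices receive traffic; the device already on (1, 4) is skipped, which is exactly the saving over a naive central broadcast.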

Device management at the edge brings several benefits:-

Better control at the edge – Edge computing allows device management to happen locally at the edge, which means IoT devices are better controlled, resulting in increased efficiency and improved quality of service. Because device management is executed at the edge, problems and glitches can be identified, and additional resources can be brought in to relieve problematic areas and better manage service scalability. Additionally, IoT services such as data access by third parties can be managed effectively if control is available at the edge node.

Better management of device performance – Enforced consistency of device configurations and performance criteria at the edge means there are fewer variables to contend with, which in turn means that applications can be confidently optimized to obtain the best performance. Additionally, device performance data can be collected effectively at the edge for further analysis.

Effective deployment of applications at the edge – A device management platform at the edge can help service providers distribute their services to appropriate edge locations. For example, if an IoT deployment has a number of different edges to manage, an application can be deployed to the edge closest to the device to obtain the lowest latency, for instance when a customer is paying for a premium service.

Integrated hardware and software ecosystem – Device management that covers all aspects of the edge, including devices, applications, and connectivity, means that a single ecosystem can be created in which control can be exerted across all elements of an edge deployment. The service can then be optimized for different uses, enhancing quality of service as a result.

11.2. Security

The edge has an important role to play in data security. Many advanced tools and techniques can be applied to ensure that the edge contributes to the security of the overall IoT deployment. With a vast array of different equipment and devices connected to the IoT, security services at the edge can be used to comprehensively secure, or even isolate, complex industrial environments such as smart factories and buildings, as well as to ensure that data privacy is maintained in applications handling personal data, such as CCTV or automated license plate readers. Security at the IoT edge should be treated the same as any other secure environment, but new tools can be used to ensure security at the edge and device levels. For example, strong identity management for devices at the edge makes authentication more straightforward, as does robust definition of edge processes to ensure that they remain secure[15].

A number of security issues can be addressed at the edge in an IoT environment:-

Firmware and other updates:- Secure delivery of firmware and other device updates from the edge, using public key certification or secure transmission such as SSL/TLS, ensures that firmware upgrades are carried out securely.

Data authentication:- Authentication of data and updates at the edge is important for maintaining secure environments. Authentication is likely to be via a certificate-based system; its implementation will need careful consideration to avoid degrading edge processing performance and latency.

Access control:- Identity and permissions management at the edge is important to ensure that access to data at the edge is managed securely. Granting data access to third parties means that full access control policies must be in place.

Prevention of Denial of Service attacks:- Analysis of the data flow from IoT devices can spot the characteristics of DDoS attacks and prevent them.
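As a simplified stand-in for the certificate-based authentication described above, the sketch below shows an edge node verifying device payloads with an HMAC tag. The shared key and message are placeholders; a shared-key MAC merely keeps the example self-contained.

```python
import hmac
import hashlib

# Placeholder key: a real system would use certificates or a proper
# key-provisioning scheme, not a hard-coded shared secret.
SHARED_KEY = b"demo-key"

def sign(payload: bytes) -> str:
    """Compute an authentication tag over a device payload."""
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, tag: str) -> bool:
    """Constant-time check that the payload matches its tag."""
    return hmac.compare_digest(sign(payload), tag)

msg = b'{"device": "meter-9", "reading": 42}'
tag = sign(msg)
```

Any tampering with the payload invalidates the tag, so the edge node can drop forged or corrupted messages before they reach applications or the cloud.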

Edge computing also brings several security benefits:-

The distributed nature of edge computing for IoT means that malicious attacks aimed at the network are harder to instigate, as attacking single nodes will have only limited impact.

The IoT, by its very nature, is a distributed, complex network of devices. Edge computing pushes much of the logic and data storage needed for effective operation of the IoT closer to the end devices, and having security services distributed at the edge as well offers the opportunity to improve security capabilities and to provide native security for new low-latency applications.

The IoT edge offers a new way of securing IoT endpoints that may not be running the most up-to-date firmware or operating system. Security services at the edge can ensure that devices with a high-risk profile are more easily isolated or have their data actively intercepted and secured.

Data and device provenance – as processing and data storage move closer to the edge, the origination point of data is better understood and can be recorded with greater confidence.

Processing of authentication, identity, and access management – although sometimes constrained, additional processing power at the edge can be used for robust security processes as well as customer applications, meaning that security processes can be applied to ever-increasing volumes of data.

11.3. Priority Messaging

Different kinds of data are generated within an IoT network, and priority messaging needs to be considered as part of the overall design of IoT products, networks, and data processing services. Messages carry different priorities: some are low priority, such as routine status updates or minor network errors, while critical data needs to be delivered to end users without latency problems or glitches. High-priority messaging and subsequent actions are enabled by the IoT edge: by communicating only with local IoT gateways and cloudlets, and keeping the impact of priority messaging local, faster responses can be assured[15].

Further examples of priority messaging include:

Transportation – an accident alert that needs to be sent to following vehicles to enable them to avoid a collision.

Health & Safety – a fire alarm linked to building evacuation.

Environmental – rainfall or pollution above maximum safe levels linked to remedial activities.

Security – unauthorized activity leading to automated security actions, e.g. doors closing, terrorism response in the immediate vicinity, or drones flying into no-fly areas.

Industrial – failure of a critical component requiring immediate shutdown of other systems, or a construction worker in an unsafe location.

The edge enables high-priority data to be generated, sent, processed, and actioned more quickly than sending the data to the cloud.
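An edge node's dispatch loop for such messages can be sketched with a priority queue: urgent alerts are drained before routine updates. The messages and the lower-number-is-more-urgent scheme are illustrative.

```python
import heapq

# Sketch: an edge node queues incoming messages with a priority value
# (lower number = more urgent) and drains the queue urgent-first.
queue = []
heapq.heappush(queue, (2, "status: battery ok"))
heapq.heappush(queue, (0, "alert: collision ahead"))
heapq.heappush(queue, (1, "warning: rainfall above safe level"))

dispatched = [heapq.heappop(queue)[1] for _ in range(len(queue))]
```

The collision alert is actioned first regardless of arrival order, which is the behaviour priority messaging at the edge is meant to guarantee.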

Fast processing at the edge – low latency means that priority messages can be acted upon more quickly at the edge. Having the relevant data storage and applications in a local cloudlet means that messages are received and acted upon quickly, without having to rely on centrally held data or applications.

Routing – processing priority messages at the edge means that their routing through the rest of the network architecture can be optimized, so they reach a specific endpoint in the fastest possible time.


Battery life – data prioritization at the edge means that low-powered IoT devices can save battery life and processing power by leaving the actioning of critical data to the edge node.

11.4. Data Aggregation

As more data is generated across a network, there is a chance of data replication or duplication; for example, there can be multiple readings from a temperature sensor for a given point in time. Here the edge can play a vital role in deciding which data should be sent to the cloud or other centralized systems.

Some examples where data aggregation can help:-

Data from multiple temperature sensors in the same location can be aggregated to produce statistical measures (min, max, mean, etc.).

Traffic data derived from multiple vehicles in the same queue.

Power outage reports from meters sending last-gasp communications.

Positive status reports from widespread connected equipment such as streetlights.
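The temperature-sensor example above can be sketched directly: the edge node collapses a batch of readings into one summary record before anything is sent upstream. The sample values are illustrative.

```python
# Sketch: replace many raw sensor readings with one statistical summary.
def summarize(readings):
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

samples = [21.0, 21.5, 22.0, 21.5]
summary = summarize(samples)   # one record replaces four messages
```

One small dictionary travels to the cloud instead of four messages, and the summary still preserves the statistics downstream analytics need.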

Edge computing brings several benefits to data aggregation:-

Network efficiencies – Data aggregation can create significant efficiencies in the IoT network. Aggregation removes the need to replicate data across multiple systems and to perform the same processing multiple times on different systems. This means there is no need to backhaul masses of replicated data, so resources for data analytics can be used more effectively and data storage needs can be lowered. All of this significantly reduces the load on the core infrastructure.


Latency improvements – with less data to sift through, quicker decisions can be made and appropriate actions taken faster. Reducing the amount of data to be communicated and processed should improve latency.

Richer data sets – Aggregated data provides valuable data sets in which much of the pre-processing has already been completed. This can aid machine learning in making more reliable predictions and allows patterns and trends to be identified more readily.

Aggregated data generation – if multiple messages are received by an edge node over a short period, these messages can be aggregated and the originals deleted. Rather than forwarding every message it receives, the edge generates a new dataset that summarizes the data received. This would contain an overview of all messages received, potentially with periodic updates, until there is a status change.

Sampling of devices – if a large volume of similar messages is received, the edge could elect to monitor only a handful of affected devices, rather than the whole fleet, until their status changes. If cars enter a traffic jam, rather than taking status updates from every vehicle in the jam, data could be taken from only every 10th vehicle until they start to move again.
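The every-10th-vehicle idea reduces to a simple sampling filter. The vehicle identifiers below are fabricated for illustration.

```python
# Sketch: once a jam is detected, keep status updates from only a
# sampled subset of vehicles instead of the whole fleet.
def sample_fleet(vehicle_ids, step=10):
    """Monitor every `step`-th vehicle in the list."""
    return vehicle_ids[::step]

jammed = [f"car-{i}" for i in range(100)]
monitored = sample_fleet(jammed)   # 10 vehicles stand in for 100
```

The edge now tracks a 10x smaller set of devices, resuming full monitoring only when a sampled vehicle reports a status change.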

11.5. Cloud Enablement

From a service point of view, IoT edge environments will rarely operate completely in isolation. A connection to a centralized cloud will often be required for control, monitoring, and update purposes at the very least, so the dynamic between the edge and cloud is a complex one. A hybrid edge and cloud architecture can offer the best of both worlds[15].


Edge computing offers several benefits for cloud enablement:-

Maximized use of resources – by linking the edge and the cloud, the most appropriate resources, both local and central, can be identified and allocated for specific tasks. This means efficient use of resources in the operator's domain, perhaps held at local base stations or on the device itself, and in the cloud provider's domain, where huge data volumes can be stored centrally.

Seamless visibility – integrating the edge and cloud means that users have a view of the status and location of all data relevant to their application or service. A seamless view means that quality of service can be achieved without having to work only in the cloud or only at the edge.

Scalability – enabling distributed resources makes scalability of IoT services achievable at the larger end of the scale. Even if resources at the edge are unable to cope with the scale of operations required by a deployment, there can be a fallback to the cloud, meaning that some or all of the service can be maintained.
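The fallback behaviour can be sketched as edge-first placement with the cloud absorbing overflow. The task names and the fixed edge capacity are illustrative.

```python
# Sketch: place tasks at the edge until capacity is exhausted,
# then fall back to the cloud so the service stays up.
EDGE_CAPACITY = 2   # illustrative: concurrent tasks the edge can host

def place_tasks(tasks):
    """Return a mapping of task -> 'edge' or 'cloud'."""
    placement = {}
    used = 0
    for task in tasks:
        if used < EDGE_CAPACITY:
            placement[task] = "edge"
            used += 1
        else:
            placement[task] = "cloud"   # overflow handled centrally
    return placement

placement = place_tasks(["infer-1", "infer-2", "infer-3"])
```

Low-latency tasks run at the edge while the excess degrades gracefully to the cloud rather than being dropped, which is the hybrid architecture's core promise.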

Security – full integration of edge and cloud means that data security can be overseen from a single source. Utilizing the relevant cloud agents ensures that data can be securely transmitted across cloud, edge, and device.

Enablement of new business models – business models such as Infrastructure as a Service (IaaS), or Service Level Agreements with higher response levels enabled by lower latency, can be introduced with integrated cloud and edge access. Unified billing for processing and storage can be managed across the cloud and edge in processes seamless to the end user. The edge can reside in different locations of the IoT deployment chain, and any cost benefits can be used to create new business models.


11.6. IoT Image and Audio Processing

Devices such as cameras, including CCTV, and microphones can provide data for processing by IoT platforms and applications, such as license plate reading or noise pollution monitoring. The IoT edge introduces new ways of analyzing this data without having to backhaul the entire image or audio stream. An edge cloudlet can process the image, video, or audio data to extract key information, such as license plate numbers or the number of people in an area, so that only a small amount of data, such as the license number itself, is forwarded or stored[15].
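The pattern can be sketched as follows: a cloudlet runs recognition on a frame and forwards only the resulting record, never the frame. `detect_plate` and its output are hypothetical stand-ins for a real ALPR model.

```python
# Sketch: extract a tiny metadata record at the edge instead of
# backhauling the raw frame.
def detect_plate(frame_bytes):
    # Placeholder: a real deployment would run ALPR inference here.
    # The plate value and confidence below are fabricated.
    return {"plate": "7ABC123", "confidence": 0.93}

def process_frame(frame_bytes):
    result = detect_plate(frame_bytes)
    if result["confidence"] > 0.9:
        return result              # a few dozen bytes forwarded upstream
    return None                    # low-confidence frame handled locally

record = process_frame(b"\x00" * 1_000_000)  # ~1 MB frame in, tiny record out
```

A megabyte-scale frame enters the cloudlet; only a small dictionary leaves it, which is exactly the bandwidth saving the section describes.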

Edge computing architecture offers certain benefits for this use case:-

Low cost – cameras and microphones are relatively cheap to procure, install, and maintain for the insights they can provide. Edge processing also keeps network management costs in check, making them an attractive general-purpose alternative to dedicated IoT sensors.

Reduced data transmission – by identifying objects within images without needing to send the images themselves to upstream servers, the amount of data that must be transmitted back to the core is significantly reduced.

Quick decision-making – fast processing makes it possible to support a wider range of real-time and near-real-time applications, speeding up the management of production lines, enabling new ways of charging drivers at tollgates, and so on.

More comprehensive analysis – because camera-based sensor arrays are more flexible, a more comprehensive analysis can be performed, as cameras add general context through both imaging of a location and broad image coverage.

Enabling new use cases – new IoT use cases become possible with cameras and microphones used as sensors, for example image processing to recognize yields, pests, and diseases of crops whilst they grow.




CHAPTER 12

12. Industries using Edge Computing

12.1. Amazon Web Services

High-performance edge applications rely on the cloud for processing, analytics, storage, and machine learning, but also need to do some processing, like ML inference, close to where data is generated in order to deliver intelligent real-time responsiveness and reduce the amount of data transferred.

AWS edge computing services provide infrastructure and software that move data processing and analysis as close to the endpoint as necessary. This includes deploying AWS-managed hardware and software to locations outside AWS data centers, and even onto customer-owned devices themselves[16].

Due to the wide variety of edge services offered by AWS, many large organizations have started using them. Below are some examples of where AWS is used and how it benefits companies:-

Manufacturing:- Manufacturing is a vast market involving large amounts of human effort and machinery. Amazon Web Services plays an important role here because of the following benefits:-

Improve operations:- AWS makes it easy to build and tailor your data lake, allowing you to securely store, categorize, and analyze all your data in one centralized repository.


Innovate faster:- AWS's virtually unlimited High-Performance Compute (HPC) capacity allows you to improve your pace of innovation without the need for large capital investments.

Lower IT/OT costs:- Focus on improving business operations and innovation, not IT and OT infrastructure. AWS's pay-as-you-go microservices and serverless computing models reduce the cost of running your connected plant or smart product programs.

Enhanced security:- Cloud security is the highest priority at AWS. As an AWS customer, you benefit from a data center and network architecture built to meet the requirements of the most security-sensitive organizations.

Telecommunications:- Amazon Web Services (AWS) is helping to power the future of telecommunications. Leading communications service providers (CSPs) are partnering with AWS not only to accelerate their data center consolidation and migration to the cloud, but also to monetize their path to 5G by offering customers next-generation capabilities in mobile edge computing and IoT. AWS helps to:-

Accelerate digital transformation and data center consolidation to optimize performance, lower IT costs, strengthen security posture, and free up investment capital.

Pave the path to 5G and mobile edge computing to bring next-generation capabilities to smart devices and networks.

Improve the customer experience by providing customer care or troubleshooting problems using machine learning and artificial intelligence.


Automate business processes so that overall operations become more efficient and effective.

Healthcare: Healthcare providers and public health organizations from around the world are using AWS to keep up with the ripple effects of COVID-19. This includes providing customers in the most affected regions with technical support and AWS Promotional Credit to help cover the costs of COVID-19 response initiatives. AWS helps to:

Accelerate clinical forecasting, personalized engagement, clinical research, and drug discovery.

Empower researchers, clinicians, and operations teams to optimize their efficiency and make better-informed decisions with data transparency, analytics, and machine learning.

Secure and control patient data, enable data interoperability, and facilitate

collaboration using secure AWS services designed to conform to global industry

standards.

Media and Entertainment: Innovating at the speed customers demand requires a flexible suite of media technologies and solutions to accelerate content production and delivery for the right audience, at the right time, on the right screen. That is where AWS can help. AWS brings highly scalable, elastic, and secure cloud services to content production, storage, processing, and distribution. With machine learning and analytics embedded throughout the media value chain, companies can make smarter content investments, better monetize their content libraries, and delight users with personalized experiences.


Table 1 Companies using Amazon Web Services

Category | Company | Amazon Web Services Used

Manufacturing | Autodesk | Amazon SageMaker helps designers organize and sort through thousands of generative designs, and with AWS Lambda, Autodesk is able to process data in near real time, providing timely analytics for product improvements and other business opportunities.

Manufacturing | Georgia-Pacific | Amazon Kinesis streams real-time data from manufacturing equipment to a central data lake based on Amazon Simple Storage Service (Amazon S3), allowing structured and unstructured data to be ingested and analyzed efficiently at scale. Amazon Athena analyzes the raw data, such as pulping mechanisms, paper machines, and paper quality.

Telecommunications | Comcast | Amazon Virtual Private Cloud (Amazon VPC) and AWS Direct Connect deliver the scalability and security needed for rapid innovation in a hybrid environment. Comcast uses them to deploy features for its flagship video product, XFINITY X1.

Telecommunications | Boingo | Amazon Redshift ingests multiple terabytes of analytical data each hour from different sources, including account and authentication data from hotspot venues.

Healthcare | UC San Diego Health | A gateway puts X-ray images into Amazon Elastic Compute Cloud (Amazon EC2) instances, where a machine learning algorithm analyzes them and applies heat-map overlays before the images travel on to be viewed by clinical personnel.

Healthcare | Philips | Amazon Redshift simplifies, accelerates, and automates data transfers to the AWS Cloud. So far, the company has transferred 37 million records in 90 minutes and can optimize large data sets within two hours.

Media and Entertainment | Netflix | Amazon Kinesis Streams processes multiple terabytes of log data each day, and analytics events show up in seconds, allowing Netflix to respond to issues in real time and ensure high availability and a great customer experience.

Media and Entertainment | Fox Network | Amazon Kinesis and Amazon SageMaker enhance live video streams and enable a real-time data capability.
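The Kinesis ingestion pattern that appears several times in the table above can be sketched as follows. The stream name and reading fields here are hypothetical; the helper simply packages readings into the record shape that Kinesis's put_records call expects, and the commented-out boto3 call shows how a real batch would be sent.

```python
import json

def to_kinesis_records(readings):
    """Package sensor readings as Kinesis put_records entries: each entry
    needs a Data blob and a PartitionKey, which determines how records
    are spread across shards (here, the machine id)."""
    return [
        {"Data": json.dumps(r).encode("utf-8"),
         "PartitionKey": r["machine_id"]}
        for r in readings
    ]

# Hypothetical equipment readings collected at the edge.
readings = [
    {"machine_id": "paper-machine-3", "moisture": 6.2},
    {"machine_id": "pulper-1", "rpm": 410},
]
records = to_kinesis_records(readings)
print(len(records), records[0]["PartitionKey"])

# Actual delivery (not run here) would use boto3:
# boto3.client("kinesis").put_records(StreamName="plant-telemetry",
#                                     Records=records)
```

Partitioning by machine id keeps each machine's events ordered within a shard while still letting the stream scale out across many machines.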

12.2. IBM

IBM Edge Application Manager is an intelligent and flexible platform that provides autonomous management for edge computing. A single administrator can manage the scale, variability, and rate of change of application environments across tens of thousands of endpoints simultaneously. It is a full-lifecycle edge environment that helps you safely create, deploy, run, monitor, maintain, and scale business logic and analytics applications at the edge, which can be accessed virtually from anywhere. These edge endpoints run on Red Hat OpenShift or Kubernetes platforms [17].

The IBM edge platform provides benefits such as:

Better data control and costs: Minimize data transport to central hubs, and reduce vulnerabilities and cost.

Faster insights and actions: Tap into new sources of data created and processed at the edge.

Continuous operations: Run autonomously, even when disconnected, reducing disruption and cost.

Industrial processes are full of data and information. When all these bits and bytes are adequately collected and processed, they can be used to optimize the production process, for example by automatically adjusting the process or by making the right decisions at the right time.
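A minimal sketch of that idea, with illustrative field names and thresholds: instead of shipping every raw sample to a central hub, an edge node aggregates a window of samples locally, decides immediately whether the process needs adjusting, and forwards only a compact summary.

```python
def summarize_window(samples, high_limit):
    """Aggregate one window of raw sensor samples at the edge, keeping
    only what the central hub needs and flagging when the process
    should be adjusted immediately (without a cloud round trip)."""
    summary = {
        "count": len(samples),
        "min": min(samples),
        "max": max(samples),
        "mean": round(sum(samples) / len(samples), 2),
    }
    # Local decision: act at the edge instead of waiting for the cloud.
    summary["adjust"] = summary["max"] > high_limit
    return summary

# Hypothetical temperature window; only the summary leaves the edge node.
window = [71.2, 70.8, 73.5, 79.9, 72.1]
print(summarize_window(window, high_limit=78.0))
```

Shipping one summary instead of every raw sample is also how edge processing delivers the "minimize data transport" benefit listed above.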

Let us now discuss some of the industrial giants that are using the IBM edge platform:

L’Oréal: Like many other companies, L’Oréal faces increasingly demanding customers. Consumers expect cosmetics that are perfectly tailored to their skin type, skin color, and personal preferences, which requires continuous product development and a much faster route to market. Making such a large organization more agile and flexible is a great challenge. To address it, L’Oréal combined sensors, laser measurement, cameras, and advanced conveyor belts in a totally new production line at its Belgian plant in Libramont, and partnered with IBM in order to become more agile. With the new platform, the company collects data from various sensors within its production systems, processes that data, and applies it back to the process. This helps L’Oréal develop new products and services and deliver high-quality products.

Honda: Honda is one of the world’s most innovative companies, as anyone who has witnessed its work in robotics can testify. In the automotive space, Honda aims to become the premier manufacturer of interesting, cleverly designed cars that enable customers to experience the joy of driving. Honda R&D realized that new sources of big data, such as vehicle diagnostics and telematics, biometric sensors, and large bodies of unstructured text such as customer surveys, had great potential value. To harness these big data assets, Honda R&D needed two things: a comprehensive set of big data analytics tools, and a group of engineers with the skills and enthusiasm to use them. For this, Honda partnered with IBM. Kyoka Nakagawa, Chief, TAC, Honda R&D Co., said IBM was the right choice for two reasons. First, IBM offers a very broad range of big data analytics capabilities, including data mining, text analytics, and visualization, so Honda was able to get all the tools it needed from a single vendor. Second, IBM had the skills and experience to guide Honda all the way through its big data journey, from consultation through proof-of-concept to final realization. Honda uses the IBM platform as follows:


IBM SPSS Modeler: a data mining tool that is very good at organizing raw data into usable datasets so that they can be analyzed easily. It is also easy to use for complex analyses. Another valuable feature is the ability to monitor users and see how they are interacting with the tool, so if someone is struggling to manage their data effectively, colleagues can give them extra help.

IBM Watson for content analysis: Honda R&D uses IBM Watson Content Analytics for text mining, giving researchers near-instant insight into vast stores of documents and other textual data.

IBM Predictive Maintenance and Quality: designed to help organizations monitor their assets and processes and predict asset failures or quality issues.

This combination of IBM platforms moves processing closer to the edge and makes the overall work cycle run faster and more efficiently.

Mitsufuji Corporation: Mitsufuji started as a traditional Japanese textile manufacturer, but today it is one of the companies dealing in wearable IoT devices. Mitsufuji transformed its business by using IBM Maximo Worker Insights and IBM Watson to build a solution that captures, analyzes, and tracks users’ heart rate, location, and other critical health data in real time through a range of innovative wearable products. To achieve this, the company had to create an in-house GPS. Using the IBM edge platform not only solved its scalability problem but also provided new functionality and helped accelerate development. In fact, Mitsufuji was able to migrate its entire back-end systems architecture to the IBM Watson IoT Platform and IBM Maximo Worker Insights in less than four months.


12.3. Dell Computers: The edge exists wherever the digital world and the physical world intersect, and data is securely collected, generated, and analyzed to create new value. Dell makes this possible with solutions that deliver consistent infrastructure and operations, intrinsic security, and expert support around the world [18].

Dell has developed an edge platform consisting of servers such as the Dell EMC PowerEdge XE2420 and Dell EMC PowerEdge XE7100, which are built for scalable, data-intensive environments and are ideal for object storage, intelligent video analytics (IVA), and media streaming workloads. This edge platform offers seven different types of storage solutions for fast and easy data storage. Further, in order to control a location remotely, Dell has systems such as VxRail and Microsoft Azure-integrated servers, whose cloud integration helps deliver an uninterrupted workflow.

The Dell edge platform is also helping in construction management. Companies like iNET, which deals in construction management, use the Dell edge platform to monitor real-time data such as work progress, machinery status, locations, and atmospheric conditions using drones, sensors, and other devices. iNET uses Dell EMC servers enabled with hypervisor technologies, which can handle up to 3 petabytes of data in real time.

Doosan leverages Dell OEM Solutions to accelerate the release of its PreVision software while reducing development costs. It chose Dell EMC PowerEdge R640 and R740 servers as the platform: highly reliable hardware that gives PreVision customers worldwide peace of mind. Doosan was also able to accelerate shipment processes, with OEM Solutions ensuring that the PreVision software was installed before the servers left the Dell Technologies factory, saving time and effort when the hardware arrived at Doosan’s offices.

12.4. 5G Network: 5G Edge is a mobile edge computing platform available for businesses. It is designed to enable developers to build applications for mobile end users and wireless edge devices with ultralow latency.

Companies like Verizon have adopted this edge computing platform to implement 5G technology. The edge computing architecture infuses the power of the cloud directly into the Verizon 5G network. By extending infrastructure to where business happens, it allows for a new class of cloud-native applications, creating innovative, value-based opportunities for customers.

In addition to ultralow latency, users are expected to benefit greatly from unprecedented increases in speed, bandwidth, throughput, reliability, agility, scalability, energy efficiency, privacy, and security.

5G Edge helps enable the real-time enterprise by delivering:

A fully integrated network and computing environment.

Ease in performing data analytics locally.

Consistently low latency for workloads and applications (closed-loop control

systems, autonomous machines, robotics, AR/VR, IoT performance tracking, etc.).

Data and application sovereignty to support security and compliance requirements.


CHAPTER 13

13. Challenges

Edge computing undoubtedly offers great potential benefits to businesses, but there are still some challenges that the platform faces [19]:

13.1. Proliferation of devices, platforms and protocols

The IoT world is characterized by heterogeneity: many different devices, protocols, and mixes of new and legacy hardware. Ideally, edge computing should act as a “shield” against this complexity, but there is already a growing proliferation of often-incompatible edge computing platforms and applications on the market that could hinder wider adoption.

13.2. Open vs. Proprietary Systems

The need to adopt open edge standards and systems is critical and will only become more so.

“Open,” at a minimum, means that edge computing platforms must be agnostic to silicon, hardware, operating system, software application, and cloud. We also need open standard APIs that can enable “plug and play” of any software application at the edge.

13.3. Time-Critical Performance

Many of the applications we want to run at the edge, including closed-loop control, specialist AI, and analytics applications, need access to “real-time” data. These applications can impose very challenging performance constraints, e.g., millisecond or even microsecond response times.
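As a rough illustration of what a millisecond response-time budget means in practice, the sketch below times one trivial, hypothetical control-loop step and checks it against a budget. A real edge controller would likely be compiled code on dedicated hardware, but the measurement idea is the same.

```python
import time

def within_budget(task, budget_ms):
    """Run one control-loop iteration and check whether it meets a
    millisecond-scale response-time budget."""
    start = time.perf_counter()
    result = task()
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms, elapsed_ms <= budget_ms

# A trivial stand-in for a closed-loop control step.
def control_step():
    setpoint, measured = 100.0, 97.5
    return 0.8 * (setpoint - measured)   # proportional correction

correction, elapsed_ms, ok = within_budget(control_step, budget_ms=5.0)
print(f"correction={correction:.2f}, took {elapsed_ms:.3f} ms, ok={ok}")
```

Running this check at the edge, next to the machine, is precisely what avoids the network round-trip latency that would make a cloud-hosted control loop miss such a budget.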


13.4. Hardware Constraints

The hardware available to run time-critical edge applications is often highly constrained in terms of memory availability or the need to run at low power. This means edge computing software may need to be highly optimized.
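A small illustration of why constrained memory pushes edge software toward tighter data representations: in Python, storing sensor samples as packed 32-bit floats (via the standard array module) takes far less memory than a list of float objects. The sample values are made up, and exact byte counts vary by platform, but the gap is consistently large.

```python
import sys
from array import array

# Naive representation: one boxed Python float object per sample.
samples_list = [20.0 + i * 0.1 for i in range(1000)]

# Constrained-device representation: packed 32-bit floats, 4 bytes each.
samples_packed = array("f", samples_list)

# Compare the container plus its elements against the packed buffer.
list_bytes = sys.getsizeof(samples_list) + sum(sys.getsizeof(x) for x in samples_list)
packed_bytes = sys.getsizeof(samples_packed)
print(list_bytes, packed_bytes)
```

The same pressure is why production edge software often drops to C or Rust, fixed-point arithmetic, or preallocated buffers rather than general-purpose dynamic structures.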

13.5. Open edge ecosystems

Getting the edge “right” is not just about technology; it is also about the global ecosystems that support it. A one-company “open” API is not really an open API, and the scale of the problem and the diversity at the edge require collaboration: an ecosystem.


CHAPTER 14

14. Conclusion

With increasing interest in new use cases and services such as smart manufacturing, augmented and virtual reality, and online gaming, there is a clear need for edge computing. Edge computing is an enabler for use cases requiring security, resilience, and low latency in combination with other technical solutions such as private networks. Moreover, bandwidth can be saved if a larger portion of data is handled at the edge rather than uploaded to the cloud. The burgeoning of IoT and the ubiquity of mobile devices have changed the role of the edge in the computing paradigm from data consumer to data producer and consumer. The edge computing ecosystem is vast and evolving rapidly, and many organizations and companies are involved in specifying the technology and defining solutions. Edge computing covers a vast number of use cases, but no single solution fits them all. Organizations should choose the one that best suits their enterprise strategy and be prepared to build strength by partnering with their respective service providers. Further, as the technology moves forward, the computing infrastructure needs to become more efficient, effective, and robust. Finally, we have put forward industrial use cases of edge computing and challenges that are worth working on, including open vs. proprietary systems, time-critical performance, and hardware constraints. Edge computing is here, and we hope this paper brings it to the attention of the community.


REFERENCES

1. Fresh Consulting. (2020). The Future of Edge Computing. https://www.freshconsulting.com/wp-content/uploads/2019/08/The-Future-of-Edge-Computing.pdf

2. Wikipedia contributors. (2020, November 9). Edge computing. Wikipedia. https://en.wikipedia.org/wiki/Edge_computing

3. Satyanarayanan, M. (2017). The Emergence of Edge Computing. https://www.ics.uci.edu/~cs237/reading/reading2020/Satya_edge2016.pdf

4. Weil, A. (2020, August). 20 Years of Edge Computing. The Akamai Blog. https://blogs.akamai.com/2020/08/20-years-of-edge-computing.html

5. Satyanarayanan, M., Bahl, P., Caceres, R., & Davies, N. (2009). The Case for VM-Based Cloudlets in Mobile Computing. IEEE Pervasive Computing, 8(4), 14–23. https://doi.org/10.1109/mprv.2009.82

6. Snyder, S. (2020, June 28). Why edge computing is essential to your connected operations strategy. IBM Business Operations Blog. https://www.ibm.com/blogs/internet-of-things/iot-why-edge-computing-is-essential/

7. IBM. (2020). Edge Computing Architecture. https://www.ibm.com/cloud/architecture/architectures/edge-computing/overview

8. LF Edge. (2020, March 17). Edge computing architecture and use cases. https://www.lfedge.org/2020/03/05/edge-computing-architecture-and-use-cases/

9. Ranger, S. (2020, February 3). What is the IoT? Everything you need to know about the Internet of Things right now. ZDNet. https://www.zdnet.com/article/what-is-the-internet-of-things-everything-you-need-to-know-about-the-iot-right-now/

10. Gyarmathy, K. (2020). The Benefits and Potential of Edge Computing. vXchnge. https://www.vxchnge.com/blog/the-5-best-benefits-of-edge-computing

11. Mangat, M. (2020, October 23). Edge Computing vs Cloud Computing: The Key Differences. phoenixNAP Global IT Services. https://phoenixnap.com/blog/edge-computing-vs-cloud-computing

12. GSMA. (2019, July). IoT Edge Computing Requirements. https://www.gsma.com/iot/wp-content/uploads/2019/08/IoT-requirements-report-August-2019-1.pdf

13. Taleb, T., Samdanis, K., Mada, B., Flinck, H., Dutta, S., & Sabella, D. (2017). On Multi-Access Edge Computing: A Survey of the Emerging 5G Network Edge Cloud Architecture and Orchestration. IEEE Communications Surveys & Tutorials, 19(3), 1657–1681. https://doi.org/10.1109/comst.2017.2705720

14. Ai, Y., Peng, M., & Zhang, K. (2018). Edge computing technologies for Internet of Things: a primer. Digital Communications and Networks, 4(2), 77–86. https://doi.org/10.1016/j.dcan.2017.07.001

15. GSMA. (2018). IoT Edge Opportunities. https://www.gsma.com/iot/wp-content/uploads/2018/11/IoT-Edge-Opportunities-c.pdf

16. AWS for the Edge - Overview. (n.d.). Amazon Web Services, Inc. https://aws.amazon.com/edge/

17. IBM. (n.d.). Edge Services. https://www.ibm.com/services/process/edge-services

18. Dell Technologies. (n.d.). Dell Technologies Edge Solutions. https://www.delltechnologies.com/en-us/solutions/edgecomputing/index.htm

19. Ops, D. W. (2020, June 22). The promise and challenges of edge computing. The EE. https://www.theee.ai/2020/06/19/3379-the-promise-and-challenges-of-edge-computing/
