Why Is Kubernetes Ideal for the Industrial Edge?
Data volumes continue to grow, particularly in industries like manufacturing, oil and gas, energy, and transportation that are undergoing rapid digital transformation. There is a need to manage this data explosion at the edge, along with its many associated challenges: system complexity, data privacy, latency, low-bandwidth connectivity, and the rising cost of storing and processing data in the cloud or in data centers.
Edge computing reduces long-distance communication between devices and servers by moving compute as close to the data source as possible, thereby improving the way data is handled, processed, and delivered. In large industries like manufacturing, oil and gas, energy, and transportation, industrial edge computing is being deployed to analyse and manage data at the asset for real-time analytics, or to aggregate it for further processing in the cloud.
The industrial edge has three main components: (1) Connectivity is the ability to connect to any industrial system and collect and normalize data for immediate use. (2) Intelligence covers the data processing and analytics functions at the edge that act on and derive insights from data at its source. (3) Orchestration is the ability to create, deploy, manage, and update edge applications.
Most organizations either send all their data to the cloud or keep it at the edge. Industrial edge computing adds value by operating at the edge, where it has the most effect and minimal latency. For rapid business decisions that improve quality and processes, the industrial edge unlocks real-time analytics such as inventory usage, asset uptime and downtime, capacity utilization, and more.
The edge enables predictive and prescriptive maintenance, condition-based monitoring, overall equipment effectiveness (OEE) tracking, vision systems, quality improvements, and more. Edge data can also feed more advanced use cases like artificial intelligence and machine learning in the cloud. The intelligent edge is driving significant operations and process improvements.
Businesses have begun to adopt modern edge platforms to drive these initiatives with a unified solution that enables the three facets of the edge — edge computing, edge analytics, and edge intelligence. A recent Gartner report projects that deployed IoT endpoints in the manufacturing and natural resources industries will reach 1.9 billion units by 2028.
In this blog, we will look at how Kubernetes, the edge, and the cloud can work together to drive intelligent business decisions.
Cross posted from HCLTech Blogs
What does it take to enable edge computing, edge analytics, and edge intelligence?
Businesses in modern manufacturing, automotive, and telecommunications are under tremendous pressure to drive innovation and efficiency on their plant floors. Meeting these needs requires a comprehensive methodology that combines new and existing IT products with emerging practices, while avoiding the time-consuming and error-prone manual configuration of many devices and applications at scale.
The following considerations must be kept in mind when building an edge architecture:
- Resource constraints: Low compute capability and the small footprint of edge devices.
- Security challenges: Data privacy, physical device security, and network security of connected devices.
- Manageability: Managing application software across thousands of devices from many different suppliers.
- Reliability: Consistency in building, deploying, and maintaining applications.
- Automation: Automated means to deploy and manage multiple distributed applications across any number of machines, physical or virtual.
Meeting the needs of computing, analytics, and intelligence at the edge
Edge-based infrastructure presents several problems in terms of resource and workload control. Within a short period, there can be thousands of edge and far-edge nodes to manage. The edge architectures organizations implement are expected to provide greater autonomy from the cloud, strong security controls, and relatively low latency. Let us look at the proposition that Kubernetes for the edge offers:
Resource constraints: Low compute capability, small footprint of the devices
Certified lightweight Kubernetes distributions are available for production workloads running in highly constrained environments such as IoT and edge computing deployments.
Security Challenges: Data privacy, Physical device security and network security of the connected devices.
Kubernetes provides policy-based mechanisms for all deployment types, so that policies and rule sets can be applied across the entire infrastructure. Policies can also be narrowed to specific channels or edge nodes based on specific configuration requirements.
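As a concrete illustration, here is a minimal sketch of a Kubernetes NetworkPolicy that locks down traffic to an edge workload; the namespace and labels (`edge-sensors`, `gateway`) are illustrative, not taken from any particular deployment:

```yaml
# Hypothetical example: allow only pods labelled app=gateway to reach
# pods labelled app=edge-sensors; all other ingress traffic is denied.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: restrict-edge-sensors
  namespace: production
spec:
  podSelector:
    matchLabels:
      app: edge-sensors
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: gateway
```

Because the policy is just another declarative object, the same rule set can be applied cluster-wide or scoped per namespace or per edge site.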
Manageability: Manage application software across thousands of devices from many different suppliers.
Custom Resource Definitions (CRDs) and Operators extend Kubernetes, giving Site Reliability Engineering teams a simple, powerful way to manage complex applications and infrastructure successfully.
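To make the CRD idea concrete, here is a sketch of a hypothetical `EdgeDevice` resource type that an operator could watch and reconcile; the group, kind, and field names are invented for illustration:

```yaml
# Hypothetical CRD: teaches the cluster a new "EdgeDevice" resource
# type that an operator could watch and reconcile.
apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  name: edgedevices.example.com
spec:
  group: example.com
  scope: Namespaced
  names:
    plural: edgedevices
    singular: edgedevice
    kind: EdgeDevice
  versions:
    - name: v1
      served: true
      storage: true
      schema:
        openAPIV3Schema:
          type: object
          properties:
            spec:
              type: object
              properties:
                protocol:
                  type: string
                firmwareVersion:
                  type: string
```

Once applied, teams manage fleet devices with the same `kubectl` workflows they already use for built-in resources.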
Reliability: Consistency in building, deployment, and maintenance of applications.
A GitOps- and DevOps-pipeline-based approach tames system complexity: only reviewed changes to configuration settings and application artifacts are merged, and those changes are then automatically rolled out to operational systems.
Automation: Provision for automated means to deploy and manage multiple distributed applications across any number of machines, physical or virtual with high levels of automation.
A Kubernetes cluster constantly reconciles toward a target state defined by developers or administrators.
Kubernetes also fits a pipeline approach, in which changes to YAML-based configuration and container images start as Git commits; these trigger pipeline operations that result in updates to applications and the cluster as a whole.
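The reconciliation loop can be pictured with an ordinary Deployment manifest: the `replicas` field is the declared target state, and the cluster continuously works to make reality match it. The service name and image below are illustrative:

```yaml
# Declared target state: Kubernetes keeps reconciling until three
# replicas of this (illustrative) analytics service are running.
# If a node or pod fails, the controller recreates pods to restore
# the declared count.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-analytics
spec:
  replicas: 3
  selector:
    matchLabels:
      app: edge-analytics
  template:
    metadata:
      labels:
        app: edge-analytics
    spec:
      containers:
        - name: analytics
          image: registry.example.com/edge-analytics:1.4.2
```

In a GitOps flow, editing this file and merging the commit is the whole deployment action; the pipeline and cluster do the rest.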
The Rise of Kubernetes as the Preferred Platform for Edge Computing
Since its introduction in 2014, Kubernetes has seen rapid adoption in data centre and cloud environments. It has evolved from orchestrating lightweight application containers to handling and scheduling a broad range of IT workloads, from virtualised network functions to AI/ML workloads and GPU hardware resources.
Kubernetes is fast becoming the preferred control plane for scheduling and managing work in distributed systems. These tasks can include deploying virtual machines on physical hosts, pushing containers to edge nodes, or even extending the control plane to other schedulers, including serverless environments.
Kubernetes’ extensibility is making it a universal scheduler and a preferred management platform. Let us now explore deployment approaches that address the requirements above.
Deployment Approaches for the Edge
The approaches below demonstrate how Kubernetes can be used for edge workloads, and how it supports architectures that meet enterprise application requirements: low latency, constrained resources, data privacy, bandwidth scalability, and more.
#1. Deploy the entire Kubernetes cluster at the edge — In this approach, the entire Kubernetes cluster is deployed on edge nodes. This option suits use cases where edge nodes have limited resources and you want to avoid dedicating significant capacity to separate control planes and worker nodes; a lightweight distribution keeps that overhead small. The following diagram shows a minimal K3s cluster running on edge nodes.
Figure 2: K3s Architecture | Source
K3s from Rancher is a Cloud Native Computing Foundation-certified Kubernetes distribution designed for production workloads running in resource-constrained environments such as IoT and edge computing deployments.
K3s can be deployed on virtual machines in the public cloud, or even on a Raspberry Pi. While maintaining full compatibility with the Cloud Native Computing Foundation Kubernetes conformance tests, its architecture is highly optimized for unattended, remote deployments on resource-constrained devices.
By making it accessible and lightweight, K3s is bringing Kubernetes to the edge computing layer.
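That accessibility shows in the setup flow. The sketch below follows the K3s quick-start pattern; the server address and token are placeholders, not real values:

```shell
# On the edge gateway: install a single-binary K3s server.
curl -sfL https://get.k3s.io | sh -

# On each additional edge device: join it as an agent.
# K3S_TOKEN is read from /var/lib/rancher/k3s/server/node-token
# on the server node; "my-edge-server" is a placeholder hostname.
curl -sfL https://get.k3s.io | K3S_URL=https://my-edge-server:6443 K3S_TOKEN=mynodetoken sh -

# Verify the cluster from the server node.
k3s kubectl get nodes
```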
MicroK8s is another example: a lightweight, Cloud Native Computing Foundation-certified, production-grade upstream Kubernetes distribution that can be installed on Linux, Windows, and macOS.
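A minimal MicroK8s setup on a Linux edge device looks roughly like this (commands follow the MicroK8s documentation; add-on names can vary between releases):

```shell
# Install MicroK8s from the snap store and wait for it to come up.
sudo snap install microk8s --classic
sudo microk8s status --wait-ready

# Enable a couple of common add-ons for edge workloads.
sudo microk8s enable dns hostpath-storage

# MicroK8s bundles its own kubectl.
sudo microk8s kubectl get nodes
```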
My book IoT Edge Computing With MicroK8s offers a hands-on approach to building, deploying, and distributing production-ready Kubernetes on IoT and edge platforms. Its 400+ pages of real-world use cases and scenarios help you develop and run applications and mission-critical workloads using MicroK8s. Some of the key topics covered are:
- Implementing AI/ML use cases with the Kubeflow platform
- Service mesh integrations using Istio and Linkerd
- Running serverless applications using Knative and OpenFaaS frameworks
- Managing Storage Replication with OpenEBS replication engine
- Resisting Component Failure Using HA Clusters
- Securing your containers using Kata and strict confinement options
By the end of this book, you’ll be able to use MicroK8s to build and implement scenarios for IoT and edge computing workloads in a production environment.
Optionally, you can use platforms such as Google Anthos or AKS to manage and orchestrate container workloads across multiple clusters, as in the setup below:
Figure 3: Google Anthos on cloud & MicroK8s at the edge | Source
#2. Deploy Kubernetes nodes at the edge — In this approach, Kubernetes worker nodes are deployed at the edge while the main Kubernetes control plane runs at a cloud provider or in your datacentre. This suits use cases where infrastructure at the edge is limited.
KubeEdge is an open-source system that extends native containerized application orchestration and device management to hosts at the edge. It consists of a cloud part and an edge part.
It is built upon Kubernetes and provides core infrastructure support for networking, application deployment, and metadata synchronization between cloud and edge. It also allows developers to author custom logic and enables resource-constrained devices at the edge to communicate using MQTT.
KubeEdge is stable and addresses the key use cases related to IoT and edge. It can be installed on a supported Linux distribution or on an ARM device like a Raspberry Pi.
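Once KubeEdge's cloud part and edge part are connected, edge workloads are targeted like any other Kubernetes workload, typically by pinning pods to edge nodes via a node label. A sketch, assuming the edge node carries the `node-role.kubernetes.io/edge` label that KubeEdge conventionally applies (the pod name and image are illustrative):

```yaml
# Illustrative pod pinned to a KubeEdge edge node via nodeSelector.
# Scheduling still happens through the cloud-side control plane;
# the edge part runs the container locally.
apiVersion: v1
kind: Pod
metadata:
  name: sensor-reader
spec:
  nodeSelector:
    node-role.kubernetes.io/edge: ""
  containers:
    - name: reader
      image: registry.example.com/sensor-reader:0.3.0
```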
Figure 4: KubeEdge Architecture | Source
#3. Deploy virtual Kubernetes nodes at the edge — In this approach, a virtual node agent registers with the cluster's control plane, while the actual nodes and pods it abstracts run at the edge. The virtual node agent relays command and control to the edge nodes running the containers.
Although there are other examples, Microsoft's Virtual Kubelet project is a good example of extending the Kubelet agent and the Kubernetes API. The Virtual Kubelet is a Kubernetes agent that runs in an external environment and registers itself as a node in the cluster. The agent uses the Kubernetes API to create a node resource in the cluster, and it schedules pods into the external environment by calling that environment's native API, relying on taints and tolerations to control placement.
Figure 5: Microsoft Virtual Kubelet
Virtual Kubelet providers include Azure Container Instances, Azure IoT Edge, and the AWS Fargate control plane.
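In practice, scheduling onto a virtual node combines a node selector with a toleration for the taint the Virtual Kubelet provider places on its node. A sketch using the taint key commonly applied by Virtual Kubelet providers; exact labels and keys vary by provider, and the pod name and image are illustrative:

```yaml
# Illustrative pod that tolerates the virtual node's taint so the
# scheduler may place it on the Virtual Kubelet-backed node.
apiVersion: v1
kind: Pod
metadata:
  name: burst-worker
spec:
  nodeSelector:
    type: virtual-kubelet
  tolerations:
    - key: virtual-kubelet.io/provider
      operator: Exists
      effect: NoSchedule
  containers:
    - name: worker
      image: registry.example.com/burst-worker:2.0.1
```

The taint keeps ordinary workloads off the virtual node by default; only pods that explicitly tolerate it are offloaded to the external environment.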
#4. Deploy Kubernetes devices at the edge — In this approach, the Kubernetes device plugin framework is leveraged to expose leaf devices as resources in a Kubernetes cluster.
Microsoft's Akri exposes a range of sensors, controllers, and MCU-class leaf devices as resources in a Kubernetes cluster. The Akri project applies the Kubernetes device plugin framework to the edge, where there is a variety of leaf devices with unique communication protocols and intermittent availability.
Figure 6: Akri Project — Architecture | Source
Akri currently supports ONVIF, udev, and OPC UA Discovery Handlers. Support for more protocols is under development.
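Akri is driven by a Configuration custom resource that names a Discovery Handler and describes the broker pod to run against each discovered device. The sketch below targets the ONVIF handler; field names follow recent Akri releases and may differ by version, and the broker image is illustrative:

```yaml
# Illustrative Akri Configuration: discover ONVIF cameras and run a
# broker pod against each discovered camera instance.
apiVersion: akri.sh/v0
kind: Configuration
metadata:
  name: akri-onvif
spec:
  discoveryHandler:
    name: onvif
    discoveryDetails: ""
  brokerSpec:
    brokerPodSpec:
      containers:
        - name: camera-broker
          image: registry.example.com/camera-broker:0.1.0
  # How many nodes may use each discovered device at once.
  capacity: 1
```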
As seen above, these deployment approaches demonstrate how Kubernetes can run edge workloads while supporting architectures that meet enterprise application requirements for low latency, constrained resources, data privacy, and bandwidth scalability.
These architecture patterns offer strong business propositions for working with customers to realize industrial edge solutions. They can help achieve the efficiency gains that adopting Kubernetes for edge computing brings to a variety of industrial use cases.
As businesses adopt digital transformation, Industry 4.0, industrial automation, smart manufacturing, and the advanced use cases these strategies enable, the industry is recognizing the importance of Kubernetes, the edge, and the cloud working together to drive intelligent business decisions.