Compute@Edge with OpenFaaS
Darshan Ajit Kumar June 21, 2022

Serverless has been on an upward trajectory for the past few years, adding a layer of abstraction called Functions-as-a-Service (FaaS). FaaS lets organizations quickly develop, deploy, and maintain production-quality applications with as little scaffolding as possible. AWS Lambda, Azure Functions, and Google Cloud Functions are the market leaders, with many other cloud players trailing not far behind.

This paper explores open-source alternatives to these large cloud providers, focusing on one of the most popular frameworks: OpenFaaS.

As its tagline, “Serverless Functions, Made Simple,” suggests, OpenFaaS makes building, deploying, scaling, and maintaining serverless functions as simple as managing your existing virtualized computing resources.

Cloud adoption has been on a fast track in the past few years, and the global pandemic has accelerated digitization further. For large enterprises that still run much of their IT infrastructure on-premises, moving workloads to the cloud has been a more complex challenge than for those that were “cloud-native” from their infancy. Yet the appeal of reduced operational expenses, global reach, and quick go-to-market (GTM) has pushed these organizations to move some of their workloads to the public cloud via the top cloud providers, according to Gartner.

What happens, then, to this unused computing power, already paid for, sitting in data centers owned by large corporations? With OpenFaaS and Kubernetes, organizations can set up their own cloud-agnostic serverless functions that are auto-scalable, secure, and, best of all, run on virtualized computing resources they already own.

Introduction to OpenFaaS

Functions-as-a-Service (FaaS) is a cloud computing model in which everything except your functions (code) is abstracted away and managed by someone else.

Popular FaaS offerings include AWS Lambda, Google Cloud Functions, and Azure Functions. They bring all the benefits of AWS, Google Cloud, and Microsoft Azure, respectively, and much more, but they have their own set of drawbacks:

  • Cold starts – Cloud functions are notorious for cold starts, so much so that cold-start latency can be the reason critical workloads are never taken serverless.
  • Resource limitations – Each cloud provider sets its own limits, for example on memory, execution time, and payload size.
  • Non-uniform tooling – Each FaaS service differs in language support, debugging, and other key developer-experience factors.
  • Migration overhead – Moving existing functionality to functions can be tedious and downright problematic if not done correctly.

Originally developed by Alex Ellis (https://github.com/alexellis), OpenFaaS is an open-source framework for deploying functions and existing code to Kubernetes. It lets you package and run any program in a container, managed as a function, via the OpenFaaS command-line utility (faas-cli) or the web UI that comes out of the box.
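To make this concrete: a function built from the OpenFaaS python3 template is just a module exposing a `handle` function that receives the request body and returns the response. The word-counting logic below is my own illustrative example, not part of the template:

```python
# handler.py — entry point expected by the OpenFaaS python3 template.
def handle(req):
    """Receive the raw HTTP request body and return the response body.

    This toy example counts the words in the request.
    """
    words = req.split()
    return f"{len(words)} words"
```

Everything else — the container image, the HTTP server, scaling — is supplied by the template and the platform.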

The OpenFaaS architecture is built on cloud-native standards and includes an API Gateway, the Function Watchdog, Prometheus, and a container orchestrator.

API Gateway: Provides an external route into your functions. It has a built-in UI for deploying custom functions or functions from the OpenFaaS store, and it drives auto-scaling via AlertManager and Prometheus.

Function Watchdog: Handles the startup and monitoring of functions, acting as the entry point inside each function's container. It can turn any binary into a function.

Prometheus: Collects and exposes metrics on performance, requests, and processes, allowing the API Gateway to make scaling decisions based on alerts and events.
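The Watchdog's contract is what makes "any binary" a function: in its classic mode it forks your program once per request, writes the HTTP body to the process's stdin, and serves whatever the process prints to stdout as the HTTP response. Any program that follows that contract qualifies; the uppercasing logic below is just a placeholder example:

```python
def transform(body: str) -> str:
    # Example business logic: echo the request body back in upper case.
    return body.strip().upper()

# Inside the function's container, the classic watchdog forks this
# program for every request, pipes the HTTP body to its stdin, and
# serves whatever it prints to stdout as the HTTP response, i.e.:
#
#     import sys
#     print(transform(sys.stdin.read()))
```

The same contract works for a shell script, a Go binary, or a legacy executable, which is why OpenFaaS can wrap existing code without rewriting it.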

Understanding Compute@Edge

The concept of computing at edge locations is not new; cloud providers already offer managed services at the edge locations closest to the source of the data. However, this ability is tightly coupled to the choice of cloud provider. For instance, if the source of data is in a location where the provider does not offer services, the implementation is set to fail from the get-go.

One possible solution is a hybrid model in which organizations combine existing virtualized hardware on-prem with managed services in the cloud. Combine this with the power of FaaS managed through OpenFaaS, and you can deploy functions in the cloud, on-prem, or on any computing device at the edge, and achieve the following:

  • Enhanced Developer Experience: Developers spend more time writing code than setting up scaffolding for their dev environments.
  • Vast language support: Package any utility, assembly, or compiled code as a Docker container and serve it as a function.
  • Scale-as-you-go: OpenFaaS architecture ensures scaling as demand expands or shrinks and helps keep everything available.
  • No cloud network tie-in: Docker and Kubernetes clusters can be set up on-prem as easily as on the cloud.
  • Rich community-led plugins and connectors: The open-source community contributes heavily with custom Docker images, plugins, connectors, and CLI tools that enable faster go-to-market.
  • No traditional resource limits: Functions are not restricted by the resource limitations set out by proprietary FaaS providers.
  • Great for A/B testing.
  • Orchestrator independence.

Any utility, then, can be packaged and deployed as a function while enjoying the flexibility and convenience of a FaaS architecture. With an open-source stack, organizations can adopt FaaS as a viable architecture without tying themselves to a single cloud provider.

About the Author


Darshan Ajit Kumar

Senior Lead, PPE at Brillio. A technologist, engineer, and open-source enthusiast with 9 years of industry experience building, deploying, and supporting enterprise-grade web apps. Experienced in architecting cloud-agnostic solutions, serverless applications, and DevSecOps processes across frameworks and tools. Helping set up cloud governance strategies and security contexts is another area of interest.

Let’s create something brilliant together!
