Pushing the Edge

Jason Melo
Published in Lifion Engineering
4 min read · Oct 24, 2017


Developers have typically understood their processing runtimes to live either within their own data centers or on the client side. But what opportunities for building the next iteration of distributed systems exist between the client and the server?

Can we ship code to the edges of our network to do real work, as opposed to simply configuring routing tables or other IP rules? What if the same runtimes we target on our internal servers existed at the edge, a new functional edge? Would this enable interesting and valuable use cases, or is it just another means of configuring networks that provides little beyond what has traditionally been possible?

What is edge computing?

Edge computing is any programmable interface and runtime built on operating systems running on network devices between end users and hosting facilities, cloud or otherwise. These devices typically exist in the following three areas:

  • Internal systems edge: routers, load balancers, API gateways and reverse proxies such as NGINX, F5, HAProxy, AWS API Gateway or other gateways like Traefik or Kong
  • Public network edge: ISPs, WAFs, and CDNs like Akamai and Cloudflare
  • Mobile & IoT: phones, smart home connected devices, Raspberry Pis and home media

Applications of Edge Computing

CDN-based runtimes, like Cloudflare's recently announced Cloudflare Workers, can be used to shape public internet traffic, handle optimistic cache warming, redirect requests more intelligently to other regions based on any set of conditions, challenge suspicious activity at the edge, and protect core processes from floods or nefarious traffic, as well as pull additional metadata and data from end users' local machines to speed up requests or return richer responses.

Current Implementations:
# Cloudflare Workers — built using the standard Service Workers API on top of Google's V8 JavaScript engine, which also powers Chrome and Node.js. A worker can make multiple sub-requests, in series or in parallel, and then combine the results into a single response. Cloudflare calls out the following potential use cases for their new workers framework:

  • Use custom logic to decide which requests are cacheable at the edge, and canonicalize them to improve cache hit rate.
  • Expand HTML templates directly on the edge, fetching only dynamic content from your server.
  • Respond to stateless requests directly from the edge without contacting your origin server at all.
  • Split one request into multiple parallel requests to different servers, then combine the responses.
  • Implement custom load balancing and failover logic.
  • Respond dynamically when your origin server is unreachable.

# Akamai Cloudlets — a seemingly robust functional edge platform, although the approach here is more marketplace-based, with functions made available by Akamai and its partners. It stands to reason that Akamai will eventually follow the more extensible model that AWS and Cloudflare have adopted.

Functional runtimes on network routers, reverse proxies and API gateways can shape internal traffic, implement custom asynchronous cache warming and other optimistic resource handling to reduce latency, add security controls, or distribute load more intelligently across a system within a datacenter.
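One minimal sketch of the asynchronous cache-warming idea, assuming a hypothetical in-memory cache and a placeholder `fetchFromOrigin()` helper (neither is tied to any specific gateway product):

```javascript
// Optimistic cache warming at a gateway: serve the requested resource, then
// warm related cache entries in the background without blocking the response.
const cache = new Map();

async function fetchFromOrigin(path) {
  // Placeholder for a real request to an internal origin server.
  return `origin response for ${path}`;
}

async function handleRequest(path, relatedPaths = []) {
  if (!cache.has(path)) {
    cache.set(path, await fetchFromOrigin(path));
  }
  // Warm related entries asynchronously; the client response is not blocked.
  for (const p of relatedPaths) {
    if (!cache.has(p)) {
      fetchFromOrigin(p).then(body => cache.set(p, body));
    }
  }
  return cache.get(path);
}
```

The design choice here is that warming failures are invisible to the client: a missed prefetch simply means the next request pays the origin round trip.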

Current Implementations:
# AWS — Lambda functions are the means by which AWS API Gateway implements custom rules for features like health checks, authN/Z, optimal routing algorithms and other in-gateway interactions. Through a combination of AWS CloudTrail, API Gateway and Lambda, powerful applications can be built in a distributed fashion that further isolates your EC2, database and other core infrastructure layers. The following diagram illustrates a client login where the core system is protected by a functional edge that verifies authenticated clients by checking encrypted headers/cookies before a request ever touches your internal services:
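The authentication check in that flow can be sketched as an API Gateway custom (Lambda) authorizer. The event and IAM policy shapes below follow AWS's documented authorizer contract, but `isValid()` is a stand-in for real token verification:

```javascript
// Hedged sketch of a Lambda authorizer: API Gateway invokes this function
// before routing a request, and the returned policy allows or denies it.
function isValid(token) {
  // Placeholder: a real implementation would verify a signed JWT or
  // decrypt a session header/cookie.
  return token === 'allow';
}

const handler = async (event) => {
  const token = event.authorizationToken || '';
  const effect = isValid(token) ? 'Allow' : 'Deny';
  return {
    principalId: 'user',
    policyDocument: {
      Version: '2012-10-17',
      Statement: [{
        Action: 'execute-api:Invoke',
        Effect: effect,
        Resource: event.methodArn
      }]
    }
  };
};

module.exports = { handler };
```

Because the deny path is resolved inside the gateway, unauthenticated traffic never reaches EC2 or the database tier.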

# NGINX — nginScript is a JavaScript interface implemented on top of NGINX, allowing for a number of different performance, quality and intelligent-routing scenarios. nginScript was announced about two years ago and runs on top of one of the most widely deployed open source web servers and reverse proxies on Linux-based web platforms today. Here NGINX is taking an add-on approach, allowing customers to easily enable a more sophisticated JavaScript configuration and processing layer.
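A small hedged sketch of what an nginScript handler looks like. The `r` request object (`r.headersIn`, `r.return(status, body)`) follows nginScript/njs conventions, though the configuration directive names have changed across releases, so treat the wiring comments as illustrative:

```javascript
// Referenced from nginx.conf, roughly:
//   js_import edge.js;
//   location /hello { js_content edge.hello; }
// (directive names vary by nginScript/njs version)

function hello(r) {
  // r is the request object NGINX hands to the script:
  // inspect inbound headers and respond directly from the proxy layer.
  const ua = r.headersIn['User-Agent'] || 'unknown';
  r.return(200, `Hello from the edge, ${ua}\n`);
}

// In an njs module this function would be exported for js_content to call.
```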

Wrapping up

Edge functions have a number of potential use cases, starting with intercepting any incoming HTTP request. They permit web, mobile and IoT applications to format responses at the edge, ensure only dynamic content is served from internal servers, process more complex security paths, and execute custom load balancing with more proactive failover handling.

Over and again, the usual questions of code management, operational skills, SDLC, CI/CD, testing and runtime maturity are raised, and again these present both risks and opportunities for many organizations.

Edge computing deserves investment and a place in every developer's and architect's knowledge sphere. The enterprise is increasingly asynchronous, and in edge computing the asynchronous approach is almost a requirement: it avoids blocking operations that add network latency while introducing more points of potentially unhandled system failure. So far the prevalent runtimes in this space, like Node.js and Go, support native async models, although certain security and network-shaping calls will still need to be handled synchronously on the request thread.

More intelligence, more opportunities to predict and preempt, more touch points to help fully understand human and automated activity. I'm enjoying the potential, although in my portfolio I'm currently taking a Hold with a future Buy approach.

How is your organization looking at the edge? As a novelty or as an opportunity to push the definition of distributed scale even further?


Technologist, Architect and Entrepreneur. Founder of high-growth startups in NYC and the Bay Area. VP of Technology @ nearForm