How can software be faster, cheaper, and more resilient? For many developers, the answer in 2021 was to move the computation out of a few big datacenters and into many smaller racks closer to users on the metaphorical edge of the internet, and 2022 promises more of the same.

The move is driven by physics and economics. Even when data travels at the speed of light, the time it takes to send packets halfway around the world to one central location is noticeable to users, whose minds start to wander after just a few milliseconds of delay. The price of moving data can be just as surprising, and many CIOs have learned to budget for data egress fees alongside the price of servers and disk drives.

These fundamental advantages are indisputable, but edge computing will continue to be limited by countervailing forces that may, in some cases, be stronger. Centralized datacenter operators can negotiate lower prices for electricity, typically by building close to the point of generation, such as a few miles from a large hydroelectric dam. Keeping data synchronized across multiple locations can be a challenge, and some workloads, such as machine learning, still depend heavily on large, centralized data collections.

Despite these challenges, many architects continue to embrace the opportunity, thanks to the efforts of cloud companies to simplify the process. In May 2021, for instance, Amazon changed the billing granularity for its Lambda@Edge functions from 50 milliseconds to 1 millisecond, opening up more opportunities. Developers now pay closer attention to how long a function runs and split work into smaller, simpler units that can take advantage of the lower prices.
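To give a sense of what a “smaller, simpler unit” looks like, here is a minimal sketch of a Lambda@Edge viewer-request handler, written in TypeScript. The event shape follows AWS’s published CloudFront event record; the custom header is purely illustrative.

```typescript
// A tiny Lambda@Edge viewer-request handler. With 1 ms billing
// granularity, even a function this small is priced for roughly
// what it uses. The event layout follows AWS's CloudFront event
// record; the header added below is purely illustrative.
import type { CloudFrontRequestEvent, CloudFrontRequest } from "aws-lambda";

export const handler = async (
  event: CloudFrontRequestEvent
): Promise<CloudFrontRequest> => {
  const request = event.Records[0].cf.request;

  // Tag the request before it travels on to the origin.
  request.headers["x-edge-processed"] = [
    { key: "X-Edge-Processed", value: "true" },
  ];
  return request;
};
```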


AWS vs. Cloudflare

AWS’s decision was no doubt driven by competition from Cloudflare, a company with a strong focus on edge computing. In 2021, Cloudflare continued to push hard, focusing especially on the high egress fees that some cloud providers charge on data leaving their datacenters. Cloudflare’s R2 storage layer, introduced in September 2021, is pushing prices lower, especially for data that’s only accessed occasionally. The service is tightly integrated with Cloudflare Workers, the company’s edge functions, opening up more opportunities for both storage and computation in its local nodes.
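In practice, that integration lets a Worker serve R2 objects directly from the nearest edge node. A minimal sketch, assuming a Worker with an R2 bucket bound under the placeholder name MY_BUCKET:

```typescript
// A Cloudflare Worker that streams objects straight out of R2.
// MY_BUCKET is an R2 binding configured in wrangler.toml; the
// name is a placeholder for this sketch.
export interface Env {
  MY_BUCKET: R2Bucket;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const key = new URL(request.url).pathname.slice(1);
    const object = await env.MY_BUCKET.get(key);

    if (object === null) {
      return new Response("Not found", { status: 404 });
    }
    // Stream the body from the edge, with no egress fee back
    // to a distant origin.
    return new Response(object.body, {
      headers: { etag: object.httpEtag },
    });
  },
};
```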

Cloudflare also moved to simplify adoption by announcing partnerships with the database companies MongoDB and Prisma. Developers can rely on modern query languages and well-understood database models from their Worker functions, expanding the opportunities to move workloads to the edge.
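Because Workers could not open raw TCP connections to a database at the time, access from the edge typically flows over HTTP. A hedged sketch of that pattern, with the endpoint URL, credentials, and collection names all standing in as placeholders rather than any real deployment:

```typescript
// Querying a hosted database over HTTPS from a Worker. The
// endpoint, database, and collection below are placeholders;
// the API key arrives as a secret bound to the Worker.
export default {
  async fetch(
    request: Request,
    env: { DB_API_KEY: string }
  ): Promise<Response> {
    const result = await fetch(
      "https://db.example.com/action/findOne", // placeholder endpoint
      {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "api-key": env.DB_API_KEY,
        },
        body: JSON.stringify({
          database: "shop", // placeholder names
          collection: "products",
          filter: { sku: "example-sku" },
        }),
      }
    );
    return new Response(await result.text(), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```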

Other database companies are following the same path. PlanetScale, for example, is managing Vitess databases as a service, simplifying the work of horizontally scaling large datasets that span multiple locations.

A big question for 2022 will be how many workers return to offices. These locations are the original edges, hosting local databases and file storage, often just down the hall. If the pandemic recedes and people return to the office for a substantial portion of the workweek, the office building will again be a popular edge location. Cloud companies continue to push into company datacenters, offering hybrid solutions for on-premises hosting. It’s now possible to run much of the same software infrastructure from the cloud hosts in a local datacenter, saving time and money. And some CIOs simply feel better about having the servers under the same roof.

Edge computing’s optimal location: phones and laptops

The ultimate edge location, though, will continue to be the phones and laptops themselves. Web app developers continue to leverage the power of browser-based storage while exploring more efficient ways to distribute software. WebAssembly (Wasm) is an emerging standard that can bring more powerful software to handsets and desktops without complicated installation or permission prompts.
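Loading a Wasm module in the browser takes only a few lines. A minimal sketch, assuming a compiled module at /module.wasm that exports an add function; both the path and the export are placeholders:

```typescript
// Fetch, compile, and run a WebAssembly module in the browser,
// with no installer or permission prompt. The module path and
// the exported `add` function are placeholders for this sketch.
async function runWasm(): Promise<void> {
  const { instance } = await WebAssembly.instantiateStreaming(
    fetch("/module.wasm")
  );
  const add = instance.exports.add as (a: number, b: number) => number;

  const sum = add(2, 3);
  // Keep the result on the device using browser-based storage.
  localStorage.setItem("last-sum", String(sum));
  console.log(sum); // 5
}

runWasm();
```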

Computer scientists are also working at a theoretical level, redesigning their algorithms to distribute work to local machines. IBM, for instance, is building AI algorithms that split up training jobs so the data does not need to move, an approach in the spirit of federated learning. When applied to data collected by handsets or other IoT devices, the models can learn and adapt while synchronizing only the essential updates.
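The essence of the approach is to ship model updates instead of raw data. A toy sketch of the server-side step, with the model reduced to a plain weight vector purely for illustration:

```typescript
// Federated averaging, reduced to its essence: each device trains
// on its own data and reports back only a small weight vector;
// the server averages the vectors. Raw data never leaves a device.
type Weights = number[];

function federatedAverage(clientUpdates: Weights[]): Weights {
  const n = clientUpdates.length;
  const dim = clientUpdates[0].length;
  const averaged: Weights = new Array(dim).fill(0);

  for (const update of clientUpdates) {
    for (let i = 0; i < dim; i++) {
      averaged[i] += update[i] / n;
    }
  }
  return averaged; // the new global model, pushed back to devices
}

// Three devices report locally trained weights; only these small
// vectors ever cross the network.
const globalModel = federatedAverage([
  [0.9, 1.1],
  [1.0, 1.0],
  [1.1, 0.9],
]);
console.log(globalModel); // ≈ [1, 1]
```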

The “distributed” buzzword is also turning up more often in debates about control. The push by some to create a distributed web, sometimes called Web3, is driven more by political debates about power than by practical concerns about latency, but the movement is headed in the same general direction. Edge computing developers looking for speed and blockchain advocates looking for distributed algorithms will be tackling the same problems of splitting up control and blowing up centralized models.

This push will also be helped, perhaps unintentionally, by governments that battle to exert more and more control over an internet that was once relatively border-free. Edge computing allows companies to localize computing and data storage inside political boundaries, simplifying compliance with the combinatorics of a burgeoning regulatory thicket. AWS, for instance, talks seriously about adding city-level controls and embargoes to CloudFront, its content delivery network. Remember the age of the city-state in history class, when Athens and Sparta reigned? That model is returning as governments work to atomize the internet and, in the process, hasten the growth of edge computing.
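Country-level geographic restriction is already a standard CloudFront capability. A hedged sketch of how a Lambda@Edge function might enforce an embargo using the CloudFront-Viewer-Country header, which is populated only when CloudFront is configured to forward it; the blocked country code is illustrative:

```typescript
// A viewer-request handler enforcing a per-country embargo at
// the edge. CloudFront supplies cloudfront-viewer-country only
// when configured to forward it; "XX" is a placeholder code.
import type {
  CloudFrontRequestEvent,
  CloudFrontRequestResult,
} from "aws-lambda";

const EMBARGOED = new Set(["XX"]); // placeholder ISO country code

export const handler = async (
  event: CloudFrontRequestEvent
): Promise<CloudFrontRequestResult> => {
  const request = event.Records[0].cf.request;
  const country =
    request.headers["cloudfront-viewer-country"]?.[0]?.value ?? "unknown";

  if (EMBARGOED.has(country)) {
    // Refuse the request before it ever crosses the border.
    return {
      status: "451",
      statusDescription: "Unavailable For Legal Reasons",
    };
  }
  return request;
};
```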
