Investing analysis of the software companies that power next generation digital businesses

Fastly (FSLY) Stock Review

Fastly (FSLY) is a leading edge cloud platform provider. While they support traditional CDN use cases, their focus and future growth opportunity lie in enabling edge computing. Edge computing represents a large addressable market, and Fastly is rapidly rolling out new product offerings in this area. They are developer-centric, with strong appeal to the developer community due to their advantages in performance, security and programmability. Fastly has broad penetration among a set of forward-thinking enterprise customers and is experiencing a high DBNER. They are founder-led, with a strong leadership team. For these reasons, I think FSLY stock is worth consideration as part of a long term investment portfolio.

Fastly went public in May 2019. I do not ordinarily invest in software companies until they have been publicly traded for a year. This allows for several quarters of results to review and any IPO price inflation to wear off. So, I am not setting a formal price target on this stock. However, given its position as a software stack company and associated growth potential, I plan to monitor it closely and may open a position in the middle of this year.

Readers will find this post long and somewhat technical, with a focus on Fastly’s product offering, the technical underpinnings, the competitive landscape and addressable market. I think this foundation is important to properly evaluate Fastly for investment and to adjust expectations as the market is evolving rapidly.

Financial Overview

I will not spend a lot of time on deep analysis of the company’s financials. Not because these aren’t important to an investment decision, but because I think product development, customer fit and addressable market are better determinants of long term growth. Also, extensive financial analysis is available on other blogs, like Seeking Alpha, so I don’t need to be redundant.

With that said, here are some high level financial metrics from the most recent quarterly report (Q3 2019) that I find appealing:

  • Year over year revenue growth of 35%.
  • Dollar-Based Net Expansion Rate (DBNER) of 135%, up from 132% in the prior quarter. This highlights strong expansion within existing enterprise customers.
  • Total enterprise customer count of 274, which represents 86% of revenue. Enterprise customer count grew 4.6% from the prior quarter.
  • Average enterprise customer spend of $575k, increasing from $556k in the prior quarter.
  • Gross margins are reasonable (but not high) at 56.1%, up from 54.8% in the prior year (Non-GAAP).
  • Non-GAAP operating loss of $9M. Non-GAAP EPS of ($0.09) versus ($0.13) consensus.
  • Cash and short term investments of $207.8M.

The company is not profitable. Non-GAAP operating loss was $9M on $50M in revenue. The current Enterprise Value is about $1.9B. The trailing EV/revenue ratio is 10.7. With a target revenue for 2019 of $196M, their forward EV/revenue for this year (which in theory is complete) is 9.7. For 2020, the average analyst estimate for revenue is $256.5M, representing growth of about 30.6%. I think this is a little conservative. It’s likely Fastly can match 2019 revenue growth of 35% (their base is still small and DBNER is high). This would give a revenue target of $264.6M and a 2020 EV/revenue ratio of 7.2.

To see how this compares to other software names growing in the 30% range, I looked at quarterly revenue growth, operating margin and EV/Revenue ratios for ZEN, NOW and DOCU.

As you can see, Fastly has the lowest relative EV/Revenue ratio, but also the worst profitability measure. This reflects the market’s current bias towards software companies that are demonstrating a path to profitability.

Whether Fastly can begin to demonstrate operating margin improvement in 2020 will likely strongly influence its valuation. Revenue growth in the 35% range would justify an EV/Revenue ratio higher than 7.2, based on our comparison. Analysts are estimating Non-GAAP EPS of -$0.42 in 2020 and -$0.28 in 2021, compared to a -$0.52 target for 2019. If Fastly can outperform these targets, that will help propel the valuation over the next 2 years.

In an effort to generate some sort of price target, let’s try to estimate 2021 revenue. Analysts have an average estimate of $327.1M, representing growth of 27.5% over their 2020 estimate. I will be a little more optimistic and assume 32% growth over the $264.6M estimate. This gives $349.3M, and an EV/Revenue of 5.4. If a 30% grower with losses but demonstrated operating margin improvement commands an EV/Revenue ratio of 10-12 at that time, we could see the valuation double over the next 2 years.
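
To make the arithmetic behind that scenario explicit (my own back-of-envelope math, using the estimates above):

\[
\$264.6\text{M} \times 1.32 \approx \$349.3\text{M}, \qquad \$349.3\text{M} \times (10 \text{ to } 12) \approx \$3.5\text{B to } \$4.2\text{B}
\]

Against the current enterprise value of roughly $1.9B, that implies approximately 1.8x to 2.2x, or roughly a double.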

Product Overview

Pulling some content from Fastly’s investor relations page, we see their mission is focused on content delivery in a fast, secure and scalable manner.

As the consumption of online content continues to grow globally, organizations must keep up with complex and ever-evolving end-user requirements. We help them surpass their end-users’ expectations by powering fast, secure, and scalable digital experiences. With Fastly’s edge cloud platform, our customers are disrupting existing industries and creating new ones. Today, our platform handles hundreds of billions of internet requests a day.

This sounds very much like a CDN (Content Delivery Network), which arguably Fastly was at its founding. However, Fastly is positioning themselves as evolving beyond their CDN roots towards providing edge compute solutions. The subtlety lies in how CDNs are perceived and their opportunities for growth versus commoditization. Looking at the evolution of CDNs historically helps provide perspective.

A Bit of History

In the late 1990’s and early 2000’s, Internet usage and web sites were experiencing explosive growth. Original web site hosting architecture dictated that all web content, both dynamic files (calculated at time of request) and static files (created once and rarely changed) should be served from “origin” servers, representing the web site’s own data center. Web site companies added some sophistication to this approach, by designating certain servers for static content (images, CSS files, video files, PDF documents, etc.) and putting caching servers in front of them. This at least improved performance of delivering static files to users’ browsers, but all content was still served from a central data center. This, of course, created the problem whereby a user on the East Coast of the U.S. might be requesting content from a data center on the West Coast of the U.S. Worse lag applied to international users.

In the mid 2000’s, page load times became a focal issue for web site providers. There was much literature published (by Google and others) about how latency in loading a web page can result in user session abandonment. Published studies claimed that user engagement dropped by significant percentages for every second of page load latency. Many tools emerged to measure page load times from multiple locations in the world (like Gomez) and web site operators began to obsess over optimizing page load time.

As a result of this, web site companies got the idea that they could move these static content servers to be closer to end users. Before modern CDN’s were available, companies would actually build their own static content delivery stacks in data centers around the country to facilitate this. When I was the VP of Engineering at Download.com (CNET Download), we maintained several of these bespoke delivery stacks for the most popular software files.

Quickly, independent companies sprang up to capitalize on this opportunity to provide content delivery services. The CDN industry was born. Major providers were Limelight, Akamai, Level 3 and others. They established hosting infrastructure (POPs) at data centers throughout the world, with systems to upload, refresh and deliver static content to users. They made it easy to split static content out of a web site provider’s web page and add references to the CDN’s network for images, CSS, Javascript, videos, etc. This process worked wonderfully. Web site operators saw their page load times drop dramatically and CDN providers experienced strong revenue growth.

Once cloud providers like AWS emerged in the late 2000’s, they got into the mix as well. This service was easy for them to offer, as hosting static content was a straightforward extension of hosting storage and compute. AWS launched their CloudFront service as a beta in 2008. By the early 2010’s, they had 10-20 POPs in the U.S. and overseas. As of October 2018, they had 138 access points (127 edge locations and 11 regional edge caches) in 63 cities across 29 countries. Azure and GCP offer similar services.

Through the 2010’s, CDN services evolved to include video file delivery and have been heavily tapped to facilitate video-on-demand offerings from the emerging streaming services. CDN providers also realized that they could help with external security for web sites, specifically in the areas of DDoS protection and Web Application Firewalls (WAF). Given that they had points of presence (POPs) distributed across the globe, they could add services to monitor all inbound network traffic for a web site and filter out undesirable traffic, like a denial of service attack or repeated SQL injection requests. Akamai, Cloudflare, Imperva and others offered these services, in addition to basic CDN.

At this point, technical and product evolution in the space slowed and the services became commoditized. The offerings of the CDN providers started looking similar. Web site operators could really only distinguish between CDN providers based on the number of POPs and price.

Edge Computing Emerges

CDN services were largely designed to cache large numbers of static files and distribute these to the Internet users closest to them. They performed little logic in deciding what content to distribute and didn’t offer web site operators much programmability, beyond a centralized Admin API for uploading and refreshing content.

However, in parallel, several use cases emerged that encouraged moving compute out of centralized data centers. One driver was the need to reduce bandwidth from local network entry points to the central data center. The emergence of IoT, in which large amounts of raw data are collected by millions of devices, made this a necessity. With edge computing, data can be processed, filtered and aggregated before being sent back to a central data center.

Another driver was the desire to continue to improve end user response times and capitalize on new revenue opportunities. Authentication and authorization to view content represent a good example. If a user is a subscriber to a content service (like a newspaper or video subscription service), the decision as to whether the user can access a piece of content can be made at the edge, where the content is locally cached. This moves a bit of logic (basically the same centralized server compute) to a server on the edge, resulting in a better user experience by cutting out the round trip back to the central servers. Another example is displaying personalized ads to the user, based on their viewing habits. Live video is a good use case here, where a targeted video ad would fetch a higher CPM. Finally, the growth of APIs that provide packaged server responses valid for a finite time offers another opportunity to calculate and cache a response at the edge. These APIs drive mobile apps, gaming, mapping and other applications running on a local device, where latency is particularly noticeable.
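
To make the authorization example concrete, here is a minimal, hypothetical sketch of the kind of entitlement check that could run at the edge right next to the cached content, avoiding a round trip to origin. The token format and function names are my own invention for illustration; they are not Fastly’s API.

```rust
// Hypothetical entitlement check an edge node could run before serving a
// locally cached article. The token format ("user_id:plan:expiry_epoch")
// is invented for illustration only.

fn can_view_at_edge(subscriber_token: Option<&str>, article_tier: &str, now_epoch: u64) -> bool {
    let token = match subscriber_token {
        Some(t) => t,
        None => return article_tier == "free", // anonymous users only get free content
    };

    let parts: Vec<&str> = token.split(':').collect();
    if parts.len() != 3 {
        return false; // malformed token, deny access
    }

    let plan = parts[1];
    let expiry: u64 = match parts[2].parse() {
        Ok(e) => e,
        Err(_) => return false,
    };

    // Expired subscriptions are treated like anonymous traffic.
    if expiry < now_epoch {
        return article_tier == "free";
    }

    // A "premium" plan can view everything; a "basic" plan only free + standard tiers.
    match plan {
        "premium" => true,
        "basic" => article_tier != "premium",
        _ => article_tier == "free",
    }
}

fn main() {
    // Example: a basic subscriber requesting a standard article is served from edge cache.
    let allowed = can_view_at_edge(Some("1234:basic:1893456000"), "standard", 1_577_836_800);
    println!("serve from edge cache: {}", allowed);
}
```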

These use cases provide big opportunities for distributing compute capabilities to locations closest to end users. As the volume of data generated by IoT devices increases and tolerance for lag in the newest wave of responsive apps (gaming, mapping, AR, etc.) decreases, edge computing will be increasingly in demand from content and app providers competing for customer attention.

Fastly and Edge Compute Offerings

Given this backdrop and the evolution of CDN offerings, let’s take a look at how Fastly has positioned themselves. Fastly was founded by Artur Bergman in 2011. He was previously the CTO at Wikia, which is a wiki hosting platform (like Wikipedia – articles anchored on a particular topic). As the story goes, Artur became frustrated with the capabilities of CDNs around 2010. He then decided that he could build a better solution and did so.

Interestingly, when Fastly was first getting off the ground, they referred to themselves as a CDN as well. This was acknowledged in press articles covering their fundraising activities over the course of 2014 – 2015. From Venture Beat in 2014, “Fastly grabs $40M on its quest to build a big, cool content-delivery network.” And TechCrunch in 2015, “Fastly Raises $75M For Its Real-Time CDN.” However, by 2017, references to CDN were replaced with moving computing to the edge. A Computerworld headline read, “In the need for speed, Fastly goes all the way to the edge.”

The company is today expanding its offering to include a new “edge cloud platform.” The idea of this announcement is to move Fastly away from simply being a content delivery network (CDN), as it explains. Since significant amounts of the traffic traversing the internet is actually generated by users sitting at the edge (think connected cars, wearable devices and connected homes), this addition is designed to both process and serve the data at the edge.

ComputerWorld, April 2017

Coinciding with that announcement, Fastly launched three new services – WAF, Image Optimizer and Load Balancer – in addition to its edge compute platform.

This brings us to the current Fastly product offering, which revolves around the following 6 areas.

Let’s take a brief look at each of these, and then delve more deeply into Edge Compute, where the bulk of the future opportunity lies.

  • Content delivery and image optimization. This represents Fastly’s traditional CDN offering. With POPs spread across the globe, Fastly’s CDN can deliver content rapidly to local users. They cache content on SSDs (solid state drives), which have fast retrieval rates (closer to memory than to spinning disks). This reduces response times. Fastly also advertises cache purge times of 150 ms on average, which is very fast and important for caching content that changes frequently, like election results, traffic maps or weather images (a short purge example follows this list). Activity data about content access can be streamed into a centralized log analysis tool for decision making through their Real-Time Log service. Supported logging tools include popular commercial aggregators, like Sumo Logic and Papertrail, in addition to the open syslog format. Finally, they offer a useful “origin shield” service, which protects the customer’s servers if a burst of traffic requests arrives just as the cache is invalidated.
  • Video & Streaming. This is CDN for video. Fastly can cache and deliver a wide variety of video formats. They claim a high cache hit ratio of 97%, which reduces video loading time and stress on the customer’s origin servers. They can protect video content from unauthorized access or distribution with Edge Authorization Tokens, which create expiring URLs for access. Finally, they support dynamic ad insertion, so that content providers can make on-the-fly decisions about which ad to serve in between videos.
  • Cloud Security. Like other CDNs, Fastly’s large globally distributed network allows it to naturally extend its offerings into security. It provides protection against web application vulnerabilities, DDoS and botnet attacks. Security rules are enforced at the entry point into hosted customer applications. Suspicious inbound traffic is reviewed against rulesets and automatically scrubbed to prevent malicious behavior. Bot detection is provided through a partnership with PerimeterX and prevents account takeover, scraping, digital fraud and complex application-layer attacks.
  • Load Balancing. Leveraging its footprint, Fastly is also able to offer global load balancing. Companies use this service to direct traffic to local data centers for internationalization, legal requirements (GDPR) and content versioning. Their Layer 7 load balancer allows customers to define content-aware routing decisions and instant fail-over. Fastly’s solution provides more granular control over routing than traditional DNS-based solutions, which can only make coarse-grained routing decisions.
  • Managed CDN. For customers with particular security or data privacy requirements, Fastly offers a managed CDN solution. This basically duplicates Fastly’s CDN capabilities within a customer’s own private network. Fastly can deploy their edge cloud platform on dedicated POPs within the customer’s network at multiple locations. This service can be used on a single private network, or as part of a hybrid, multi-CDN strategy.
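
Since instant purge comes up in the content delivery bullet above, here is a small sketch of what purging cached objects by surrogate key might look like against Fastly’s public API (in Rust, using the reqwest HTTP client). The service ID, API token and surrogate key are placeholders, and the endpoint shape is based on my reading of Fastly’s public purge documentation, so treat the details as assumptions.

```rust
// Sketch: purge all cached objects tagged with a surrogate key via Fastly's API.
// Requires the "reqwest" crate with the "blocking" feature enabled in Cargo.toml.
// SERVICE_ID, API_TOKEN and the surrogate key below are placeholders.

use reqwest::blocking::Client;

fn purge_surrogate_key(service_id: &str, api_token: &str, key: &str) -> Result<u16, reqwest::Error> {
    // Endpoint shape assumed from Fastly's documented purge-by-surrogate-key API.
    let url = format!("https://api.fastly.com/service/{}/purge/{}", service_id, key);
    let resp = Client::new()
        .post(url.as_str())
        .header("Fastly-Key", api_token) // authentication header per Fastly's API docs
        .send()?;
    Ok(resp.status().as_u16())
}

fn main() {
    // Hypothetical: invalidate every cached object tagged "election-results"
    // after new vote totals are published, instead of waiting for TTL expiry.
    match purge_surrogate_key("SERVICE_ID", "API_TOKEN", "election-results") {
        Ok(status) => println!("purge request returned HTTP {}", status),
        Err(e) => eprintln!("purge failed: {}", e),
    }
}
```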

Edge Compute Technology

Edge Compute is the newest offering from Fastly and the area with the most potential. In the other five product areas, Fastly’s offering is equal to or incrementally better than that of competitors. These are still areas of commoditization, where competition around features and price will continue.

Where Fastly is making its big bet and primary technology investment going forward is in edge computing. This involves providing programmability for customers in multiple, distributed locations. The benefit for customers is similar to the driver behind CDN. By moving processing and delivery closer to end users, customers can reduce latency and bandwidth requirements associated with sending all requests to a central data center. Fastly achieves programmability by exposing a hosted runtime and development environment for customers on its POP servers.

This is important for investors to consider, as this capability hasn’t existed previously. While it was straightforward for CDN providers in the past to stand-up clusters of caching servers at distributed global POPs and orchestrate the management and delivery of static content, it is an order of magnitude more complicated to enable developers to run code on these edge servers in a coordinated fashion. Also, fewer use cases existed 10 years ago to make running code on the edge a priority. Companies were just getting used to the idea of moving their centralized infrastructure to the cloud, but it was still centralized. Comfort with and capabilities around code distribution and orchestration (containers, Kubernetes, etc.) have evolved to the point where this is conceivable. Also, a plethora of use cases have emerged in the last couple of years that elevate the priority of running code closer to the edge of the network. As mentioned previously, these revolve around IoT (explosion of devices collecting data), gaming (extreme latency requirements and high bandwidth), AR (high bandwidth) and authentication (controlling access to resources).

Fastly’s intent to focus on edge computing was taken to the next level in November 2019, with the announcement of their Compute@Edge product. This offers a language-agnostic compute environment running on Fastly’s distributed network. It is designed to allow developers to build more advanced edge applications with “greater security, more robust logic, and new levels of performance”.

The platform is deployed in a “serverless” model, which means that Fastly is not continuously running servers with customer code waiting for user requests. Admittedly, “serverless” is a bit of a misnomer. A server still processes code in response to a user request. It’s just that this server is not started until the request comes in, versus the normal model of keeping a server running continuously with active threads waiting for requests.

In Fastly’s model, when a request comes to a POP, the edge compute platform spins up a web container to process that specific request. In the past, this server start-up involved noticeable latency, ranging from hundreds of milliseconds to even several seconds. If a user request has to wait more than 100-200 milliseconds for a server response (not including time to process and return content), then the lag becomes noticeable to humans. Fastly claims their servers can start up in 35 microseconds, which is extremely fast and largely obviates the historical argument against serverless modes of operation. Fastly claims this is 100x faster than competitive serverless solutions. This is a fair assessment, as AWS Lambda previously took several hundred milliseconds and still requires at least several milliseconds to boot.

The engine behind Fastly’s serverless infrastructure is based on WebAssembly, which was originally designed to allow browsers to run code at near-native speed, but has been extended to the server. WebAssembly is a great framework for this, as it is already designed for high performance, low memory footprint and security. Fastly built their own compiler and runtime, called Lucet, which leverages and extends WebAssembly for server-side execution requirements. You can read about Lucet in this blog post. It has been open sourced, so that developers can understand how it works.

Being able to run in a true serverless mode provides several advantages to Fastly’s edge compute configuration. First, POPs would require fewer servers than competing offerings. This is because the servers would only be spun up to handle each request for each customer’s app, versus needing to consider how to maintain some amount of processing overhead online for each customer, whether it is used or not. Second, there are significant security advantages to this approach. Specifically, each request can be run in a completely isolated manner. Since the processing container is spawned for just that request, there is no sharing of resources (memory, container threads, etc.). Therefore, common exploits to access the shared memory space of the server would come up short.

The Compute@Edge development and runtime environment currently supports two languages, with more language support planned in the future. First is the Varnish Configuration Language (VCL). Varnish is a web accelerator, which basically describes a system designed to cache and optimize the delivery of content over the web. Logic is added to Varnish using VCL, a domain-specific language for writing action scripts that are tied to hooks triggered at designated points in the handling of each web request. VCL will be familiar to many developers who have previously used Varnish for their own centralized web caching implementations. The second language is Rust, which is growing in popularity among developers. As the developer community considers options for next generation languages, Rust is often put forth as a leading contender. It is designed for concurrency (high performance) and security (memory safety), which are both important considerations for modern applications. Rust has been voted the “most loved programming language” in the Stack Overflow Developer Survey.
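
To give a feel for what edge logic in Rust looks like, below is a brief sketch of a Compute@Edge-style request handler. It is modeled loosely on Fastly’s published Rust SDK examples; since the platform is still in beta, the crate API used here (fastly::Request, #[fastly::main], the “origin” backend name) should be treated as an assumption for illustration rather than the exact production interface.

```rust
// Sketch of a Rust edge handler, loosely modeled on Fastly's Compute@Edge
// Rust SDK examples. The crate API shown here is an assumption for
// illustration; the backend name "origin" is a placeholder.
// Such a handler would typically be compiled to WebAssembly (wasm32-wasi)
// and compiled/run by the Lucet toolchain at each POP.

use fastly::http::StatusCode;
use fastly::{Error, Request, Response};

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // Copy the path so the request can still be forwarded (moved) below.
    let path = req.get_path().to_string();

    // Answer a lightweight health check entirely at the edge.
    if path == "/edge-health" {
        return Ok(Response::from_status(StatusCode::OK).with_body_text_plain("ok\n"));
    }

    // Reject obviously unauthorized API traffic before it ever reaches origin.
    if path.starts_with("/api/") && req.get_header("Authorization").is_none() {
        return Ok(Response::from_status(StatusCode::UNAUTHORIZED)
            .with_body_text_plain("missing credentials\n"));
    }

    // Everything else is forwarded to the customer's origin backend.
    Ok(req.send("origin")?)
}
```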

Finally, Fastly is a certified PCI DSS Level 1 service provider. They provide fine-grained controls and a security model that allow sensitive PCI or HIPAA-related content to be served. Legacy CDNs can only handle this type of content by routing it on separate, sub-optimal networks.

Customers

Fastly focuses on large enterprise customers, which they claim distinguishes them from competitor Cloudflare, which caters more to SMBs. Fastly defines an enterprise customer as one spending more than $100k annually. As mentioned in the Financials section, they reported 274 enterprise customers in Q3, growing 5% q/q and representing 86% of revenue. Fastly’s DBNER is extremely high at 135%, increasing from 132% in the prior quarter. This indicates that enterprise customers are finding new use cases for Fastly to address, which is an important consideration for future growth as edge use cases continue to expand.

Reviewing Fastly’s customer list, we see a Who’s Who list of leading Internet brands. Examples span digital publishing, e-commerce, online video, SaaS and travel. I will highlight some of the interesting customer use cases below for context.

  • Digital Publishing (Conde Nast, Gannett, NYT, Pinterest, Buzzfeed). Conde Nast uses Fastly to manage user authentication at the paywall. Having this processing at the edge likely results in a much faster response time to get the subscriber to the content they requested. Pinterest moved the “Pin It” functionality to the edge, off of centralized servers on AWS.
  • Ecommerce (Dollar Shave Club, Shopify, Wayfair, Boots, Deliveroo). Wayfair leverages the edge caching of images and uses VCL scripts to control delivery rules at a granular level, a capability other providers weren’t able to support. Shopify uses Fastly to optimize content delivery for their 325,000 (or more) individual merchant sites. They value Fastly’s real-time logging infrastructure and ability to programmatically update VCL scripts to support rapid feature updates and A/B testing. They had moved off of a legacy CDN provider who couldn’t support these capabilities.
  • Online Video and Audio (Sherpa, Vimeo, FuboTV, Shazam). FuboTV streams live and on-demand video through Fastly and also improves API response delivery. Vimeo is interesting because they host core infrastructure on Google Cloud Platform (GCP) and valued Fastly’s native integration. Vimeo uses surrogate keys to allow for rapid cache invalidation and origin shield to minimize requests back to origin servers on cache misses for popular videos. Finally, the Vimeo engineering team expressed comfort in the use of Varnish as a core component of the Fastly framework.
  • SaaS (Stripe, Github, New Relic, Pantheon, Foursquare). Foursquare was able to fine tune their caching configuration with VCL and accelerate API responses by turning over SSL termination to Fastly. Stripe moved the checkout process to Fastly and valued the security associated with Fastly’s edge solution. They also use DDOS capabilities. Interestingly, Fastly was introduced into Stripe by a new VP of Engineering, who had used Fastly at a prior company.
  • Travel (Kayak, Airbnb, Alaska Airlines, TripAdvisor, HomeAway). Lonely Planet automated their deployment process, which included pushing custom VCL scripts to Fastly. This sped up their feature development times. They also use Fastly to serve localized (language, currency) content to each user, depending on their country of origin. They claim this dynamic logic to control content selection was not possible with other CDNs.

Competition

As CDN has been around for over 10 years, a bevy of companies exist offering similar services for content delivery, global load balancing, cloud security solutions and video streaming. I won’t spend a lot of time reviewing these, as this is a fairly commoditized business. As mentioned above, Fastly distinguishes its solution incrementally in these core CDN areas and lays claim to “higher-end” use cases around video. Wikipedia provides a fairly complete list of traditional CDN providers.

I’ll focus on edge computing solutions as part of a competitive analysis, as this represents the primary growth area for Fastly and opportunity for differentiation going forward. Also, let me caveat this by saying that the space and vendor offerings are evolving rapidly.

Fastly Market Messaging

First, I’ll start out by saying that Fastly has done a great job in their market messaging and developer evangelism. Upon first visiting their web site, their focus on enabling edge computing and developer friendly solutions is clear. I didn’t find this as easily on other provider sites.

On quarterly conference calls and analyst meetings, the leadership team continuously reinforces their developer first motion, which reminds me a lot of other favorite software stack companies, like Twilio and MongoDB.

Additionally, as mentioned in the customer section above, their target market of enterprise customers is clear. Images of impressive customer logos are strewn throughout the site, along with many published customer use cases. This gives confidence to CTOs and VPs of Engineering (like me) that Fastly would be a reliable solution for their mission critical traffic.

Finally, in line with their developer-centric approach, I found Fastly’s documentation, API completeness and starter code modules to be extensive. On the “Build on Fastly” section of the site, they provide step-by-step tutorials and ready-to-deploy code, addressing many common edge computing use cases.

There are over 80 of these pre-built code blocks. I expect this to continue to grow. Even more impressive, each links to a working demo using jsFiddle, a popular interactive code playground for running code in a browser.

The example above provides code for processing regular expressions, which would be a common task associated with compute on the edge. This would provide a developer with the ability to add data processing logic to their edge compute solution, like pattern matching on desired data fields and filtering out the rest (for, say, an IoT stream).
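
To make this concrete, here is a small illustrative Rust function (using the widely used regex crate, not Fastly’s sample code) that performs the kind of field extraction and filtering described above for an invented IoT-style payload.

```rust
// Illustrative edge-side filtering: keep only the fields we care about from a
// raw IoT payload, so the origin/data pipeline receives less data.
// Uses the "regex" crate (add `regex = "1"` to Cargo.toml). Payload format is invented.

use regex::Regex;

fn extract_temperature(payload: &str) -> Option<f64> {
    // Example payload: "device=42;temp=21.5;hum=0.43;raw=ffe081c2"
    let re = Regex::new(r"(?:^|;)temp=(-?\d+(?:\.\d+)?)").ok()?;
    re.captures(payload)?.get(1)?.as_str().parse().ok()
}

fn main() {
    let payload = "device=42;temp=21.5;hum=0.43;raw=ffe081c2";
    match extract_temperature(payload) {
        // Only forward the reading we care about, discard the rest at the edge.
        Some(temp) => println!("forward to origin: {{\"device_temp\": {}}}", temp),
        None => println!("drop malformed reading at the edge"),
    }
}
```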

I realize this level of depth is a bit technical for most investors. However, my intent here is to demonstrate the lengths to which Fastly is going to make their solution appealing to developers. As I mentioned in a prior post, winning developer mindshare is one of the most important criteria for determining long term success for software stack companies. Fastly gets high marks in this area.

AWS

Amazon offers their Lambda@Edge product for edge computing. This is a combination of Amazon CloudFront, which is their popular CDN service, and AWS Lambda, which provides Amazon’s serverless solution. This is a smart strategy by AWS, as they can leverage their investments in both products for their edge offering. Lambda provides a compute environment and supports many languages. The code run on AWS Lambda is called a “Lambda function” and is executed on demand. Lambda functions can be written in Java, Go, PowerShell, Node.js, C#, Python and Ruby (plus a runtime API that allows other languages to be used). This currently provides a big advantage over Fastly’s compute environment, as Fastly edge logic can only be written in VCL and Rust. However, Fastly intends to expand to other languages in the future.

The one downside to AWS’s use of Lambda is that Lambda has been associated in the past with processing asynchronous loads. These are primarily offline jobs, for which an actual human is not waiting for a response. This was based on the foundational premise of “serverless”: that offline jobs are called infrequently and therefore don’t need a dedicated server daemon running continuously to process requests. Therefore, a start-up time of several hundred milliseconds (or seconds) was acceptable, as a trade-off for the cost reduction of not keeping a server online continuously (compute is charged by the second of use). This led to a “cold start” problem for any developer considering Lambda for a synchronous workload, as the latency would be noticeable to humans.

My understanding is that Amazon is making strides to speed up Lambda’s “cold start”, but it hasn’t reached the sub-millisecond start-up times claimed by Fastly.

In terms of solution marketing, Lambda@Edge does provide some sample use cases and a few links to blog posts with “how to” details and working code. As of the time of this writing, I counted 10 examples.

Azure

In a similar approach to Amazon, Microsoft Azure offers edge computing (called Intelligent Edge) through the use of Azure Functions, which can be deployed on Azure, Azure Stack (a managed physical appliance) or Azure IoT Edge. Like Amazon’s Lambda solution, Azure Functions were primarily designed for asynchronous processing. This extension of Azure Functions does have the benefit of applying the same programming models, data access and deployment tools for edge computing, as are available on centralized Azure.

Like Lambda, Azure Functions supports several programming languages, including C#, Java, Javascript, Powershell and Python. Documentation is pretty extensive and I found 31 pre-built code examples for sample Functions. From Functions, a developer has access to all other Azure services and data sources available, all controlled by a unified security model.

Microsoft’s edge computing solution appears very focused on two areas of differentiation. There is a noticeable emphasis on enabling IoT use cases, with a lot of explanation and support for the ability to gather data from large numbers of industrial devices. Additionally, they offer extensive machine learning capabilities and models to efficiently process all the data streaming from devices. Harnessing these pre-packaged capabilities would save a lot of time.

In terms of customers, the Intelligent Edge site lists customer stories from Airbus, Starbucks, Schneider Electric, Shell and Kroger. In all these examples, the use cases appear to focus on interactions with industrial devices, physical stores and product inventory.

Akamai

Akamai has one of the oldest CDN offerings. They were born of the Internet, so to speak, incorporated in 1998, many years before there were cloud providers.

Akamai supports all the standard CDN functions, including static content delivery, global load balancing and video streaming. A lot of their focus has shifted to security recently, providing Web Application Firewall (WAF), DDOS mitigation and bot protection. They even have a neat interactive graphic on their home page showing live attacks across the globe. I have used Akamai’s services for DDOS protection at past companies and they were sufficient.

For edge computing, Akamai offers their Intelligent Edge Platform. However, it is promoted as a “defensive shield that can surround and protect everything — sites, users, devices, data centers, clouds. It is the technology that eliminates friction and enables immersion.” It doesn’t appear to support programmability or compute at the edge locations. Under a bullet about extending business logic to the edge, the description reads “Business-critical activities such as global load balancing and URL redirection occur at the edge of the platform, closer to your customers, for performance improvement and simplified workflows.”

With that said, Akamai has a deep and impressive customer list for the services they offer.

  • 46 of the top 50 online retailers in the U.S. and Canada
  • 7 of the top 10 European online retailers
  • 47 of the top 50 television networks in the US
  • 18 of the largest asset managers
  • 7 of the top 10 world’s largest banks
  • 7 of the top 10 world’s largest insurance carriers
  • 7 of the top 10 world’s largest newspapers

Therefore, new capabilities or an acquisition of a smaller edge compute provider by Akamai should be watched closely, as they have a ready-made customer base for deep penetration in edge computing.

Cloudflare

My first introduction to Cloudflare (NET) was associated with their DDOS mitigation services. When DDOS attacks were getting a lot of news in 2010 – 2013, Cloudflare generated a fair amount of press in fighting them.

They still appear to have a bias towards security protection and DDoS mitigation. See the “Under Attack?” link at the top of their home page. This would be used by a CTO whose site is down due to an ongoing DDoS attack. Obviously, with a global network of POPs to manage their customers’ network traffic, it was straightforward for Cloudflare to evolve into other CDN services – including web content, DNS, load balancing, video streaming, even domain registration. Security offerings still dominate the product list – DDoS, WAF, bots, rate limiting, SSL, private tunnels, etc. This bias towards security is further reinforced by their recent acquisition of S2 Systems, which provides a browser isolation technology that executes browser code on cloud servers rather than on a user’s device, isolating devices from security threats.

They now advertise a serverless platform solution, called Cloudflare Workers. Their customer list for the serverless / edge solution highlights the following.

Like other serverless offerings, it supports multiple languages, including JS, Rust, C and C++. The choice of Rust is interesting, as that overlaps with Fastly. Cloudflare also adopts the model of allowing serverless compute to be distributed across their global network of POPs. They advertise much faster cold start times than the cloud providers, in milliseconds. Their documentation is pretty extensive, along with 28 code snippets in their Template Gallery.

When asked about Cloudflare on analyst calls, Fastly leadership distinguishes Cloudflare as focusing on SMB customers, versus the enterprise focus of Fastly. I can’t comment on the accuracy of this. Cloudflare (NET) recently went public, so we should expect further product expansion by them.

One note for investors, I am not downplaying the opportunity for Cloudflare or Akamai relative to their seeming focus on security. That represents a large opportunity. I am simply contrasting this with Fastly’s singular focus going forward on edge computing.

Do It Yourself

While caching solutions are available to larger companies using open source solutions like Varnish and Squid, it would require significant configuration effort for a company to set up their own globally distributed edge runtime. This is probably reserved for the largest players, like Facebook or Uber, who tend to stand-up many network and compute services themselves.

Third Party Reviews

For another perspective on the competitive landscape, we can look at a couple more data points. First, Sumo Logic recently published their Continuous Intelligence Report, which details trends in application architecture, deployment processes and management tools in the cloud, based on observations across their 2,000 customers. I covered this in a prior blog post.

If you aren’t familiar with them, Sumo Logic provides a cloud-native platform that allows companies to collect the large volume of event data created by their software applications. This application data is processed to generate “continuous intelligence” or insights that help improve operational, business and security performance. Their Continuous Intelligence Report leverages the configuration data from their customers’ application stacks to calculate usage trends for modern software application architecture and management.

In the area of CDN adoption, they provide some comparative data for customers on AWS, which arguably is the largest cloud provider currently.

Sumo Logic Continuous Intelligence Report, Dec 2019

While CloudFront has the lion’s share of usage at 25%, we would expect this, as it is easy to integrate CloudFront within AWS for basic use cases. What is more surprising is Fastly’s share at 5%. First, it is almost equal to Akamai’s, even though Akamai had about $700M in revenue in Q3, versus Fastly’s $50M. Also, both Akamai and CloudFront have been in the market longer than Fastly. Even Sumo Logic commented, “Fastly, a relatively new CDN vendor is experiencing similar adoption as Akamai, the global leader.”

Regarding the performance of serverless solutions across providers, I found a site called Serverless Benchmark, which is referenced from the Cloudflare site. Serverless Benchmark sets up a sample node.js function and runs it on each of the major cloud providers’ serverless offerings. Included are AWS, Azure, GCP, IBM and Cloudflare.

These function tests are called every 3 hours, response times are collected and then the performance data is aggregated. Of interest are the cold start benchmarks, which represent how long it takes for a serverless service to spin up and respond to a request from a “cold” state. Cold state refers to the fact that most serverless services anticipate that after one request, more will likely follow, and therefore keep the serverless “server” online for some period of time. Subsequent requests are then handled from a “warm” state.

Graph on serverless-benchmark.com, Jan 2020

While Fastly is not in the comparison set, they claim response times of 35 microseconds from a cold start for their Compute@Edge solution, which is roughly 1,000 times faster than the comparative times reported for other providers. Granted, I did not verify this information, and actual performance of Fastly in the wild may vary, but it is directionally useful. The publisher of the Serverless Benchmark site provided a write-up of his testing approach.

Addressable Market

The total addressable market for Fastly’s services is large. This can be categorized into several areas, aligning with Fastly’s product offerings. The focal point is edge computing, but we can also roughly size content delivery, video streaming, cloud security and load balancing. Commentary on market size is available in the S-1 filing from April 2019, which is still relevant.

Fastly estimates the total market opportunity for their services to be $18 billion in 2019. They expect this to grow to $35.8 billion by 2022, a CAGR of 25.6%.

For edge computing, they estimate a total market worth $2.7B in 2019, growing with a very high CAGR of 35.4% to $6.7B in 2022. This is based on data from Markets and Markets, an independent market research firm commissioned by Fastly. The definition of the edge compute market is skewed towards IoT in this case, which represents an opportunity for Fastly, but may not capture all the custom application logic use cases being created by customers.
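
As a quick sanity check on the edge figure, the growth rate implied by the two endpoints can be recomputed with the standard CAGR formula (my own arithmetic):

\[
\text{CAGR} = \left(\frac{6.7}{2.7}\right)^{1/3} - 1 \approx 35.4\%
\]

which matches the 35.4% rate cited above for 2019 to 2022.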

For the other CDN-like services, estimates for web content delivery and video streaming combined total $7.3B in 2019, growing to $16.7B in 2022. Web content delivery is expected to grow at a CAGR of 29.5% and video streaming will experience a CAGR of 32.3%.

Cloud security includes WAF, bot detection and DDOS prevention. Combined, these markets represent about $5.1B of spend in 2019, growing to $8.7B in 2022.

Finally, traffic management solutions represents the market for Fastly’s load balancing and routing solutions. This market is projected to grow from $2.9B in 2019 to $3.7B in 2022.

Leadership

Fastly is founder-led, which is one of my characteristics of a successful software stack company. They have a strong leadership team with extensive experience in the domain.

Artur Bergman founded Fastly in 2011 and co-wrote the first implementation of their content delivery service. This reminds me of other hands-on software stack company founders, like Jeff Lawson (Twilio) and Eliot Horowitz (MongoDB). Prior to Fastly, Artur held engineering leadership positions at Wikia and SixApart.

Adriel Lares has served as CFO since May 2016, so he navigated Fastly through the IPO. Prior to Fastly, he served in CFO and finance leadership roles at Lookout (mobile security), HP (data and information storage division) and 3Par (acquired by HP, data and information storage).

Other notes:

  • SVP Infrastructure was formerly co-founder and CTO of DYN, a major DNS services provider. Reflects experience in scale and network infrastructure.
  • SVP Product held product roles at OpenDNS, Rapid7 and RSA. Obviously, a strong security background.
  • EVP Sales previously held sales leadership roles at Oracle, Turn, Adobe and Omniture. Enterprise sales focus.

Going Forward

As mentioned in the introduction, I am not setting a formal 5 year price target on Fastly, as I generally avoid investment in companies that have IPO’ed within the last 12 months. With that said, I think that Fastly offers a compelling consideration for investment. Their edge computing solution is market-leading and represents a growing opportunity with forward-thinking enterprises. With the growth of IoT, gaming, APIs and authentication, the number of use cases requiring compute at the edge is increasing. These secular tailwinds should benefit Fastly.

Risks primarily revolve around competitive threats from other cloud providers and the financial requirements for Fastly to scale and ultimately reach profitability. I plan to monitor FSLY closely over the next couple of quarters and may open a position then. Otherwise, I generally expect FSLY stock to appreciate meaningfully in the next 5 years, as they continue to capitalize on their leadership in this growing market.

4 Comments

  1. Brendan

    Your insights have been very helpful. I especially have enjoyed your detailed analysis of product and market structure.

    Let me know if I’m off here, but my view is that expanding the platform to new languages does not necessarily accelerate product adoption. Isn’t it more product or use-case driven? For example, Ruby on Rails didn’t necessarily make Shopify popular but more like Shopify helped make Ruby on Rails popular.

    • poffringa

      Thanks for the feedback, Brendan. Regarding Fastly’s expansion of support for languages on their Compute@Edge platform, I agree with you that use cases will primarily drive adoption by customers. Support for more languages makes it easier for the development teams within those Fastly customer organizations to implement their initial use case on Fastly and expand to future ones. Those teams could learn Rust or VCL, but would likely prefer to maintain consistency with whatever primary language(s) they use for other back-end applications. We can assume that AWS Lambda and Azure Functions thought about this with their choice to support mainstream languages like Java, JS, C#, Go, Python, etc, out of the gate. Fastly would benefit by continuing to add support for other languages (which is their plan), as choice of language might become a competitive differentiator in the future, if other criteria like security and performance become equal. Cloudflare probably represents the main threat here, as they seem to be trying to move upstream to enterprise companies and already support Rust, along with JS, C and C++ for their Workers product.

  2. Brendan

    Also, an interesting but possibly meaningless data point: on Stackshare, Fastly receives minimal inclusions in stacks or reviews. Cloudflare has several orders of magnitude more inclusions in stacks and reviews. I think this possibly reflects low brand awareness for the company and the nascent stage of use cases for edge compute?

    • poffringa

      Interesting point – you are probably right. My assumption is that Cloudflare has more representation on Stackshare due to their DDOS and CDN offerings, versus edge compute. They also have more traction with the smaller companies that tend to self-report actively on Stackshare than mainstream enterprises. Nonetheless, just like the Sumo Logic survey I mentioned in the article, Stackshare activity should be monitored going forward for adoption hints as well.