After two quarters of mixed results, Snowflake reignited investor sentiment with their Q2 earnings report. Revenue beat prior estimates for the quarter by a large margin, with management upping forward projections as well. Customer activity was the highlight of the quarter with record additions of $1M customers and surprising linearity in DBNRR. As we are two quarters past platform optimizations, Snowflake may be starting to benefit from additional workload migrations by large customers. Looking forward, their product strategy to expand the reach of the Data Cloud and bring application development directly onto customer data should provide additional drivers of platform utilization.
Growth Metrics
After a smaller than normal beat on Q1 revenue, Snowflake delivered enormous outperformance for Q2. Going into Q1, they had projected product revenue growth of 80% year/year and actually delivered 84% growth, about a $9M increase over their prior guidance. That beat was also smaller than Q4's, which exceeded guidance by about 7% of annualized revenue.
For Q2, Snowflake leadership had projected product revenue in the range of $435M-$440M, representing year/year growth of 71%-73%. After finishing Q1 with 84% year/year growth, this appeared to portend a significant deceleration. To make matters worse, coming into the Q2 report, expectations were decidedly low. Several sell-side analysts had downgraded SNOW stock from a Buy to Neutral rating in the weeks leading up to earnings on August 24th. One even initiated coverage at Sell. The analysts referenced "channel checks" that seemed to reflect more customers cutting back on Snowflake spend and even forecast some competitive encroachment. Here are the main, market-moving actions:
- BTIG: Aug 2, Downgrade from Buy to Neutral
- Guggenheim: Aug 12, Initiated at Sell with $125 PT
- Citigroup: Aug 16, Placed on Negative Catalyst Watch
- UBS: Aug 16, Downgrade from Buy to Neutral
Each of these created pressure on SNOW stock when announced. Given these concerns, I think the market was expecting a poor report, with revenue just meeting the company's prior estimates, or even a slight miss. However, Snowflake delivered the opposite for Q2. Actual product revenue was $466.3M, up 83% year/year and 18.2% sequentially. This exceeded the midpoint of their prior guidance by about $29M, or roughly 11 points of annualized revenue growth over the projected 71%-73% range. Q2's 83% revenue growth was nearly flat with Q1's 84%. The sequential growth implies an even higher growth rate when annualized (about 95%).
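For reference, annualizing a sequential growth rate just compounds it over four quarters. A quick sketch of that math, using only the 18.2% sequential figure from the report:

```python
# Annualize a sequential (quarter-over-quarter) growth rate by compounding
# it over four quarters. 18.2% is Q2's sequential product revenue growth.

def annualize(seq_growth: float) -> float:
    """Convert a quarterly growth rate to its annualized equivalent."""
    return (1 + seq_growth) ** 4 - 1

print(f"{annualize(0.182):.1%}")  # ~95.2% annualized from 18.2% q/q
```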
Looking forward, Snowflake management guided Q3 above analysts' estimates and raised full year guidance, but by smaller margins. For Q3, they are projecting product revenue of $500M-$505M, for 60%-62% annual growth and 7.8% sequential growth at the midpoint. If they deliver the same size relative beat as Q2, revenue growth would reach into the low 70% range annually and 14% sequentially. This would represent deceleration, but we have to keep in mind that year-ago revenue growth (Q3 FY2022) was phenomenal at 110%, providing a tough comparison. The Q3 projection beat analyst estimates for just under 60% growth in total revenue.
For the full year, Snowflake raised their product revenue target by about $18M to a range of $1.905B-$1.915B, for 67%-68% annual growth. This is up from the company’s guidance from Q1 in which they reiterated 65%-67% growth. The $18M raise is smaller than the $29M Q2 beat, but I think the market was happy that they raised the range at all. Total revenue is about 107% of product revenue. This implies that Snowflake will end this year with about $2.04B in total revenue.
If the sequential revenue growth rates each quarter drop into the 10-12% range in 2023, we could see 50-60% revenue growth next year. That would bring total revenue over $3.1B for 2023, lowering the forward EV/S ratio to about 19, even with the large stock price surge following earnings. With next calendar year just a few months away, this seems reasonable for high revenue growth and strong FCF generation. As I will discuss later, the key question will be how durable that elevated revenue growth rate is for several years into the future. If revenue growth remains over 50%, then compounding quickly brings down the valuation multiple further.
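To make the compounding in that scenario concrete, here is a rough sketch of the math. The ~$2.04B base comes from the guidance discussed above; the sequential rates are the scenario's assumptions, not company projections, and the low end compounds to slightly under 50%:

```python
# Scenario math: compound a range of sequential growth rates over four
# quarters and apply them to the ~$2.04B estimated FY total revenue base.
# The sequential rates are assumptions, not guidance.

def annual_growth(seq: float) -> float:
    return (1 + seq) ** 4 - 1

base_revenue = 2.04  # $B, estimated total revenue for this year
for seq in (0.10, 0.11, 0.12):
    growth = annual_growth(seq)
    next_year = base_revenue * (1 + growth)
    print(f"{seq:.0%} q/q -> {growth:.0%} y/y -> ${next_year:.2f}B")
```

The mid-to-high end of that range lands above $3.1B in total revenue, which is where the forward EV/S estimate of about 19 comes from.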
Other growth indicators in the Q2 report were favorable as well. At the end of Q2, RPO was $2.7B, up 78% y/y, but just 4% sequentially. This compares to Q1's RPO of $2.6B, which was up 82% y/y. While this RPO linearity might cause concern, management signaled that the growth of RPO in Q4 should be exceptional. This is because many customer contract renewals are increasingly lining up in Q4, and some customers are even "bridging" prior contracts so that they align this way. I imagine this helps IT leaders with their annual budgeting process. This circumstance makes RPO on a quarterly basis a noisy indicator – investors will need to look at RPO growth annually for a more reliable signal.
With that said, there was a large increase in current RPO in Q2, which represents the amount of RPO expected to be realized in the next 12 months. This jumped from 53% of total RPO in Q1 to 57% of RPO in Q2, representing about $1.539B in revenue. Current RPO was up 79.6% year/year, slightly higher than the growth in overall RPO and equal to Q1’s annual growth rate in cRPO. I think this linearity in cRPO growth was another aspect of the strong Q2 report, because it implies that revenue growth for the next 12 months should remain high.
Management did offer some soothing commentary about the demand environment. First, they aren’t seeing headwinds in Europe. Second, regarding the slowdown in spending from some companies called out in Q1, the results were mixed. Some companies started increasing spend again in Q2, while others didn’t. The key take-away is that Snowflake is still benefiting from significant customer expansion and durable growth. While they are guiding for macro impact, the current customer trends are not raising alarms.
Sponsored by Cestrian Capital Research
Cestrian Capital Research provides extensive investor education content, including a free stocks board focused on helping people become better investors, webinars covering market direction and deep dives on individual stocks in order to teach financial and technical analysis.
The Cestrian Tech Select newsletter delivers professional investment research on the technology sector, presented in an easy-to-use, down-to-earth style. Sign-up for the basic newsletter is free, with an option to subscribe for deeper coverage.
Software Stack Investing members can subscribe to the premium version of the newsletter with a 33% discount.
Cestrian Capital Research’s services are a great complement to Software Stack Investing, as they offer investor education and financial analysis that go beyond the scope of this blog. The Tech Select newsletter covers a broad range of technology companies with a deep focus on financial and chart analysis.
Profitability Measures
Snowflake was able to maintain its profitability performance in the quarter. Non-GAAP product gross margin for Q2 improved to 75.1%, up from 73.6% a year ago and even with Q1’s 75.2%. This outperformance prompted management to update their guidance for Non-GAAP product gross margin for the full year from 74.5% to 75%. As part of their long term plan, they have set a FY2029 target to reach 78% Non-GAAP product gross margin.
Non-GAAP operating income was $17.5M, for an operating margin of 3.5%. This is up from an operating loss of $21.9M a year ago, which represented an operating margin of -8.0%. In Q1, Non-GAAP Operating income was $1.7M for an operating margin of 0.4%. I like to see this steady improvement. Given this performance, leadership raised the full year FY2023 operating margin target from 1% to 2%.
Free cash flow increased significantly from a year ago, but did step back from Q1’s stellar performance. FCF can be a little lumpier than operating income, based on billing cycles for large customers. In Q2, total FCF was $53.8M for a FCF margin of 10.8%. This is up significantly from a year ago where FCF was -$12.0M and FCF margin was -4.4%. In Q1, Snowflake delivered $172.4M of total FCF for a margin of 40.8%. For the full year, they raised the adjusted FCF margin target from 16% to 17%.
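As a quick check of the margin math, total revenue can be approximated from product revenue using the ~107% ratio the company cites, so treat the denominator here as an approximation:

```python
# Back-of-envelope FCF margin check. Total revenue is approximated as
# product revenue x ~107%, per the ratio management cites.
fcf = 53.8                              # $M, Q2 free cash flow
product_revenue = 466.3                 # $M, Q2 product revenue
total_revenue = product_revenue * 1.07  # approximation
print(f"{fcf / total_revenue:.1%}")     # ~10.8%
```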
Hiring
Snowflake continued to hire aggressively in Q2. While other companies are moderating their pace of hiring, Snowflake leadership is keeping their foot on the gas. Overall headcount increased by 9.5% sequentially in Q2, down somewhat from Q1's usual hiring surge of 14.2% q/q growth. Each individual department increased headcount by 8-13% sequentially, with both R&D and S&M growing their headcount by more than 50% on a year over year basis. For the first half of 2022, S&M headcount is up 28%.
These investments demonstrate Snowflake leadership’s confidence in the opportunity going forward. Since sales reps can take 9-12 months to ramp, leadership anticipates plenty of demand to chase next year. They are also keeping the product development funnel flowing with R&D headcount up 57% y/y. Both of these investments will drive growth.
While the macro environment has been dampening enterprise spend overall and thereby impacting the growth rates of software service companies, there is a side effect that investors should keep in mind. During periods like this, private companies are put under more pressure to control spending. This is because they typically don’t have as large a cash reserve as their public counterparts. And if they do, their VC backers will typically insist that they manage expenses to extend their “runway”, preventing the need to raise capital again and trigger the dreaded down round and significant dilution.
When economic slowdowns cause IT services companies to consider reducing costs, the impact is magnified for private companies. They have to slow investment in staffing or even trim headcount more aggressively to keep expenses in check and minimize cash burn. This activity (or even the messaging around it) can be very disruptive for a sales team in particular, where the best salespeople will react quickly to perceived risk. These effects will mitigate competitive pressure for the leading public software service providers, introducing a tailwind of reduced competition for deals with enterprise customers.
With almost $4B in cash and short term investments, Snowflake can afford to continue investing heavily for future growth, understanding that the ROI will be delayed until the macro environment improves. With that said, competitor Databricks is also investing aggressively in staffing to drive future growth. According to a Techcrunch article on August 12th, Databricks started 2022 with 3,000 employees, have over 4,000 now and plan to end the year with 5,500. As of the end of July, Snowflake had 4,991 employees and will probably end the year between 5,500 and 6,000.
Databricks is closing the gap in headcount quickly. Interestingly, Databricks reported that their annualized revenue run rate crossed $1B at the same time. This compares to Snowflake's nearly $2B in annualized revenue. This implies that either Databricks is investing far in advance of the opportunity or is operating less efficiently than Snowflake (near parity on headcount with half the revenue). The two companies have about the same revenue growth rate at this point. In the same article, Databricks reported that their 80% growth from 2021 "has not slowed", and Snowflake just put up 83% growth with some deceleration implied for the remainder of the year.
It will be interesting to see if the two companies adjust their investment going forward. Regardless, at these hiring rates, they will both crowd out smaller competitors by fielding even more sales reps, while private companies are likely delaying incremental sales staffing.
Customer Activity
Customer activity provided another area of positive performance for Snowflake in Q2. After a slowdown in Q1, total customer additions ticked back up in Q2. Snowflake ended Q2 with 6,808 total customers, which was up 486 from Q1, representing 7.7% sequential growth and 36.4% annual growth. The rate of annual growth has been dropping, but if Snowflake maintains sequential growth in the 8% range, annual growth will stabilize in the mid-30% range (8% compounded over four quarters is about 36%). Given Snowflake's enormous NRR (over 170%), a total customer growth rate in the 30-40% range will keep revenue growth high.
Large customer growth is clearly driving the outperformance in revenue. For Q2, Snowflake added a record 40 new customers with trailing 12 month revenue over $1M, bringing the total to 246. This represented 19.4% sequential growth and brought the annual growth rate back over 100% to 112%. This was the largest sequential increase in $1M customers to date.
It’s also worth noting that these $1M customers represent just 4% of total customers. While not all customers can be expected to ever reach $1M in spend, a lot of them could. This implies that Snowflake has plenty of room to expand their $1M customer count. As part of their long term projection for $10B in product revenue by FY 2029, they are assuming about 1,400 $1M customers with an average spend of $5.5M. Given the trajectory this year (246 customers at $1M growing over 100% y/y), they appear to be well on-track to hit that long term goal.
This growth in large customer spend is reflected in Snowflake’s Dollar-Based Net Revenue Retention Rate (DBNRR rate). Loosely, this captures the increase in spend y/y for existing customers, including churn. For Q2, the DBNRR rate was 171%, which is remaining stubbornly high and even exceeded the rate from a year ago.
Leadership has indicated that they expect the rate to decrease over time. Based on commentary from Q4's results, the CFO expects NRR to stay above 150% for FY 2023 (this calendar year), but added that he predicts "it will remain well above 130% for a very long time." Combined with continued growth in customer additions, these two factors should allow Snowflake to maintain 50% or higher revenue growth for a few more years.
I think these customer trends were the major contributors to the surge in SNOW stock price following earnings. The growth of large customers this quarter broke the narrative that large customers are reaching a saturation point and slowing down spend. We are actually seeing the opposite effect, which management emphasized during the earnings call. They provided commentary describing how their largest customers are still increasing their spend at very high rates, even as they pass the $10M mark or more.
Additionally, on the earnings call, the CFO delivered an update on Global 2000 customers. Snowflake added 12 new G2K customers in Q2 to reach 510. This is up from 498 in Q1, where 15 were added. Snowflake has now penetrated 25% of the G2K, leaving a lot of room to grow. The sequential rate of additions of this customer type has been slowing down each quarter, which will be a signal to watch. On the earnings call, the CFO stated that they expect these accounts to become their largest customers over time.
Once landed, these Global 2000 customers are increasing their spend rapidly. The CFO stated that the average trailing 12-month product revenue from G2K customers grew by 14% sequentially and is now $1.2M. On an annualized basis, that sequential growth rate would be about 69%, which is in line with the ~71% net spend expansion implied by the 171% DBNRR rate. Overall revenue growth will be supported by this high DBNRR rate, but Snowflake will need to keep landing these future large customers to keep the spend expansion funnel full.
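The comparison to DBNRR works because a 171% retention rate implies roughly 71% net spend growth from the existing customer base. A quick sketch of both sides of that comparison:

```python
# Annualize the 14% sequential growth in average G2K customer spend and
# compare it to the net expansion implied by the 171% DBNRR.
g2k_seq_growth = 0.14
g2k_annualized = (1 + g2k_seq_growth) ** 4 - 1   # ~0.689
dbnrr = 1.71
implied_net_expansion = dbnrr - 1                # 171% retention => 71% net growth
print(f"{g2k_annualized:.0%} vs {implied_net_expansion:.0%}")  # 69% vs 71%
```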
Snowflake leadership has discussed in the past how many large customers are not in the G2K, usually representing digital natives that have a large data footprint. During their Investor Day presentation in June, the CFO pointed out that only 45% of their customers spending over $1M annually in product revenue are in the Global 2000.
During the earnings call, the CFO referred to an example in which “a Global 2000 technology company is now a top 10 product revenue customer less than two years after signing their initial deal.” A top 10 customer generates over $10M in product revenue, based on the metric from Q1 that they had 10 customers spending more than $10M. This underscores how fast customers can become large, where the referenced technology company went from $0 to $10M+ in annual spend in less than two years.
During Investor Day, the CFO provided an example with a large Telecom customer of how this happens. The customer started using Snowflake as part of a migration from their on-premise data warehouse. The first year was spent kicking off the migration, allowing them to really start ramping up workloads in year two. These workloads encompassed data warehouse activity, and also began addressing other application workloads associated with a mixture of OLAP and OLTP (transactional) activity. The end result was that compute credit consumption increased by 6x in the second year.
However, didn’t Snowflake introduce performance optimizations earlier this year, which should be offsetting this expansion in credit consumption? As investors will recall from the Q4 earnings report, Snowflake management revealed that they rolled out platform efficiencies that would deliver a 10-20% improvement in workload performance on average. In a simple example, if a customer could run 100 queries for every Snowflake consumption credit before the optimization, now they can run 110 to 120 per credit.
In the near term, this allowed manual and automated customer jobs to accomplish the same work for less consumption of credits. Since Snowflake revenue is recognized on consumption and not contract sales, the impact materialized immediately in revenue recognition, as fewer pre-sold credits were consumed. We saw this effect in the Q4 and Q1 results, as revenue growth was muted. Snowflake was experiencing headwinds in customer spend expansion from the combination of the platform optimization and enterprise scrutiny of usage, due to the macro environment.
However, for Q2, we saw overperformance in revenue generation as evidenced by the large beat and near linearity in annual growth rates to Q1. Q2’s sequential revenue growth of 18% accelerated from Q1’s rate of 10% and even Q4’s rate of 15%. While the macro backdrop lingers, customer spend expansion was surprisingly strong, reflected in the DBNRR rate of 171%.
Perhaps an explanation is provided by Snowflake management’s thesis that platform optimizations eventually drive greater utilization over time, as large customers find the unit economics more compelling and move additional workloads to Snowflake. During the Investor Day presentation in June, the CFO provided an example of this.
In this example, a large Retail customer experienced a platform optimization (labelled as Quarter 2). In the next quarter, the realized product revenue for Snowflake did in fact dip below their prior contracted forecast. However, two quarters after the optimization, their product revenue surpassed the previously anticipated usage level, as a result of moving more workloads to Snowflake. From that point forward, the actual utilization continued to accelerate relative to the previous contracted commitment.
Since the most recent platform optimization was rolled out in Q4, we are now past the two quarter period. It may be that the workload expansion effect started kicking in during Q2, providing an explanation for the revenue outperformance. While Snowflake’s forward revenue guidance is still tempered by the macro environment, this expansion following an optimization may start to provide an increasing tailwind for large customer spend over the next few quarters.
Data Ecosystem
The strong growth of Snowflake’s data ecosystem continued in Q2. In their Investor Day presentation in June 2022, management increased the size of their TAM for the Cloud Data Platform to $248B by 2026. This encompasses all of their data management products. It was updated to include the expected impact of Unistore, Cybersecurity and fuller realization of Data Science / ML and data-rich applications.
Even with this, they expect an even larger addressable market for the full “Data Cloud”, which they aren’t sizing at this point. The Data Cloud encompasses the extension and usage of their cloud data platform to unlock new ecosystem and partner utilization patterns. Those patterns go beyond the existing enterprise customer workloads for processing and delivering their internal data to support a broader ecosystem of uses driven by exchange and enhancement. Programs expected to drive the broader Data Cloud include Data Sharing, the Data Marketplace and the Powered By Program.
In Q2, Snowflake leadership introduced a new slide to the Investor Presentation which summarizes their “Data Cloud Metrics”. Prior to this, management would reference these metrics separately, usually in the prepared remarks. Q2 represented the first quarter in which all three of these metrics driving growth of the larger Data Cloud opportunity were included together in a formal slide. I think this reflects Snowflake leadership’s intent to focus on these as formal KPIs going forward (versus mentioning them sporadically, and predominantly when positive).
To capture Data Sharing activity, Snowflake reports a measure called “stable edges”. Snowflake leadership sets a high bar for considering a data sharing relationship between two companies as actively being used. In order to be considered a stable edge, the two parties must consume 40 or more credits of Snowflake usage each day over a 6 week period for the data sharing relationship. I like this measure, as it separates empty collaboration agreements from actual value creation.
In Q2, the total number of stable edges between customers grew by 112% y/y. This compares to growth of 122% in Q1. Of total customers, 21% have at least one stable edge. This is up from 20% last quarter and 15% in the prior year. Applied to customer counts, this means that 1,429 customers had at least one stable edge in Q2, versus 749 a year ago. The total number of customers with at least one stable edge nearly doubled over the last year. That growth reflects the value of data sharing, as a stable edge costs money to maintain. This growth in customer counts with a stable edge actually accelerated over Q1, with 91% growth in Q2 versus 86% in Q1. So, while the overall increase in stable edges slowed q/q, the number of customers utilizing a stable edge increased.
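The customer counts in that paragraph can be reproduced from the disclosed percentages (the year-ago base is backed out from the 749 figure, and rounding may differ by one from the numbers cited):

```python
# Reproduce the stable-edge customer counts from the disclosed percentages.
total_customers_q2 = 6808
edge_customers_q2 = round(total_customers_q2 * 0.21)   # ~1,430 with a stable edge
edge_customers_yr_ago = 749                            # 15% of the year-ago base
growth = edge_customers_q2 / edge_customers_yr_ago - 1
print(edge_customers_q2, f"{growth:.0%}")              # ~1,430 and ~91% y/y
```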
Facilitating these data sharing relationships represents a competitive advantage for Snowflake, in my view. They increase customer retention, generate network effects to attract new customers and drive incremental utilization as shared data sets are filtered, cleansed and combined with other third party data. This network of data sharing relationships elevates Snowflake’s value proposition for customers onto a higher plane beyond focusing on tooling for analytics and ML/AI workloads within a single company.
Data sharing is also a popular feature for Snowflake’s large customers. For Q2, leadership reported that 65% of $1M customers have at least one stable edge. This is up from 63% at the end of Q1. Given that the penetration of stable edges is greater for large customers, it may be that these data sharing relationships encourage higher usage levels. Enterprises may be exchanging data with other parties and then applying more data processing to combine, enrich, enhance and then share that data.
To enable data sharing and enrichment, Snowflake’s Data Marketplace provides users with access to relevant data sets from third-party data providers. Companies can subscribe to these data sets for a fee and then seamlessly combine them with their Snowflake instance through data sharing. This eliminates the overhead of setting up separate integration processes to import, filter and combine this data. Additionally, secure data sharing handles updates automatically. That represents a huge cost savings. At the end of January (Q4), Snowflake had 1,100 data sets from 240 providers. For Q1 FY2023, listings grew 22% q/q to 1,350 data sets from over 260 providers. For Q2, marketplace listings grew another 13% sequentially to 1,539.
If the Data Marketplace is seeing strong growth, the Snowflake Powered By program appears to be garnering even more participation. This represents companies that have decided to build their data-driven product or service on top of Snowflake’s platform, that they then sell to their customers. For Q1, Snowflake announced there were 425 Powered by Snowflake partners, representing 48% growth over the prior quarter’s count of 285. For Q2, Powered By participation took another large jump forward, increasing by 35% q/q to reach 590 registrants.
That is quite a jump. As these companies grow their businesses, their consumption of Snowflake resources should increase significantly. As part of Investor Day in June, leadership revealed that 9% of their $1M+ customers were in the Powered By program. They ended Q1 with 206 customers of this size, implying that 18-19 Powered By participants were generating more than $1M in annual product revenue. With the rapid expansion of the Powered By program, it’s likely that its relative contribution to the $1M+ customer count will grow over time.
This is because Powered By participants inherently generate high utilization of Snowflake. In their case, the foundation of their service infrastructure is running on Snowflake’s platform. This is in contrast to the normal enterprise use cases around analytics and machine learning. As more companies choose to build their business on top of Snowflake, we will likely see this contribution to utilization grow faster. In a sense, the Powered By program elevates Snowflake to spending patterns on par with the hyperscalers (which are usually the largest line item in an IT budget).
Acquisition
As part of the Q2 results, Snowflake announced the intent to acquire Applica, based in Poland. Applica provides services to automate the ingestion and decomposition of unstructured documents into usable data structures. To accomplish this, they have developed models and deep learning methodologies to process documents “similar to the way the human mind does”. This takes into consideration layout, graphics, and semantics, enabling precise data capture for typical business processes.
Applica claims to be heavily R&D driven, with over 80% of staff working on the product. Personnel include data scientists, mathematicians, statisticians, computational linguists and philosophers. They have about 30 staff members in R&D and claim over half hold PhDs. With the shortage of top engineering talent, this would represent a valuable asset. Given that most personnel are located in Poland, retention should be favorable after acquisition.
Users can apply Applica’s product to extract information from any type of document without coding required. They claim that their technology is unique in that it can be applied to a variety of document types without requiring large amounts of training data in advance of learning to deconstruct each type. The system learns quickly as it processes new documents, and applies that to new document structures without requiring significant new supervised learning.
Applica has developed a unique technology that accounts for variability— and processes documents regardless of their structure. Most AI requires supervised machine learning using large volumes of annotated documents. Thanks to our progressive neural language modeling techniques, Applica reduces supervised learning to a minimum.
Because Applica’s deep learning-based solution is pre-trained on huge datasets, it “learns” to recognize all kinds of text and, unlike machine learning-based solutions, isn’t limited to specific layouts.
The advantage? The majority of training will already be done by the time you log on to Applica. Simply input your own documents. Only minimal fine-tuning is needed to fully process them.
Applica Web site
Snowflake’s intent with the acquisition is to add staff and technology capabilities. According to Snowflake’s SVP of Product, they will apply Applica’s capabilities and staff to improve the Snowflake platform’s ability to process unstructured documents and store that data in the Data Cloud. We don’t know the terms of the acquisition and it likely doesn’t represent a meaningful revenue contribution. Applica’s technology and team should further accelerate Snowflake’s capabilities in processing and storing unstructured data types for their customers.
Product Commentary
Snowflake released a lot of product additions and enhancements during their Summit conference in June. I provided an overview of everything launched in a prior post. Combined with some of the commentary from the Q2 earnings call, these are the broad product strategy themes for Snowflake that I see emerging:
- Single Data Store. Snowflake clearly aspires to provide a compelling reason for enterprises to locate as much data as possible in the Data Cloud. The announcement of Unistore adds transactional (OLTP) support to the Snowflake platform. While I don’t expect every consumer application to replace their existing transactional database with Snowflake, it starts the process of moving in this direction by replacing databases typically provisioned to handle high concurrency, low latency workloads that serve summary data output from a Snowflake job. During Summit, Snowflake’s customer Western Union talked about replacing a Cassandra cluster that serves pricing data to their customer-facing applications.
- Enable Application Workloads. The introduction of the Native Application Framework represents a first step in allowing developers to build new data-rich application services in an environment that has direct access to their customer’s data. Typical SaaS applications bring customer data into their own hosting environment. This is problematic from a security point of view and generates redundant data storage cost, which is passed on to customers. With the Native Application Framework, a new breed of SaaS developers can move their applications to work on the customer data in Snowflake directly, without needing to store a separate copy of it. I think this could represent a disruptive development for typical enterprise SaaS application delivery.
- Programmability. Besides providing the runtime environment, Snowflake is bringing the tools to support real programmability. The integration of the Streamlit acquisition will provide a framework for developers to easily build and run data-rich applications. Snowpark for Python makes Python’s rich ecosystem of open-source packages and libraries accessible in the Data Cloud. With a highly secure Python sandbox, Snowpark for Python runs on the same Snowflake compute infrastructure as Snowflake pipelines and applications written in other languages. Snowflake has struck a relationship with Anaconda, which extends access to more Python packages within the Snowpark development environment. On the Q2 earnings call, leadership said that many large customers are eagerly waiting to use Python.
Snowpark for Python is red hot and people are chomping at the bit for us to declare it GA, which is something – and we have customers that are really wanting us to let them use it in production now, some of the largest customers that we have. So, the pressure is on, because the demand is there.
Snowflake Q2 earnings call
- Powered By Program. As I discussed, I think that Snowflake’s Powered By program has a lot of potential. Given its growth, I would even speculate that in the future, revenue from Powered By approaches revenue from regular customer use of the Snowflake platform. This is because Powered By program participants are building their entire business on the Snowflake platform. We have already seen several sizable security vendors take this approach.
- Vertical Solutions. Snowflake is focusing on building out industry verticals and workload verticals. In each case, the Snowflake Data Cloud is leveraged to deliver capabilities with specific benefits for that vertical. For industry verticals, this takes the form of data sharing, plus additional governance and security capabilities. The first workload vertical introduced was Cybersecurity, in which Snowflake provides resources and support that allow enterprise customers to power their own internal security practice using Snowflake’s data processing engine. I think there will be more of these; I could see similar efforts in the future around data monitoring (observability), marketing or financial services.
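To make the Snowpark for Python model from the Programmability item above more concrete, here is a minimal sketch of what developer code looks like. The table and column names are hypothetical and the connection parameters are placeholders, so treat this as illustrative rather than production code. The key point is that the DataFrame operations are translated to SQL and pushed down to Snowflake’s compute, so the Python logic runs where the data lives.

```python
# Illustrative Snowpark for Python sketch. The DataFrame operations below are
# compiled to SQL and executed on Snowflake's compute, so data stays in place.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Placeholder credentials -- substitute your own account details.
session = Session.builder.configs({
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Hypothetical table: lazily build a query; nothing runs until an action like show().
orders = session.table("ORDERS")
top_customers = (
    orders.group_by(col("CUSTOMER_ID"))
          .agg(sum_(col("AMOUNT")).alias("TOTAL_SPEND"))
          .sort(col("TOTAL_SPEND").desc())
          .limit(10)
)
top_customers.show()
```

Because the heavy lifting happens server-side on the same warehouses that run SQL workloads, every query like this consumes credits, which is why Python support feeds directly into the consumption revenue model.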
Underpinning these business opportunities are the capabilities of the Snowflake Data Cloud. While competitors knock Snowflake for being a “closed” system, I think this works to Snowflake’s advantage. They deliver a working product that requires little specialization to consume or support. They can also move quickly, without being bound by open standards or interoperability requirements between interfaces. If they need to tweak the storage model to improve performance, those decisions can be made internally.
In other categories of software infrastructure, this open versus closed argument doesn’t seem relevant or important. A reasonable parallel is Datadog (closed) as compared to Elastic (open). It is true that developers can view Elastic’s source code and more easily customize the functionality if needed. But, at the end of the day, DevOps personnel are more interested in a working system that can quickly address their observability needs and doesn’t require a lot of infrastructure configuration.
In fact, some open data management systems moved to a cloud-based managed service for just this reason. MongoDB offers Atlas so that DevOps teams don’t need to provision and manage MongoDB clusters themselves. Confluent does the same. As a result, their cloud-based services are growing faster than their equivalent self-hosted products. The cloud-hosted services are effectively closed, at least from the perspective that software engineers can’t apply their own code patches to the runtime. They prefer the simplicity of a managed service, so that they can focus their time on doing their primary job, which is to create incremental business value.
With all that said, I think Snowflake put the “closed” system argument to rest by adding support for Iceberg Tables. These give customers the ability to move data in and out of Snowflake using the open Apache Iceberg format. Iceberg is a popular open table format for analytics, bringing the performance, reliability and simplicity of SQL tables to big data. It is compatible with other data processing engines like Spark, Trino, Flink, Presto and Hive.
Besides the Apache Iceberg project, alternatives for an open table format are Delta Lake (from Databricks) and Apache Hudi. The Snowflake team claims that they evaluated all three options and chose the Apache Iceberg project. Iceberg was open-sourced by Netflix and is backed by Apple and Amazon.
I think Iceberg compatibility delivers a smart compromise for Snowflake. They can retain the benefits of a closed system, while still demonstrating to customers a willingness to support an open data exchange format, so that they don’t feel “locked in”. I think that move will deflate the closed system argument.
Take-aways and Investment Plan
After challenging earnings reports in Q4 and Q1, Snowflake’s Q2 results provided a welcome change. Several of the concerning trends from prior quarters have been mitigated, including revenue growth deceleration, slowing large customer additions and rapid DBNRR erosion. The narrative around Snowflake’s growth potential appears positive again and the stock is exhibiting strong momentum, up about 20% since the earnings announcement.
Looking forward, I think Snowflake’s high growth can continue. Likely not at 80% per year, but probably 50% annually for several more years. The strongest signal that this is possible lies in the spending growth of large customers and their seemingly elastic capacity for more. Many customers are exceeding $5M, $10M and even $20M of annual spend, yet show few signs of slowing down. On the Q2 earnings call, the CFO stated that none of their customers (even the largest) are showing signs of saturation.
When we land, we land small, and they go workload by workload and they just keep moving stuff over to Snowflake that drives that. And it’s a multiyear journey within our customers. And I don’t see any of our customers that are fully saturated.
Snowflake Q2 Earnings Call
During their Investor Day event in June, the CFO shared a slide showing growth in customers with large amounts of spend. While we don’t get this update every quarter, we can infer the rapid growth in spend even at large scale. In the first quarter of this year, Snowflake added 1 customer with $20M+ TTM product revenue, 1 with $10M+, 10 with $5M+ and 22 with $1M+. Those are big spending numbers. In the past, leadership has commented that some of their largest customers are still growing spend faster than Snowflake’s overall revenue growth.
With that kind of growth and considering the addition of 40 more $1M+ customers in Q2, we can start to see how Snowflake might achieve and even surpass their FY2029 product revenue target of $10B. That estimate was based on reaching 1,400 customers spending over $1M in product revenue and raising their average spend to $5.5M. Given the momentum we are seeing now, this appears easily achievable in 6-7 calendar years. Snowflake reached 246 $1M+ customers in Q2, more than doubling over the prior year’s count.
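As a rough sanity check on that math, here is a back-of-the-envelope sketch. The 1,400-customer count, $5.5M average spend and 246 current $1M+ customers come from the figures above; the annual growth-rate scenarios are my own illustrative assumptions, not company guidance.

```python
import math

# Snowflake's FY2029 model: 1,400 customers spending $1M+ in product
# revenue, averaging $5.5M each (figures from their 2021 framework).
target_customers = 1_400
avg_spend = 5.5e6
cohort_revenue = target_customers * avg_spend
print(f"$1M+ cohort revenue: ${cohort_revenue / 1e9:.1f}B")
# The $1M+ cohort alone covers most of the $10B target; the remainder
# would presumably come from customers below the $1M threshold.

# Q2 starting point: 246 customers with $1M+ TTM product revenue.
current = 246

def years_to_target(annual_growth, start=current, target=target_customers):
    """Years of compounding at annual_growth to grow start into target."""
    return math.log(target / start) / math.log(1 + annual_growth)

# Scenarios: current ~100% doubling pace, then more conservative rates.
for g in (1.00, 0.50, 0.30):
    print(f"at {g:.0%} annual $1M+ customer growth: {years_to_target(g):.1f} years")
```

Even if the $1M+ customer count decelerates from today’s doubling pace to 30% annual growth, the 1,400-customer milestone lands in roughly 6-7 years, which is consistent with the “easily achievable” framing.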
Additionally, these numbers were primarily based on Snowflake’s existing business from their cloud data platform. When they put forth this model in 2021, it didn’t account for new product offerings. I think that Snowflake’s product strategy expansion into new areas will provide additional tailwinds to this model. These growth opportunities include the incremental contribution from the Powered By program, new OLTP workloads running on Unistore and data sharing through the Marketplace and Industry Verticals.
Currently, I have a 21% allocation to SNOW in my personal portfolio. This has grown from about 18% previously, as a result of the surge in share price following the Q2 earnings event. I plan to maintain this allocation for now. If Snowflake’s progress over the remainder of 2022 shows evidence of further improvement going into 2023, I may increase the allocation a bit more.
Further Reading
- Peer analyst Muji over at Hhhypergrowth published two recent, detailed updates on Snowflake. First was an overview of their product moves, including the announcements from Summit. Second was coverage of the Q2 earnings results. Both of these provide deep analysis that should supplement my coverage. With one of these, Muji is approaching the record for longest blog post.
- I recommend watching the Snowflake Summit presentations, particularly the 4 keynotes. They are available on-demand, after a light registration.
NOTE: This article does not represent investment advice and is solely the author’s opinion for managing his own investment portfolio. Readers are expected to perform their own due diligence before making investment decisions. Please see the Disclaimer for more detail.
Hi Peter,
Thank you for this extremely well written post. My dissection of an earnings report is not as extensive and well thought out as yours, so it is nice to fill in the gaps where I missed something.
One note which you maybe can shed some light on. If you look at the Global 2000 customer counts announced at Investor Day 2022, they are a bit lower than those reported for, say, 2021Q4, 2022Q1 and 2022Q2. Why the change?
Thank you very much for your time!
Good question – the set of companies included in the Forbes Global 2000 list is updated periodically (similar to index membership changes in the S&P 500). Some move out and others move in. This can cause historical counts of Global 2000 customers to change.
From the earnings call:
Snowflake CFO Mike Scarpelli then jumped in with the details, saying, “The majority of our customers — 80+% — run in AWS, about 18% is Azure, and 2% is GCP.”
How do you see the strong tilt towards AWS translating into Snowflake’s customer and product strategy going forward? Will they continue with their multi-cloud strategy as is, or do you expect they’ll roll out some AWS-specific products or product functionality in the future? If the Data Cloud runs on AWS for 80%+ of consumed credits, maybe even 90%+ for Data Cloud customers, does it make sense to offer all of the same functionality across the three hyperscalers equally, or will they focus on AWS?
The high customer representation on AWS doesn’t surprise me. Of all hyperscaler relationships, Snowflake has the most productive co-sell arrangement with AWS. With Azure, Snowflake is tolerated. With GCP, the relationship is competitive. Snowflake does consider it an important product requirement to maintain a consistent user experience across all three hyperscalers, because some customers have data on multiple clouds. With that said, I could see them at least ensuring that they maintain the broadest set of integrations with AWS services and make the Snowflake experience on AWS as seamless as possible.
Got crushed on Mongo yesterday ugh, need to see your recap of the earnings
It was a tough reaction by the market for sure. I am working on the recap now and will publish something by early next week. There were some positive points and a number of challenges for the quarter. I think the long-term thesis around expanding workloads and enterprise reach is intact. But usage drops from high-growth digital natives and European companies are a headwind against normal expansion. With a consumption model, revenue is hit much more quickly than in other SaaS businesses. Of course, the opposite effect can happen when the climate clears. I think the next couple of quarters will be challenging, but the second half of next year could see re-acceleration. The other issue is the step backwards in operating margin and FCF margin. This added to the negative reaction, which is fair in this environment. I don’t think the model is broken, and re-acceleration of revenue growth should pull margins back up, assuming the rate of spend increase tempers.
Thanks for yet another very informative article.
Is ERP dying slowly, would that benefit Snowflake and other companies you write about, and is it already factored into TAMs and revenue targets?
(I watched “ERP Software: The End of Enterprise Technology As We Know It” on youtube. In another vid, the guy said “cloud is a hoax”. He rolled it back a lot, but I’m not sure how seriously I should take him.)
Hi Peter,
as always great article!
I’m a big fan of Snowflake and have been for years. Yet in the long term, a big risk I see is that the big cloud providers want this business for themselves, or have growing concerns about Snowflake becoming a competitor.
As you stated, Snowflake has its most productive co-sell arrangement with AWS. With Azure, Snowflake is tolerated. GCP sees Snowflake as a competitor.
My understanding is that in the short term they coexist, or in the case of AWS even push Snowflake, as it helps them grow their customer base.
In the long term, 5+ years, what stops the big players from making Snowflake’s life hard (no live data integration, etc.)? They want to grow their own business and keep the data in their systems to offer customers leading-edge analytics solutions.
How much do you see that as a concern or risk? What is your take on it?
Please let me know in case I misunderstood anything or you see flaws in my argument.
Hi Samuel – It’s a fair concern. One could argue that the hyperscalers ultimately want to take all this business for themselves. Obviously, AWS tried it once already with Redshift and couldn’t compete with Snowflake effectively. They could certainly try again.
What we have to keep in mind, though, is that AWS and the other hyperscalers generate a lot of revenue from Snowflake usage. Snowflake pays them for the underlying compute, storage and network that they consume. So, if the hyperscalers tried to compete for Snowflake’s business, they would lose this revenue stream and take on the additional staffing cost to build and maintain their own product. I imagine AWS has run the numbers and decided that it is more profitable to help Snowflake maximize their business (running on AWS).
In the CFO’s opening remarks on the earnings call, he stated:
“We collected a $33 million invoice in Q2 from a customer who had paid its invoices in Q3 in prior years.”
I viewed this as a sort of pull-forward, and if it was backed out of the current quarter’s revenue, QoQ growth would have been 10% instead of 18% (and YoY would have been 71% instead of 83%).
However, as you pointed out, the optimization and workload expansion may be the reason the customer paid earlier. I thought these two paragraphs from your write-up were enlightening:
“In this example, a large Retail customer experienced a platform optimization (labelled as Quarter 2). In the next quarter, the realized product revenue for Snowflake did in fact dip below their prior contracted forecast. However, two quarters after the optimization, their product revenue surpassed the previously anticipated usage level, as a result of moving more workloads to Snowflake. From that point forward, the actual utilization continued to accelerate relative to the previous contracted commitment.
Since the most recent platform optimization was rolled out in Q4, we are now past the two quarter period. It may be that the workload expansion effect started kicking in during Q2, providing an explanation for the revenue outperformance. While Snowflake’s forward revenue guidance is still tempered by the macro environment, this expansion following an optimization may start to provide an increasing tailwind for large customer spend over the next few quarters.”
Thanks for your thoughtful analysis.
Thanks for the feedback. On the $33M invoice, it wasn’t clear to me whether that would affect revenue recognition (based on actual consumption) or just billings. Regardless, we are seeing that Snowflake’s growth metrics can be lumpy, but I think the larger opportunity is intact.
On the delayed impact of workload optimization, yes, I thought it was very interesting that we are now two quarters past the optimization and may now be seeing the benefits of additional workloads.