Snowflake reported their Q3 FY2023 earnings results on November 30th. The company beat revenue estimates, delivering 67% annual growth. The Q4 product revenue guide, however, missed expectations by about 2%, with annual growth decelerating to 49%. Initially, the market’s reaction was unfavorable, as the stock dropped by over 10% after hours. At the tail end of the guidance portion of the call, however, management shared a preliminary outlook for next year (FY2024) of 47% revenue growth with 23% FCF margin. While the revenue guide was roughly in line, the implied FCF target was higher than analysts had modeled. SNOW’s stock price immediately began rising and ended the following day up 8%.
This movement underscores the situation facing many software infrastructure providers right now. While investors have become attuned to the impact of the pressured IT spending environment, they are trying to see past the current macro headwinds. Coming off the Covid-inspired spending surge, macro effects are obscuring how much of the current deceleration is simply post-Covid normalization. Investors need to discern which companies would have maintained elevated growth for the next couple of years regardless of the macro environment. Identifying the companies with truly durable revenue and FCF growth could drive investment outperformance.
By effectively setting their baseline for next year’s revenue growth rate in line with the Q4 guide and increasing the FCF target, Snowflake is signaling that their growth rates are sustainable. Since a big part of the Snowflake valuation thesis hinges on the durability of revenue growth towards the $10B target by FY2029 (six years out), this guidance signals that the target is achievable. Additionally, it can be accomplished with a significant increase in free cash flow, quickly approaching their 25% long-term target.
Given this guidance, it’s not surprising that the market quickly overlooked the soft Q4 estimate. Concerns over the Q4 estimate were further eased by the CFO’s assertion that it was more conservative than normal, given the upcoming holiday period and an expectation for another platform optimization to temporarily step down revenue recognition.
The reaction was also buoyed by the market’s realization that Snowflake has additional leverage on cash flow generation. With this preliminary guidance for next year, total adjusted FCF could very well double next fiscal year, and from there it could continue to increase at a high rate. If one considers the EV to FCF valuation multiple to be fair at that point (about 60, based on current FCF estimates combined with 47% revenue growth), then future stock appreciation should roughly track FCF growth. That could provide a nice tailwind for the stock over the next few years.
That hypothesis provides the crux of the investment thesis for Snowflake. Durable growth will be supported not just by expansion of existing customers for traditional data warehouse upgrades, but the potential for new initiatives around industry data sharing, a unified data store and the Powered By program. These layer usage over the existing growth engine and backfill the normal decay from the law of large numbers.
With that set-up, let’s look at what Snowflake delivered and then how this ties into their evolving platform strategy. For more background on the Snowflake investment thesis, interested readers can refer to my prior coverage. Additionally, our partners over at Cestrian Capital Research published a review that includes more financial and technical analysis.
Growth Metrics
Following Q2’s huge beats, the results for Q3 were tempered a bit. Snowflake delivered $557M in total revenue for 66.6% y/y growth and 12.0% sequentially. This beat analysts’ estimates of $539M by about 3.3%. Product revenue was $522M, up 67% y/y as well. This beat management’s prior guidance for a range of $500M-$505M by about $20M.
Looking forward to Q4, the picture is a bit more muddled. Snowflake management guided to $535M-$540M in product revenue, representing annual growth of 49%-50% and about 2.8% sequentially. Analysts had expected $549.2M, implying a miss of 2%. Whether or not we agree with the analyst estimate, management’s guide represents a substantial slowdown. Even with the typical beat of about 4-5 points of annual growth, Q4 product revenue growth would drop from 67% in Q3 to about 54%. Further, the sequential revenue increase implied for Q4 is much lower than their typical guide of 7%-12% (or more). The preliminary guide for Q3 product revenue issued in Q2 called for sequential growth of 7.7%, and the actual result was 12.0% sequential growth.
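For readers who want to check the arithmetic, here is a quick back-of-envelope sketch of the guidance math in Python. The inputs are the figures quoted above; the 4.5-point "typical beat" is simply the midpoint of the 4-5 point range, an assumption rather than a forecast.

```python
# Back-of-envelope on the Q4 guide, using the figures quoted above.
q3_product_rev = 522e6                    # Q3 actual product revenue
q4_guide_mid = 537.5e6                    # midpoint of the $535M-$540M guide
q4_guide_growth = 0.495                   # midpoint of the 49%-50% y/y range
q4_prior_year = q4_guide_mid / (1 + q4_guide_growth)

# Layer on a "typical" beat of ~4.5 points of annual growth (an assumption):
beat_growth = q4_guide_growth + 0.045
q4_implied = q4_prior_year * (1 + beat_growth)

print(f"Implied Q4 product revenue: ${q4_implied / 1e6:.0f}M")              # ~$554M
print(f"Implied y/y growth: {beat_growth:.0%}")                             # ~54%
print(f"Implied sequential growth: {q4_implied / q3_product_rev - 1:.1%}")  # ~6%
```

Even with a typical beat, sequential growth of roughly 6% would still sit below Snowflake's historical pattern, which is why the guide read as a slowdown.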
This “miss” on Q4 revenue estimates was the primary driver of the stock’s initial after-hours sell-off of around 12%. This was reinforced by the modest full-year product revenue raise of $9M-$14M, as compared to the Q3 beat of about $20M. Investors were likely projecting the deceleration forward into FY2024 (calendar year 2023), which could bring full-year revenue growth into the low 40% range. Prior to the Q3 earnings report, analysts had modeled revenue growth of 51% for FY2024, so growth in the low 40% range would be disappointing.
As mentioned earlier, Snowflake management countered this conclusion by pre-announcing next year’s targets. They couched this as being part of an initial planning cycle to guide hiring targets. This likely assumes some conservatism and would be subject to revision when Snowflake reports Q4 results in early March of 2023.
While we are currently in our planning cycle, we would like to discuss next year’s growth outlook based on the consumption we are seeing today. For the full fiscal year 2024, we expect product revenue growth of approximately 47% and non-GAAP adjusted free cash flow margin of 23% and continued expansion of operating margin. This outlook includes a slowdown in hiring, which we evaluate on a monthly basis, but assumes adding over 1,000 net new employees. Our long-term opportunity remains strong, and we look forward to executing.
Snowflake Q3 FY2023 Earnings Call
For FY2024, they set a product revenue growth target of 47%, just 2 points below the low end of the Q4 guidance range. This linearity was construed as bullish, implying that revenue growth would level out next year, or even reaccelerate in the second half if the dip continues into Q1-Q2. Additionally, investors normally assume these targets represent a floor and that the actual growth rate could be higher.
In addition to the revenue target, management shared an adjusted FCF margin estimate of 23%, which was also higher than expected. This follows two quarters with 11%-12% FCF margins and comes surprisingly close to their FY2029 (six years out) target of 25% adjusted FCF margin. Management also mentioned continued expansion of operating margin without providing a number. They have been delivering single-digit operating margins up to now, finishing Q3 at about 8%.
Over the past six weeks, we have seen weaker consumption in APJ and the SMB segment. However, recent consumption patterns give us confidence that our largest and most strategic customers will continue to grow. With the holidays approaching and uncertainty with how customers will operate, we believe taking a more conservative approach is responsible as we resource plan for Q4 and fiscal 2024.
Snowflake Q3 FY2023 Earnings Call
Additionally, in response to a question, management explained that the Q4 guide was “conservative” and accounted for the typical holiday slowdown. As investors will recall, Snowflake leadership has shared in the past that about 30% of their consumption is driven by ad hoc queries from humans, versus recurring, automated jobs. This human-driven consumption presumably drops off during holiday periods. It was the source of some softness reported in last year’s Q4 results.
Other growth indicators in the Q3 report were favorable. At the end of Q3, total RPO was $3.0B, up 66% y/y and 10.6% sequentially. Leadership projects they will recognize 55% of total RPO in the next 12 months (defined as Current RPO). This represents $1.65B and is up 68% y/y. The fact that current RPO growth is higher than revenue growth represents a positive indicator and underscores the FY2024 revenue projection. Additionally, RPO growth in Q4 is typically outsized, so we could see another nice bump here. Last year, the sequential growth from Q3 to Q4 in RPO was 47%.
Management offered some additional commentary about the demand environment. They attributed the softness to weaker SMB consumption, currency headwinds in APJ and continued weakness in particular customer segments. As with the prior quarter, customers in the technology segment are the most impacted and have limited their consumption. The CFO mentioned that three technology companies in particular decreased spend from Q2 to Q3.
Those three customers, actually from when we guided at the beginning of the year, they took out $41.8 million of revenue from us for the full year. These were three customers who have gone through some challenging times, but also did some heavy optimization. And by the way, all these three customers, we’ve been telling them for a while, we can help you save money, but they didn’t — they were growing so fast, they didn’t pay attention. And now, customers are paying attention.
Snowflake CFO, Barclays Global Technology Conference, November 2022
This has been offset by strong growth in other customer categories, like financial services and advertising/media. Some of the impact on the technology segment is an overhang from Covid-driven overspending. These companies were growing quickly in 2021 and didn’t bother to optimize spend to reduce expenses. That downward pressure should normalize as these delayed optimizations are completed.
Sponsored by Cestrian Capital Research
Cestrian Capital Research provides extensive investor education content, including a free stocks board focused on helping people become better investors, webinars covering market direction and deep dives on individual stocks in order to teach financial and technical analysis.
The Cestrian Tech Select newsletter delivers professional investment research on the technology sector, presented in an easy-to-use, down-to-earth style. Sign-up for the basic newsletter is free, with an option to subscribe for deeper coverage.
Software Stack Investing members can subscribe to the premium version of the newsletter with a 33% discount.
Cestrian Capital Research’s services are a great complement to Software Stack Investing, as they offer investor education and financial analysis that go beyond the scope of this blog. The Tech Select newsletter covers a broad range of technology companies with a deep focus on financial and chart analysis.
Profitability Measures
Snowflake continued its march towards greater operating profitability in Q3. Non-GAAP product gross margin improved to 75.3%, which was above the Q2 result of 75.1% and year ago value of 74.6%. Improvements in gross margin are attributed to a few factors. As Snowflake scales their usage, they can negotiate more favorable terms with the hyperscalers through volume discounts. This reduces their hosting cost for the underlying storage and compute. Additionally, they can make product improvements that either reduce the amount of storage or compute needed for a typical customer workload. This can be accomplished through data compression, more efficient representation of data or query optimizations. As part of their long term plan, they have set a FY2029 target to reach 78% Non-GAAP product gross margin.
Non-GAAP operating margin similarly improved. Coming out of the Q2 report, management had set an operating margin target of 2%. Snowflake actually delivered operating income of $43.4M, which translated into 7.8% operating margin. This compares to $17.5M or 3.5% operating margin in Q2. In the year ago quarter, operating income was $8.5M for 2.5% operating margin. Snowflake improved operating margin by 520 bps y/y and grew the total amount by 5x.
This operating income outperformance translated into a beat on the EPS target. For Non-GAAP EPS, Snowflake delivered $0.11, beating the consensus estimate of $0.05 by $0.06. This compares to Non-GAAP EPS of $0.03 a year ago.
Looking forward to Q4, leadership set a Non-GAAP operating margin target of 1%, below the 2% target that had been set for Q3. Given that they guided Q4 product revenue about 2% below the consensus range, this tick down in the operating margin target of just 1 point is understandable. For the full year, they guided for 3% operating margin, a raise of 1 point over the guidance set with the Q2 results.
Switching to free cash flow, Snowflake also showed improvement. Adjusted free cash flow was $65.0M in Q3 for a FCF margin of 11.7%. This is an improvement over Q2’s $53.8M and 10.8% margin. A year ago, adjusted free cash flow was $21.5M, for a margin of 6.4%. Snowflake grew their adjusted free cash flow by about 3x year/year.
With this momentum, management raised the full year adjusted FCF target from 17% set in Q2 to 21% this quarter. That represents a pretty significant jump. Further, as mentioned earlier, their preliminary guidance for FY2024 (next year) is for 23% adjusted FCF margin. This is quickly approaching their long-term target for FY2029 of 25% Non-GAAP adjusted FCF margin.
Given this rapid improvement, we may see management raise the FY2029 target at some point. For comparison, CrowdStrike delivered a FCF margin of 30% in its Q3 results. Their infrastructure configuration and usage patterns are similar to Snowflake’s, with a heavy data processing component running over infrastructure rented from the hyperscalers.
Hiring
Snowflake accelerated their pace of hiring in Q3. While other companies are moderating hiring this year, Snowflake leadership has been pressing forward. Overall headcount increased 11.1% sequentially in Q3, which accelerated over Q2’s growth of 9.5%. On an annual basis, headcount increased by almost 56% in Q3 over the year ago period.
Hiring growth rates increased in R&D and S&M while they decreased sequentially for G&A and Cost of Revenue. I view that as a positive indicator of reaching economies of scale for those departments, allowing Snowflake to focus incremental hiring in the areas that will impact revenue growth and product expansion the most.
These investments demonstrate Snowflake leadership’s confidence in the opportunity going forward. Since sales reps can take 9-12 months to ramp, leadership anticipates plenty of demand to chase next year. They are also keeping the product development funnel flowing with R&D headcount up a staggering 75% y/y. I view the increased investment in R&D as an indication that Snowflake plans to continue expanding the Data Cloud product feature set. This will provide more capabilities to drive consumption for Snowflake’s largest enterprise customers.
Looking to next year, though, management has signaled that hiring will moderate. They plan to add about 1,500 new employees this fiscal year, reaching roughly 6,000 in total. For FY2024, they have projected adding about 1,000 more employees, a growth rate of about 17%. Given the macro environment, this preliminary target is prudent. Snowflake has been hiring aggressively for the past year and can afford to slow down a bit. They did add that they would “evaluate on a monthly basis”, implying that if the macro situation improved, they might accelerate hiring.
As mentioned previously, Snowflake set a preliminary revenue growth target for next fiscal year of 47%. They plan to achieve this even with slower hiring in sales. During the recent Barclays Global Technology Conference, their CFO discussed how selling into their largest customers is very efficient and requires less sales support staff. This is because customers often drive consumption expansion themselves with little support required from Snowflake. With DBNRR over 160%, the majority of Snowflake’s revenue is coming from existing customers.
The net benefit of these sales efficiencies shows up in operational leverage. In addition to their revenue growth target for next year, Snowflake set a FCF margin target of 23%. As they keep the revenue growth rate at almost three times the headcount growth rate, the bottom line improvement will manifest quickly.
While the macro environment is impacting IT budgets, Snowflake is well positioned to continue investing in the market opportunity, even if they have to rely more heavily on their $4B in cash. Start-ups and private companies that had intended to disrupt Snowflake will be set back, as they are pressured by their VC investors to preserve their runway and reduce their cash burn rate. Any pause on growth investment will help cull the competitive field. After the macro headwinds clear, Snowflake will be further ahead. This reasoning applies to the hyperscalers as well, who have been pausing or significantly pulling back on hiring.
Customer Activity
In spite of macro conditions, Snowflake performed well on landing new customers in Q3. The incremental addition of total customers was roughly in line with Q2 and higher than the prior four quarters’ average of 454 additions. This translated into an annual growth rate of 34.4% and 7.1% sequentially. Combined with Snowflake’s high DBNRR, a total customer growth rate over 30% should keep overall revenue growth above 50%.
Large customer growth is also driving the outperformance in revenue. For Q3, Snowflake added a record 41 new customers with trailing 12-month revenue over $1M, bringing the total to 287. This represented 16.7% sequential growth and about 94% annual growth. Q3 delivered the largest sequential increase in $1M customers to date.
It’s also worth noting that these $1M customers represent just 4% of total customers. While not all customers can be expected to ever reach $1M in spend, a lot of them could. This implies that Snowflake has plenty of room to expand their $1M customer count. As part of their long term projection for $10B in product revenue by FY 2029, they are assuming about 1,400 $1M customers with an average spend of $5.5M. Given the trajectory this year (287 customers at $1M growing nearly 100% y/y), they appear to be well on-track to hit that long term goal.
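As a quick sanity check on that long-term model, the sketch below multiplies out the stated assumptions. The split between $1M+ customers and everyone else is my inference, not a company disclosure.

```python
# Rough arithmetic behind Snowflake's FY2029 product revenue model.
target_revenue = 10e9         # FY2029 product revenue target
large_customers = 1_400       # assumed $1M+ customers in the model
avg_large_spend = 5.5e6       # assumed average spend per $1M+ customer

from_large = large_customers * avg_large_spend
print(f"From $1M+ customers: ${from_large / 1e9:.1f}B")   # ~$7.7B
print(f"Implied from all other customers: ${(target_revenue - from_large) / 1e9:.1f}B")  # ~$2.3B
```

In other words, the $1M+ cohort carries roughly three-quarters of the target, which is why large customer additions are the metric to watch.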
This growth in large customer spend is reflected in Snowflake’s Dollar-Based Net Revenue Retention Rate (DBNRR). Loosely, this captures the y/y increase in spend for existing customers, net of churn. For Q3, DBNRR was 165%, ticking down from last quarter’s 171%. A best-of-breed DBNRR is considered to be anything over 130%.
Leadership has indicated that they expect DBNRR to decrease over time. Based on commentary from last year’s Q4 results, the CFO expects NRR to stay above 150% for FY2023 (this calendar year), and added that he predicts “it will remain well above 130% for a very long time.” Combined with continued growth in customer additions, these two factors should allow Snowflake to maintain 50% or higher revenue growth for a few more years.
As another measure of large customers, Snowflake tracks their penetration into the Global 2000. In Q3, Snowflake added a record 28 new G2K customers, double the number added in Q2. The sequential growth rate accelerated to 5.4%, while the annual rate of increase was a respectable 18%.
On the earnings call, the CFO stated that they expect these accounts to become their largest customers over time. Once landed, Global 2000 customers increase their spend rapidly. The CFO stated that the average trailing 12-month product revenue from G2K customers is $1.3M, up from $1.2M last quarter. While this sounds impressive, the average spend for the largest Snowflake customers (>$1M a year) is $3.7M, implying that G2K customers could nearly triple their spend over time.
In fact, some of Snowflake’s largest customers are still in hypergrowth mode. The CFO shared that quarter over quarter, six of their top 10 customers grew faster than the company overall. A top 10 customer is easily spending over $10M a year on Snowflake and would have increased their spend by more than 12.0% sequentially in Q3.
Snowflake leadership has discussed in the past how many large customers are not in the G2K, usually digital natives with a large data footprint. During their Investor Day presentation in June, the CFO pointed out that only 45% of their customers spending over $1M annually in product revenue are in the Global 2000.
During the Q2 earnings call, the CFO referred to an example in which “a Global 2000 technology company is now a top 10 product revenue customer less than two years after signing their initial deal.” A top 10 customer generates over $10M in product revenue, based on the metric from Q1 that they had 10 customers spending more than $10M. This underscores how fast customers can become large, where the referenced technology company went from $0 to $10M+ in annual spend in less than two years.
During Investor Day, the CFO provided an example with a large Telecom customer of how this happens. The customer started using Snowflake as part of a migration from their on-premises data warehouse. The first year was spent kicking off the migration, allowing them to really ramp up workloads in year two. These workloads encompassed data warehouse activity, and also began addressing other application workloads associated with a mixture of OLAP and OLTP (transactional) activity. The end result was that compute credit consumption increased by 6x in the second year.
As part of guidance for Q4, the CFO mentioned the impact of another platform optimization. As investors will recall from the Q4 FY2022 earnings report, Snowflake management revealed that they rolled out platform efficiencies that would deliver a 10-20% improvement in workload performance on average. In a simple example, if a customer could run 100 queries for every Snowflake consumption credit before the optimization, now they can run 110 to 120 per credit.
Management then put forward a thesis that platform optimizations eventually drive greater utilization over time, as large customers find the unit economics more compelling and move additional workloads to Snowflake. During the Investor Day presentation in June, the CFO provided an example of this.
In this example, a large Retail customer experienced a platform optimization (labelled as Quarter 2). In the next quarter, the realized product revenue for Snowflake did in fact dip below their prior contracted forecast. However, two quarters after the optimization, their product revenue surpassed the previously anticipated usage level, as a result of moving more workloads to Snowflake. From that point forward, the actual utilization continued to accelerate relative to the previous contracted commitment.
Data Ecosystem
A big part of the growth narrative for Snowflake is grounded in the work they are doing to build industry ecosystems around their data platform. This primarily focuses on two initiatives:
- Data Collaboration. Enabling secure data sharing between companies, layering on governance and frictionless management.
- Native App Development. Allow developers to build applications directly over a customer’s data set within the Snowflake environment.
Both of these provide the benefit of eliminating the copying of data to another destination. For data sharing, two companies can exchange data without requiring complex APIs or more rudimentary file transfer processes. More importantly, the scope of the data can be limited to just what is needed, for a fixed duration. The recipient can’t “keep” a copy of the data after the partnership ends. A minimal sketch of what this looks like in practice is below.
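Here is a hypothetical sketch of Snowflake secure data sharing, using Snowpark for Python to issue the SQL. All account, database and view names are placeholders, and a real session requires credentials. The point to notice is that the provider grants scoped, revocable read access, and no copy of the data ever leaves the platform.

```python
from snowflake.snowpark import Session

# Hypothetical provider-side setup: share one curated view with a partner.
provider = Session.builder.configs({
    "account": "<provider_account>",
    "user": "<user>",
    "password": "<password>",
    "role": "ACCOUNTADMIN",
}).create()

for stmt in [
    "CREATE SHARE IF NOT EXISTS partner_share",
    "GRANT USAGE ON DATABASE sales_db TO SHARE partner_share",
    "GRANT USAGE ON SCHEMA sales_db.public TO SHARE partner_share",
    # Scope access to exactly the data the partner needs:
    "GRANT SELECT ON VIEW sales_db.public.partner_pricing TO SHARE partner_share",
    "ALTER SHARE partner_share ADD ACCOUNTS = <partner_account>",
]:
    provider.sql(stmt).collect()

# Consumer side: the share mounts as a read-only database, queried in place.
#   CREATE DATABASE partner_data FROM SHARE <provider_account>.partner_share;
#   SELECT * FROM partner_data.public.partner_pricing;
# Dropping the share revokes access immediately; the recipient never held a copy.
```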
While I think these initiatives represent the future opportunity for Snowflake’s Data Cloud, analysts and competitors often focus on comparisons of the core data analytics engine. I think this is a dated view of where competitive advantage will lie in the future. Data processing will become a commodity. Claims of “x” times faster performance will become irrelevant.
Snowflake’s competitive moat is being built on the capabilities that enable data sharing ecosystems and centralization of data. If data processing for core analytics is largely the same across providers, then competitive differentiation will be determined by network effects. More data sharing partners, more available data sets for enrichment and more applications that address data in one place. No more data copies proliferating everywhere, introducing inefficiency, lack of governance and security risks.
A useful analogy might be found in smartphones. While one could debate the usability of iPhone versus Android based on their core feature sets, the value of each is primarily driven by the ecosystem of apps available on its platform. A new smartphone would have difficulty breaking into the market at this point, as it would require thousands of apps to be built for its platform just to reach parity. Snowflake is establishing a similar head start, building an ecosystem around their core data platform through rich data sources, programmability and native apps.
I think this future-looking perspective was underscored by several announcements at AWS re:Invent. Amazon introduced a couple of new capabilities that look strikingly similar to Snowflake’s recent moves in this area. While on the surface this could be construed as negative, with another large player mirroring Snowflake’s capabilities, I view it as quite bullish. AWS has a bird’s-eye view of customer activity and requests across their platform. If they decide to introduce a new product line, it is generally because they perceive demand for it.
The worst situation for Snowflake would be if competitors largely ignored the potential for data sharing, eliminating data transfer between data stores and enabling app building directly on the data lake. This validation by AWS (and other competitors) shows that Snowflake is on the right track.
Specifically, AWS announced the following new capabilities at re:Invent:
- Clean Rooms. In preview, AWS Clean Rooms enables controlled data collaboration between multiple partners within the AWS environment. From the web site, “AWS Clean Rooms helps customers and their partners to more easily and securely match, analyze, and collaborate on their combined datasets–without sharing or revealing underlying data.” This obviously sounds a lot like Snowflake’s Data Sharing and Clean Rooms capabilities.
- Security Lake. Also in preview, Amazon Security Lake automatically centralizes security data from cloud, on-premises and custom sources into a purpose-built data lake stored in a user’s account. Security Lake has adopted the Open Cybersecurity Schema Framework (OCSF), an open standard. The service can normalize and combine security data from AWS and a broad range of enterprise security data sources. Amazon’s Security Lake offering is similar to Snowflake’s new Cybersecurity Workload, which provides a data lake to serve as the central data source for security analytics. Customers can create their own threat detection data processing jobs within the Snowflake environment or utilize one of the many partner apps made natively available in the Snowflake Marketplace.
- Elimination of ETL between data stores (Zero ETL). Amazon sees a future in which data engineers don’t need to worry about setting up ETL jobs to move data from transactional databases into their data warehouse. Similarly, they want to make that data available for machine learning jobs. As a first step, they introduced direct integrations between Redshift and Aurora and Redshift and Apache Spark driven systems like Sagemaker and EMR. This is similar to Snowflake’s strategy with Unistore and Snowpipe to enable all data processing to converge on a single platform. Amazon’s focus here is more about integration, whereas Unistore moves the transaction engine (AWS’ Aurora in this case) into the Snowflake Data Platform.
Do I think that AWS’ moves will disrupt Snowflake’s position in these areas? No – not significantly. First, for data sharing, AWS would be limited to customers on AWS. The same could be said for Snowflake, except that they span multiple hyperscalers. Enterprises would prefer the optionality of exchanging data with partners across cloud vendors.
Second, Snowflake has been working on these capabilities for a while. They have a significant first mover advantage, both in terms of product capabilities and existing data sharing relationships. Data sharing benefits from network effects. Much like a social network, as more enterprises participate, the value attributed to the network increases. This is why consumers generally join Facebook or Twitter, as opposed to a knock-off. The same argument can be made for Snowflake, with over 1,500 customers (and growing) with an established data sharing relationship. Snowflake’s largest customers adopt data sharing at an even higher rate.
Finally, AWS’ track record with look-alike products has been mixed. They have taken a similar product strategy with several open source projects, offering hosting for a stripped down version of the open source project, competing with the maintainers. For the most part, these efforts haven’t yielded much, with customers still choosing to use the cloud product offered by the open source project maintainers (MongoDB, Elastic, Confluent, etc.).
While not open source, a similar argument can be made around Snowflake’s capabilities in these areas. Discerning enterprise buyers will want the optionality, breadth of features and long-term commitment that Snowflake provides. Amazon will throttle resources for new products that experience limited adoption, or even shut down programs eventually.
Snowflake has long recognized the opportunity to move beyond their core data platform to build a robust set of data services and native applications on top of the customer’s data, keeping everything in one place. This has the benefits of lower cost, better controls and a simpler system architecture. Customers are gravitating towards these advantages, recognizing that Snowflake’s scope across all hyperscalers gives them optionality.
To track their progress in building an ecosystem of data sharing and native applications, Snowflake leadership introduced a new slide to the Investor Presentation which summarizes their “Data Cloud Metrics”. Prior to this, management would reference these metrics separately, usually in the prepared remarks. Q2 FY2023 represented the first quarter in which all three of these metrics driving growth of the larger Data Cloud opportunity were included together in a formal slide.
Snowflake leadership included the same slide for Q3 with updated metrics. That reinforces their intent to report on these as formal KPIs going forward.
To capture Data Sharing activity, Snowflake reports a measure called “stable edges”. Snowflake leadership sets a high bar for considering a data sharing relationship between two companies as actively being used. In order to be considered a stable edge, the two parties must consume 40 or more credits of Snowflake usage each day over a 6 week period for the data sharing relationship. I like this measure, as it separates empty collaboration agreements from actual value creation.
In Q3, 22% of total customers had at least one stable edge, up from 21% last quarter and 17% a year ago. If we apply these percentages to total customer counts in each period, we can estimate the underlying growth (a rough reconstruction follows below). While customers grew by about 34% y/y in Q3, the number of customers with at least one stable edge grew by 74%.
To me, that growth represents an important signal for the value-add of data sharing. If we assume that new customers take at least one year to get around to setting up a stable edge, then almost 30% of customers more than a year old have a stable edge in place (customers with a stable edge divided by the total customer count in Q3 FY2022).
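Here is a rough reconstruction of that math. The total customer count is my estimate (287 $1M customers are described above as about 4% of the total, implying roughly 7,200), so treat the outputs as approximations.

```python
# Approximate stable-edge math, using an estimated total customer count.
customers_fy23 = 7_200                    # estimate: 287 / 4% from the $1M metric
customers_fy22 = customers_fy23 / 1.344   # implied by 34.4% y/y customer growth

edge_customers_fy23 = 0.22 * customers_fy23   # 22% with at least one stable edge
edge_customers_fy22 = 0.17 * customers_fy22   # 17% a year ago

print(f"Stable-edge customer growth: {edge_customers_fy23 / edge_customers_fy22 - 1:.0%}")  # ~74%
print(f"Edge customers / year-ago total: {edge_customers_fy23 / customers_fy22:.0%}")       # ~30%
```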
We also know that data sharing is a popular feature for Snowflake’s largest customers. For Q3, leadership reported that 66% of $1M customers have at least one stable edge. This is up from 65% in Q2 and 63% at the end of Q1. Given that the penetration of stable edges is greater for large customers, it may be that these data sharing relationships encourage higher usage levels. Enterprises may be exchanging data with other parties and then applying more data processing to combine, enrich, enhance and then share that data.
Facilitating these data sharing relationships represents a competitive advantage for Snowflake, in my view. They increase customer retention, generate network effects to attract new customers and drive incremental utilization as shared data sets are filtered, cleansed and combined with other third party data. This network of data sharing relationships elevates Snowflake’s value proposition for customers onto a higher plane beyond focusing on tooling for analytics and ML/AI workloads within a single company.
To enable data sharing and enrichment, Snowflake’s Data Marketplace provides users with access to relevant data sets from third-party data providers. Companies can subscribe to these data sets for a fee and then seamlessly combine them with their Snowflake instance through data sharing. This eliminates the overhead of setting up separate integration processes to import, filter and combine this data. Additionally, secure data sharing handles updates automatically. That represents a huge cost savings.
At the end of January (Q4 FY2022), Snowflake had 1,100 data sets from 240 providers. For Q1 FY2023, listings grew 22% q/q to 1,350 data sets from over 260 providers. For Q2, marketplace listings grew another 13% sequentially to 1,539. In Q3, Snowflake reported another 11% sequential increase in Data Marketplace listings, reaching 1,702 in total. I couldn’t find a count for Q3 of FY2022, but I imagine the listing count has almost doubled in a year.
If the Data Marketplace is seeing strong growth, the Snowflake Powered By program appears to be garnering even more participation. This represents companies that have decided to build their data-driven product or service on top of Snowflake’s platform, that they then sell to their customers. For Q1, Snowflake announced there were 425 Powered by Snowflake partners, representing 48% growth over the prior quarter’s count of 285. For Q2, Powered By participation took another large jump forward, increasing by 35% q/q to reach 590 registrants. In Q3, Snowflake reported another 20% q/q growth, hitting 709 registrations by quarter’s end.
That is quite a jump. As these companies grow their businesses, their consumption of Snowflake resources should increase significantly. As part of Investor Day in June, leadership revealed that 9% of their $1M+ customers were in the Powered By program. They ended Q3 with 287 customers of this size, implying that about 26 Powered By participants were generating more than $1M in annual product revenue.
At their Snowday event in early November, Snowflake’s SVP of Product shared that the four fastest-growing companies in history to go from $1M to $100M in ARR are built on Snowflake. Powered By provides participants with tools and resources to build, market and operate applications in the Data Cloud. Some example participants include:
- Blue Yonder: Supply chain solution provider
- Panther: Cybersecurity
- Piano: Media and advertising
- Simon Data: Marketing technology
This is because Powered By participants inherently generate high utilization of Snowflake: the foundation of their service infrastructure runs on Snowflake’s platform. This is in contrast to the normal enterprise use cases around analytics and machine learning. As more companies choose to build their business on top of Snowflake, we will likely see this contribution to utilization grow faster. In a sense, the Powered By program elevates Snowflake to spending patterns on par with the hyperscalers (which are usually the largest line item in an IT budget).
Product Commentary
Snowflake’s product work in the quarter focused on continuing the major new capabilities introduced at their Summit conference in June. Incremental improvements were capped off during their Snowday event on November 7th in San Francisco. This was part of Snowflake’s Data Cloud World Tour. Similar to prior events, the Snowflake team used the occasion to showcase new product innovations and to interact with customers.
At a high level, Snowflake’s product roadmap has centered on these themes this year:
- Single Data Store. Snowflake’s grand vision is for enterprises to centralize as much data as possible in the Data Cloud. This eliminates the need to copy data out to other databases to serve as the data source for application workloads. To facilitate this, Snowflake announced Unistore at the Summit conference, which adds transactional (OLTP) support to the Snowflake platform. I don’t expect every consumer application to replace its existing transactional database with Snowflake. Rather, this capability starts the process of replacing databases typically provisioned to handle high-concurrency, low-latency workloads that serve summary data output from a Snowflake job. During Summit in June, Snowflake’s customer Western Union talked about replacing a Cassandra cluster that provides pricing data to their customer-facing applications. At Snowday in November, the SVP of Product shared that “demand is through the roof” for Unistore. Customers include Novartis, Tecton, IQVIA, UiPath, Adobe and Tapestry.
- Enable Application Workloads. The introduction of the Native Application Framework represents a first step in allowing developers to build new data-rich application services in an environment that has direct access to their customer’s data. Typical SaaS applications bring customer data into their own hosting environment. This is problematic from a security point of view and generates redundant data storage cost, which is passed on to customers. With the Native Application Framework, a new breed of SaaS developers can move their applications to work on the customer data in Snowflake directly, without needing to store a separate copy of it. I think this could represent a disruptive development for typical enterprise SaaS application delivery.
During Snowday, the team announced that they have been moving forward on the integration of Streamlit into the core platform. This will allow developers to build data visualization applications on top of their data within Snowflake, maintaining security and governance by avoiding movement of the data to another platform. They expect to have a private preview of the Streamlit integration available at the beginning of 2023.
Streamlit serves as the interaction engine for the vast majority of our Data Science & Machine Learning models today, actively transforming how our teams build, deploy, and collaborate on powerful applications with other stakeholders across the business. With Snowflake’s Streamlit integration, we can go from data to ML-insights all within the Snowflake ecosystem, where our data is already present, making it easier and more secure for us to create impactful applications to further mitigate the negative impact of flight disruptions, provide more predictability to our operational planning teams, and more customer personalization to give our customers the best possible experience.
Sai Ravuru, GM Data Science & Analytics, JetBlue
- Programmability. Besides providing the runtime environment, Snowflake is bringing the tools to support real programmability. Snowpark for Python makes Python’s rich ecosystem of open-source packages and libraries accessible in the Data Cloud. With a highly secure Python sandbox, Snowpark for Python runs on the same Snowflake compute infrastructure as Snowflake pipelines and applications written in other languages. Snowflake has struck a relationship with Anaconda, which extends access to more Python packages within the Snowpark development environment. During Snowday, Snowflake leadership shared that demand for Snowpark for Python has exploded, with customer adoption increasing by 6x. They now have hundreds of customers making use of the capability, including Charter Communications, EDF, NerdWallet, Northern Trust and Sophos. The benefit these companies cite is the ability to build applications with access to their data directly on Snowflake (versus porting it to another data source). A minimal Snowpark sketch follows the customer quote below.
Snowpark for Python has created new opportunities and use cases for our team to build and deploy secure and compliant data pipelines on Snowflake, so we can more efficiently provide our customers with the tools needed to handle every aspect of their finance journey. Snowflake’s continued investments in Python allow us the flexibility to code in our programming language of choice, and accelerate the speed of innovation for our end users.
Sathish Balakrishnan, Director of Data Engineering, NerdWallet
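For readers unfamiliar with Snowpark, here is a minimal, hypothetical sketch of the pattern these customers describe. The connection parameters and table names are placeholders; the key idea is that the DataFrame operations compile to SQL and execute inside Snowflake, so the data never leaves the platform.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

# Placeholder credentials; a real session needs a Snowflake account.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "SALES_DB",
    "schema": "PUBLIC",
}).create()

# Build a transformation in Python; it is pushed down and run in Snowflake.
orders = session.table("ORDERS")
summary = (
    orders.filter(col("ORDER_DATE") >= "2022-01-01")
          .group_by("REGION")
          .agg(avg("ORDER_TOTAL").alias("AVG_ORDER"))
)

# Results are materialized inside Snowflake; nothing is exported elsewhere.
summary.write.mode("overwrite").save_as_table("REGION_SUMMARY")
```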
- Powered By Program. As I discussed, I think that Snowflake’s Powered By program has a lot of potential. Given its growth, I would even speculate that in the future, revenue from Powered By approaches revenue from regular customer use of the Snowflake platform. This is because Powered By program participants are building their entire business on the Snowflake platform. We have already seen several sizable security vendors take this approach. As mentioned previously, during Snowday the SVP of Product shared that the 4 fastest growing companies from $1M to $100M in ARR are built on top of Snowflake. This could become a significant revenue driver if we consider that a typical SaaS vendor might spend 10-20% of revenue on their software infrastructure. For a vendor at $100M in ARR, that implies $10M-$20M+ in annual infrastructure spend; not all of it would go to Snowflake, but a good portion could.
- Vertical Solutions. Snowflake is focusing on building out industry verticals and workload verticals. In each case, the Snowflake Data Cloud is being leveraged to deliver capabilities that provide specific benefits to that vertical. The first workload vertical introduced was Cybersecurity, and leadership has indicated that two more are coming next year. During Snowday, leadership also highlighted the launch of four separate industry-focused Data Clouds, with over 120 partners supporting them. The four industry-specific Data Clouds cover Financial Services, Media, Healthcare and Life Sciences, and Retail. Within each Data Cloud, Snowflake collaborated with partners to provide interoperability, secure data sharing and industry best practices. These industry-focused Data Clouds serve to attract companies within the same industry to share data in a controlled environment.
Underpinning these business opportunities are the capabilities of the Snowflake Data Cloud. While competitors criticize Snowflake for being a “closed” system, I think this works to Snowflake’s advantage. They deliver a working product that requires little specialization to consume or support. They can also move quickly, without being tied to open standards and interoperability between interfaces. If they need to tweak the storage model to improve performance, those decisions can be managed internally.
With all that said, I think Snowflake put the “closed” system argument to rest by adding support for Iceberg Tables. These provide customers with the ability to move data in and out of Snowflake using the open Apache Iceberg format. Iceberg is a popular open table format for analytics, bringing the performance, reliability and simplicity of SQL tables to big data. It is compatible with other data processing engines like Spark, Trino, Flink, Presto and Hive.
Besides the Apache Iceberg project, alternatives for an open table format are Delta Lake (from Databricks) and Apache Hudi. The Snowflake team claims that they evaluated all three options and chose Apache Iceberg. Iceberg was open-sourced by Netflix and is backed by Apple and Amazon.
I think Iceberg compatibility delivers a smart compromise for Snowflake. They can retain the benefits of a closed system, while still demonstrating to customers a willingness to support an open data exchange format, so that they don’t feel “locked in”. During Snowday, Snowflake’s SVP of Product also shared an update on Iceberg Tables. More than 50 customers have participated in the private preview, including Dropbox, Indeed and DriveTime.
Take-aways and Investment Plan
Snowflake delivered another strong set of results in Q3, when accounting for the macro backdrop and pressure on IT spend. While this resulted in projected revenue growth deceleration for Q4, the leadership team provided an optimistic preliminary guide of 47% annual product revenue growth for the next fiscal year, nearly linear to Q4’s estimate of 49% growth. Since that guide likely embeds some conservatism, Snowflake’s actual revenue growth for FY2024 could land in the 50% range.
Revenue growth for FY2023 is currently projected to be 68%-69%. Analysts now have FY2023 total revenue estimated at $2.053B, representing 68% growth. For FY2024, they have updated their models with an estimate for $3.034B in revenue or 48% growth. If we apply the 23% FCF margin target, it implies just about $700M in FCF for FY2024. Snowflake’s current EV is $42B, representing a forward EV/FCF multiple of 60. This is high, but not outlandish for a company growing revenue at 48%. For comparison, ServiceNow (NOW) has a trailing EV/FCF multiple of 41 for 21% annual growth. And, of course, this is a multiple on FCF, not sales.
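Spelling out that arithmetic (all inputs are the estimates quoted above, so the outputs are approximate):

```python
# Forward EV/FCF math from the estimates above.
fy24_revenue = 3.034e9    # analyst FY2024 revenue estimate
fcf_margin = 0.23         # management's preliminary FY2024 adjusted FCF margin
ev = 42e9                 # approximate enterprise value at time of writing

fy24_fcf = fy24_revenue * fcf_margin
print(f"Implied FY2024 FCF: ${fy24_fcf / 1e9:.2f}B")   # ~$0.70B
print(f"Forward EV/FCF: {ev / fy24_fcf:.0f}")          # ~60
```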
Looking forward, I think Snowflake’s high growth can continue. Likely not at the nearly 70% this year, but closer to 50% annually for longer than the market expects. This is supported by the relatively stable customer growth rate above 30% and DBNRR over 150%. Even as DBNRR descends towards 130%, high annual growth rates above 40% a year are achievable.
We are still very much in the growth phase and the early innings of migrations with customers… And data is becoming one of the core assets of most companies today. Whether you are a media advertising company or financial institution, getting real-time data and insights into your business is pretty important piece to companies. And we are getting data. There is one large oil and gas company, I know, we sold them $3 million worth of Snowflake… And they themselves have said they see $30 million in value.
Snowflake CFO, Barclays Global Technology Conference, November 2022
The strongest signal that durable growth is possible lies in the apparent absence of a spending ceiling for large customers. Many customers are exceeding $5M, $10M and even $20M of annual spend, and yet show few signs of slowing down. Similar to prior quarters, the CFO casually mentioned that 6 of their top 10 customers grew their spend in Q3 at a faster rate than the company overall.
During the Q3 earnings call, the leadership team shared some staggering anecdotal evidence for the potential size of Snowflake’s largest customers. The fact that Snowflake is now periodically citing $100M customers just underscores the magnitude of the opportunity and brings the FY2029 $10B revenue target into scope.
Yeah. I’ll just say on the G2K, there’s no reason why a G2K can’t spend well north of $10 million a year on Snowflake, and that’s a conservative number. But it will take time to get there. This is really a marathon.
It’s not a sprint for our customers, and it will take time. And we are starting to see very large customer relationships. We did sign a $100 million contract in the quarter, again, with an existing customer on renewal. And we will have $100 million plus contracts this quarter in Q4 with customers that I know are running out of credits and want to lock in for long term.
Snowflake Q3 FY2023 Earnings Call
With that kind of growth and considering the addition of 41 more $1M+ customers and 28 of the Global 2000 in Q3, we can start to see how Snowflake might achieve and even surpass their FY2029 product revenue target of $10B. That estimate was based on reaching 1,400 customers spending over $1M in product revenue and raising their average spend to $5.5M. Given the momentum we are seeing now, this appears achievable in 6-7 calendar years. Snowflake reached 287 $1M+ customers in Q3, almost doubling over the prior year’s count.
Additionally, these numbers were primarily based on Snowflake’s existing business from their cloud data platform. When they put forth this model in 2021, it didn’t account for new product offerings. I think that Snowflake’s product strategy expansion into new areas will provide additional tailwinds to this model. These growth opportunities include the incremental contribution from the Powered By program, new OLTP workloads running on Unistore and data sharing through the Marketplace and Industry Verticals.
Currently, SNOW is one of the larger positions in my personal portfolio. I plan to maintain this allocation. With strong free cash generation, more than doubling year/year, Snowflake’s valuation based on a multiple of FCF could be considered fair for its growth rate. Assuming revenue growth continues at an elevated rate and FCF margin ticks up incrementally, we could see SNOW’s stock price start to appreciate proportionally to annual growth.
Further Reading
- As highlighted above, our partners at Cestrian Capital Research published a review of Snowflake’s Q3 results that includes more financial and technical analysis.
- Peer analyst Muji over at Hhhypergrowth published some updates on Snowflake. First was coverage of the Q3 earnings results. Additionally, he reviewed their BUILD event and Snowday. These are very useful supplements to my coverage, usually providing additional insight on product development.
- I recommend watching the Snowflake Summit presentations, particularly the 4 keynotes. They are available on-demand, after a light registration. So too are the presentations from Snowday.
NOTE: This article does not represent investment advice and is solely the author’s opinion for managing his own investment portfolio. Readers are expected to perform their own due diligence before making investment decisions. Please see the Disclaimer for more detail.
Excellent information as usual. Thank you for providing great insights. Peter
Hey Peter, great content as usual. I’m trying to wrap my head around the Native Application Framework and the concept of moving applications to the data, rather than the data to the applications. Could you give me an example of how this might work?
I’d imagine data needs to be in a specific format / schema for an application to properly use it. If this is the case will customers be following some steps the application will provide to put their data into a usable format?
An example of how one of these applications may work on Snowflake vs historically by copying the data to another location would be helpful!
Thanks,
Calum
Hi Calum – good question. If the data were not in an optimized format for read traffic, it could be transformed and stored in Unistore. A good example of connecting applications directly to Snowflake and eliminating an external data source was provided by Western Union at the Summit conference. Currently, they pull their pricing data out of Snowflake and insert it into a Cassandra cluster to serve as the data source for their business applications that need pricing data at a relatively low latency and high concurrency response. Their plan is to instead store this data in Unistore and then connect their business applications directly to Snowflake, eliminating the Cassandra cluster. Other examples can be found with the Powered By program participants. They are essentially running their security analytics, observability and other types of applications directly on the Snowflake instance. This is different from the current approach by security and observability vendors (S, CRWD, DDOG) to collect and maintain their own copies of a customer’s log data.
Hi Peter;
Always enjoy your posts; they are full of information and insight as always. I am very comfortable that SNOW can achieve its $10b revenue target by FY29. Most likely they will surpass that target. But let’s assume the baseline of $10b revenue and $2.5b free cash flow in FY2029, while growing revenue at 30%. What P/S multiple do you think SNOW would reasonably command? Or what P/FCF multiple would it deserve?
Thank you very much.
Larry
Thanks, Larry. As a baseline, I think NOW provides a reasonable comparable for P/FCF. It enjoys a multiple of 44 on P/FCF with an annual growth rate of 21%. I think SNOW would command a higher multiple with an exit growth rate of 30%, so 50 is probably reasonable. That provides about a 2.5x increase from here.
Thanks Peter for your reply. 2.5x of today’s stock price in 6 years is a decent investment but hardly a great one. To be a great investment, SNOW would need to exceed its $10b revenue target and 25% FCF margin by a good amount. I think it can and will as more of its large customers come to scale. Additionally, SNOW’s targets assume only the products currently in market. I cannot believe it won’t introduce one or more revenue-producing new products in the 6-year span.
Having said that, do you think Cloudflare, with its much smaller market cap and $5b revenue target in 5 years, would be a better investment?
Thanks.
Larry
Hi Larry – I agree that Snowflake will likely introduce new products/revenue streams that could bolster the $10B revenue target. I like the idea that the target is achievable with current products as a baseline. There is certainly an upside scenario.
Regarding Cloudflare, I believe an equal opportunity exists. They are leaning into more product categories than Snowflake. If those all hit, then Cloudflare could outperform Snowflake over the next 5 years. However, their current penetration into the large growth categories (edge compute, storage, Zero Trust) is small. So, the risk is that their offerings don’t resonate in one or more of those markets. Snowflake, on the other hand, has clear leader status in its category. So, I think the risk/reward between the two is balanced. I own both in roughly equal allocations.
Peter, thank you for the excellent breakdown of the Snowflake business and their future pathway. As a non-techie this is priceless.
Thanks for the feedback – glad you found it helpful.
Thanks for the informative review. Are ChatGPT or PaLM likely to have any significant effect on Snowflake or other stocks you cover?
Hi Peter, excellent write-up, always enjoy it.
Can you explain in simple terms the relation between DBNRR (150%) and customer growth (30%) for the assumption of durable revenue growth of 50%+? Just want to understand the math for future reference.
Thanks again!
Thanks, Isaac. There isn’t a set formula for deriving revenue growth from the combination of DBNRR and new customer additions. Most software infrastructure companies (DDOG, SNOW, CRWD) derive the majority of their new revenue in a quarter from existing customer expansion. If a company’s DBNRR is 150% and churn is minimal, they would grow revenue by 50% y/y without adding any new customers. Revenue from new customers then contributes a smaller percentage of recognized revenue and rounds out the rest. For some companies, like Datadog, that is about 15% (85% of new revenue in a quarter from existing customers, 15% from new). For Snowflake, we don’t have a metric, but could assume a similar range. So, I usually apply a 3/4 weighting to DBNRR for a rough growth value and then round out the remainder with total customer growth. For Snowflake, I think the high customer expansion rate will be the primary driver of my 50% durable revenue growth target.
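To make that rough weighting concrete, here is a small sketch using the numbers in your question. The 3/4 expansion weighting is the loose heuristic described above, not a formal model.

```python
# Rough revenue growth heuristic: weight DBNRR-driven expansion at ~3/4
# and let new customer growth round out the remainder.
dbnrr = 1.50              # 150% net revenue retention
customer_growth = 0.30    # ~30% y/y total customer growth
expansion_weight = 0.75   # loose assumption: ~3/4 of new revenue from expansion

growth = expansion_weight * (dbnrr - 1) + (1 - expansion_weight) * customer_growth
print(f"Rough implied revenue growth: {growth:.0%}")   # ~45%

# With Q3's actual DBNRR of 165%, the same heuristic lands closer to ~56%,
# which is why ~50% durable growth looks achievable even as DBNRR drifts down.
```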