Leading up to Snowflake’s Q4 FY2023 earnings report, investors felt insulated from the risk of a low full-year revenue guide. That concern had been allayed in the Q3 report, when management blunted a reduced Q4 revenue guide with a preliminary estimate that FY2024 would deliver product revenue growth of 47%. During the Q3 earnings call, this had the immediate effect of propping up the lagging after-hours stock price.
When management lowered their actual guidance in the Q4 report on March 1st to reflect product revenue growth of 40% y/y, investors were disappointed. Expectations for higher growth had already been set. Stepping back, though, without the pre-announcement, a 40% guide may have been fine. Combined with an adjusted FCF margin target of 25% for the year, that guide puts Snowflake in the rare position of maintaining performance above the Rule of 60 in a tough spending environment. The target would deliver about $700M in free cash flow for this year.
Compared to other software infrastructure peers, Snowflake enjoys one of the highest valuations. Its market cap is about $44B with a trailing P/S ratio of 21. With optimistic revenue estimates, this ratio comes down quickly. At analysts’ revised revenue target for FY2024 (current calendar year), the forward P/S ratio is 15.5. Looking out two years with the FY2025 revenue estimate for $4.024B in revenue, the forward P/S approaches 11. Currently, analysts are projecting a revenue growth rate of 38.9% for FY2025, which is just a tick below the revised projection for FY2024 (this year) for 40.2% revenue growth.
That linearity forms the crux of the Snowflake investment thesis. If the company can continue expanding into their seemingly uncapped TAM at a durable revenue growth rate around 40%, then the opportunity for upside is reasonable. Combined with an adjusted FCF margin of 25% (or more), a premium valuation multiple appears fair. The price to FCF multiple for FY2024 lands at about 63 – not outlandish considering that Snowflake more than doubled FCF over the past year.
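Since several multiples are quoted above, a quick sketch of the arithmetic may help. All inputs are approximations taken from the figures in the text, not live market data:

```python
# Rough check of the valuation multiples cited above.
# Inputs are approximate figures from the text, not live market data.
market_cap = 44e9        # ~$44B market cap
rev_fy2025 = 4.024e9     # FY2025 analyst revenue estimate
fcf_fy2024 = 700e6       # implied FY2024 FCF (25% margin on ~$2.8B revenue)

print(f"FY2025 forward P/S: {market_cap / rev_fy2025:.1f}")  # ~10.9
print(f"FY2024 price/FCF:   {market_cap / fcf_fy2024:.0f}")  # ~63
```

Small differences versus the quoted multiples reflect rounding in the underlying estimates.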
Whether Snowflake is fairly valued hinges on the durability of its revenue growth. A view that Snowflake is overvalued at this price inherently assumes a marked decrease in revenue growth rates over the next few years. The bullish narrative asserts that growth will remain higher, possibly even hovering around 40% for some time, particularly as headwinds from IT budget cuts normalize. Bears see competition, the law of large numbers or the curtailing of cloud migrations pushing down growth quickly. In either case, one can’t take a position on valuation without first answering the growth question.
A view of Snowflake’s valuation has to be derived from analysis of the market opportunity, Snowflake’s position in it and the likelihood that Snowflake can realize its vision to become a central data repository for the enterprise. If they can, then it is quite possible they amass a few thousand customers that spend $5M-$10M (or more) annually.
On the Q4 earnings call, Snowflake management reiterated their FY2029 product revenue target of $10B. Going forward 5 years from the FY2024 product revenue estimate of $2.897B, the target implies a compound growth rate of 28.1%. Given that we are starting at nearly 40% growth for the first two years, the implied exit growth rate would be in the low 20% range. Or, Snowflake beats this target. When Snowflake first introduced the $10B projection, they assumed a 30% revenue growth rate in the final year.
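The implied compound growth rate is easy to verify from the figures above:

```python
# Verify the implied CAGR from the FY2024 estimate to the FY2029 target.
start = 2.897e9   # FY2024 product revenue estimate
target = 10e9     # FY2029 product revenue target
years = 5

cagr = (target / start) ** (1 / years) - 1
print(f"Implied compound growth rate: {cagr:.1%}")  # 28.1%
```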
In this post, I will examine the factors that contribute to the sustainability of Snowflake’s revenue growth going forward. As is usually the case, a determination of growth durability hinges on the size of the market, the product offering to fit that and the ability to maintain a competitive moat. I will examine these and then reconcile them with Snowflake’s latest earnings results.
Snowflake Product Strategy
Snowflake’s vision is to become the single cloud-based data store for all of an enterprise’s data. This involves combining application and analytical workloads into one Data Cloud. While this sounds far-fetched, it is actually how data storage worked from the beginning. Analytics and transactional workloads were separated in the 1990s for performance reasons, when OLAP became its own data processing branch separate from OLTP. Given that compute and data storage capacity were expensive in fixed on-premise data centers, this separation of concerns made sense.
Elastic cloud-based infrastructure introduced the opportunity to consolidate the two functions again. To be clear, Snowflake is not inventing a new technology to efficiently store and query data to serve analytics and transactional workloads in the same process space. Rather, they are recognizing that the cloud allows them to present the same entry point for data analysts and developers. Data is still duplicated in memory and optimized for the type of query pattern the workload requires. On cloud infrastructure, Snowflake can balance these workloads efficiently, providing customers with simplicity and reduction of infrastructure management overhead.
Assuming that Snowflake can deliver a platform that supports the consolidation of data workloads into one location and programming interface, then several competitive advantages emerge.
- Reduction of Duplication. Separation of application and analytical workloads inherently duplicates data. The standard model of collecting business data in transactional databases and pushing them back to a central data warehouse through a variety of ETL processes becomes less relevant. If a single data store can address most transactional and analytical workloads (or at least those for data applications), then data duplication is greatly reduced. This lowers costs of infrastructure hosting and staff to maintain all the jobs necessary to move data around.
- Security and Governance. By definition, security and control over enterprise data decreases as the number of storage locations increases. Most enterprise SaaS applications (CRM, HRM, ERP, etc.) rely on their own copy of the data. If these applications can be ported to run directly on a single, secure data platform, then the number of copies of data decreases. The same benefit is gained for data sharing use cases. Distributing copies of an enterprise’s sensitive data through FTP and API’s just creates more attack vectors for hackers. Secure data sharing within a single platform ensures that enterprise data stays in one place and isn’t sitting in multiple clouds with different SaaS vendors. Further, data clean rooms allow combination of sensitive data without the need to share a complete copy.
- Infrastructure Footprint Reduction. Consolidating application and analytical workloads onto a single platform reduces the amount of data storage infrastructure that an enterprise needs to maintain. As a simple example, during Snowflake Summit in June 2022, Western Union talked about replacing a Cassandra cluster that provides pricing data to their customer-facing applications. That Cassandra cluster could be eliminated by pointing those applications directly to Snowflake, as well as removing the need to maintain an ETL job to load data into it.
The next phase of Snowflake’s evolution is grounded in the work they are doing to build industry ecosystems around their data platform. If enterprises can consolidate all of their data into a single cloud-based platform, then business application development and data sharing with partners can be greatly simplified and better secured.
While they continue to improve their capabilities in enabling core analytics and data science workloads, future disruption revolves around two primary initiatives:
- Data Collaboration. Enabling secure data sharing between companies, with granular governance and frictionless management layered in. This effort was started in 2018 and has been the catalyst for Snowflake’s growth in data sharing and enabling industry ecosystems for customers. By providing customers with seamless mechanisms to distribute data securely to their industry partners, Snowflake is building strong network effects. These can’t be easily duplicated by competitors who are either on a single cloud (hyperscalers) or offer rudimentary solutions to data sharing that still create copies (Databricks).
- Native App Development. Allow developers to build applications directly over a customer’s data set within the Snowflake environment. This represents the next driver of Snowflake’s growth. The rationale is simple. It is more expensive and less secure for an enterprise application provider to maintain their own copy of an enterprise’s data in their cloud hosting environment. Rather than trusting your CRM (or other SaaS application variant) provider to keep your data secure and pass on any cost efficiencies, why not allow CRM app developers to host their solutions within Snowflake’s environment on top of your enterprise data housed in one location? This is the crux of Snowflake’s strategy to “disrupt” application development. While early, the value proposition makes sense.
Both of these growth strategies provide the benefit of eliminating copying of data to another destination. For data sharing, two companies can exchange data without requiring complex APIs or rudimentary file transfer processes. More importantly, the scope of the data can be limited to just what is needed with a fixed duration. The recipient can’t “keep” a copy of the data after the partnership ends. The same benefit applies to customer data for a CRM app, employee data for HRM and every other SaaS enterprise app derivative.
One Data Cloud
If Snowflake can realize their vision for a single cloud-based enterprise data store (the Data Cloud), they will unlock an enormous market opportunity. To size the opportunity, Snowflake leadership identifies a set of workloads that the data platform can address currently. Those represent the serviceable opportunity with Snowflake’s existing product set and go-to-market focus.
They size the market at $248B for this year, while projecting revenue representing just over 1% of that. The core of the market still encompasses Snowflake’s foundational workloads in analytics, data science and data sharing. They are slowly adding new workload targets, like security analytics, which they estimate as a $10B opportunity. The reasoning for this addition is straightforward – enterprises and third-party app developers can build security scanning solutions on top of Snowflake’s data cloud, taking advantage of the large scale data processing platform that Snowflake has already built.
This new workload in cybersecurity (with more workloads coming) is supported by Snowflake’s Powered By program. For cybersecurity, they already have several partners including Securonix, Hunters, Panther Labs and Lacework. The benefit to Snowflake with workloads like cybersecurity is twofold. First, these application builders generate revenue for Snowflake through their consumption of compute and storage resources. During their Investor Day in 2022, leadership revealed that 9% of their $1M customers were in the Powered By program. Second, having these capabilities available to enterprise customers provides one more reason to consolidate their data footprint onto Snowflake.
Given its growth, I would even speculate that in the future, revenue from Powered By approaches revenue from regular customer use of the Snowflake platform. This is because Powered By program participants are building their entire business on the Snowflake platform. We have already seen several sizable security vendors take this approach. During Snowday in late 2022, the SVP of Product shared that the 4 fastest-growing companies to scale from $1M to $100M in ARR are built on top of Snowflake. This could become a significant revenue driver if we consider that a typical SaaS vendor might spend 10-20% of revenue on their software infrastructure. Not all of that would go to Snowflake, but a good portion of their $10M-$20M+ in annual IT spend could.
While a $248B TAM is one of the largest in software, Snowflake leadership isn’t capping it there. They project a bigger market opportunity if they realize the full vision of the Data Cloud. That market projection is much larger. Leadership represents it as an amorphous cloud at this point.
To accomplish this, the Snowflake team still has a lot of work to do. That work requires both continued product development and sales efforts. It will likely entail a lot of marketing, proof of concepts and hand-holding to convince enterprises to consolidate all of their data operations onto a single platform. Competitors will continue to push their more point-focused solutions.
Snowflake’s competitive moat revolves around several principles:
- Network Effects from Data Sharing. As more customers join Snowflake’s platform, they can seamlessly share data with their partners and customers. Because of the ease of use, improved security and governance offered by Snowflake’s data sharing, participants often encourage their non-Snowflake partners and customers to join. This creates strong network effects as these relationships grow. Snowflake’s enhanced focus of services for specific industry ecosystems only increases the benefits for participants. Competitive data platforms are at a significant disadvantage here. Hyperscalers lack independence. Other data platforms facilitate data sharing through controlled copies of data or API interfaces. Only Snowflake offers an independent, secure and seamless data sharing solution.
- Ecosystem of Data Partners. By establishing a secure environment for sharing enterprise data, Snowflake can allow their customers to leverage data sets from third-party providers to enhance it. Customers can subscribe to a variety of data set publishers within the Snowflake Marketplace. Snowflake has the largest set of third party data providers for customers to leverage to enrich their data and enrichment can be accomplished without moving any data outside of the controlled Snowflake environment.
- Efficiency of a Single Store. As Snowflake expands the number and types of workloads that the platform can support, enterprise customers can consolidate their data footprint. This reduces costs and complexity, by eliminating copies of data for different workloads and the overhead of transforming and moving data between data stores. This will reduce the TCO for the average enterprise and allow them to benefit from economies of scale. Native application development will push this advantage further. Enterprise app developers can distribute their applications on top of the Snowflake platform, eliminating the need to keep a copy of an enterprise’s sensitive data within the SaaS providers’ cloud environment.
Snowflake has long recognized the opportunity to move beyond their core data platform to build a robust set of data services and native applications on top of the customer’s data, keeping everything in one place. This has the benefits of lower cost, better controls and a simpler system architecture. Customers are gravitating towards these advantages, recognizing that Snowflake’s scope across all hyperscalers gives them optionality.
To track their progress in building an ecosystem of data sharing and native applications, Snowflake leadership introduced a new set of “Data Cloud Metrics” within the last year. Prior to this, management would reference these metrics separately, usually as part of the prepared remarks. Snowflake leadership included the same slide for Q4 with updated metrics, reinforcing their intent to report on these as formal KPIs going forward.
To capture Data Sharing activity, Snowflake reports a measure called “stable edges”. Snowflake leadership sets a high bar for considering a data sharing relationship between two companies as actively being used. In order to be considered a stable edge, the two parties must consume 40 or more credits of Snowflake usage each day over a six-week period for the data sharing relationship. I like this measure, as it separates empty collaboration agreements from actual value creation.
In Q4, 23% of total customers had at least one stable edge. This is up from 22% in the prior quarter and 18% a year ago. If we apply these percentages to total customer counts in the period, we get the chart below. While total customers grew by about 31% y/y in Q4, the number of customers with at least one stable edge grew by 68%.
To me, that growth represents an important signal for the value-add of data sharing. If we assume that new customers take at least one year to get around to setting up a stable edge, then almost 30% of customers over a year old have a stable edge in place (stable-edge customers in Q4 FY2023 divided by the total customer count in Q4 FY2022).
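The stable-edge customer growth rate can be derived from the reported penetration percentages and total customer growth. A quick sketch, using the rounded figures from the text (so the result lands slightly below the reported 68%):

```python
# Derive stable-edge customer growth from reported penetration and total growth.
total_growth = 0.31   # total customers grew ~31% y/y in Q4
pct_now = 0.23        # 23% of customers had a stable edge in Q4 FY2023
pct_prior = 0.18      # 18% a year earlier

edge_growth = (1 + total_growth) * pct_now / pct_prior - 1
print(f"Stable-edge customer growth: {edge_growth:.0%}")  # ~67%
```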
We also know that data sharing is a popular feature for Snowflake’s largest customers. In Q4, leadership reported that 65% of $1M customers have at least one stable edge. Further, they shared that of these customers, the average number of stable edges is 6. That is an important metric, showing that Snowflake’s largest customers really value the capabilities of data sharing.
Given that the penetration of stable edges is greater for large customers, it may be that these data sharing relationships encourage higher usage levels. Enterprises may be exchanging data with other parties and then applying more data processing to combine, enrich, enhance and then share that data.
Facilitating these data sharing relationships represents a competitive advantage for Snowflake. They increase customer retention, generate network effects to attract new customers and drive incremental utilization as shared data sets are filtered, cleansed and combined with other third party data. This network of data sharing relationships elevates Snowflake’s value proposition for customers onto a higher plane beyond focusing on tooling for analytics and ML/AI workloads within a single company.
To enable data sharing and enrichment, Snowflake’s Data Marketplace provides users with access to relevant data sets from third-party data providers. Companies can subscribe to these data sets for a fee and then seamlessly combine them with their Snowflake instance through data sharing. This eliminates the overhead of setting up separate integration processes to import, filter and combine this data. Additionally, secure data sharing handles updates automatically. That represents a huge cost savings.
At the end of January 2022 (Q4 FY2022), Snowflake’s Data Marketplace had 1,100 data sets from 240 providers. For Q1 FY2023, listings grew 22% q/q to 1,350 data sets from over 260 providers. For Q2, marketplace listings grew another 13% sequentially to 1,539. In Q3, Snowflake reported another 11% sequential increase in Data Marketplace listings to reach 1,702 total. Finally, in Q4 FY2023, Snowflake reported 8% sequential growth in listings to 1,838. This is 67% growth over the past year.
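The sequence of listing counts reproduces these growth rates, give or take rounding in the quarter-over-quarter figures:

```python
# Reproduce Data Marketplace growth rates from the reported listing counts.
# Q/q rates may differ from the text by a point due to rounding.
listings = [1100, 1350, 1539, 1702, 1838]  # Q4 FY2022 through Q4 FY2023

for prior, current in zip(listings, listings[1:]):
    print(f"{prior} -> {current}: {current / prior - 1:+.0%} q/q")

print(f"Annual growth: {listings[-1] / listings[0] - 1:.0%}")  # 67%
```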
While the Data Marketplace is experiencing strong growth, the Snowflake Powered By program appears to be attracting even more participation. This represents companies that have decided to build their data-driven product or service on top of Snowflake’s platform, that they then sell to their customers. For Q1 FY2023, Snowflake announced there were 425 Powered by Snowflake partners, representing 48% growth over the prior quarter’s count of 285 in Q4. For Q2, Powered By participation took another large jump forward, increasing by 35% q/q to reach 590 registrants. In Q3, Snowflake reported another 20% q/q growth, hitting 709 registrations by quarter’s end. Finally, in Q4 FY2023, they reported 16% sequential growth to reach 822. This represents almost 3x growth over the past year.
That is quite a jump. As these companies grow their businesses, their consumption of Snowflake resources should increase significantly. As part of Investor Day in June, leadership revealed that 9% of their $1M+ customers were in the Powered By program. Snowflake ended Q4 with 330 $1M+ customers, implying that almost 30 Powered By participants were generating more than $1M in annual product revenue.
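Both figures check out arithmetically, using the counts cited above:

```python
# Check the Powered By growth multiple and the $1M+ customer overlap.
registrants_q4_fy2022 = 285
registrants_q4_fy2023 = 822

multiple = registrants_q4_fy2023 / registrants_q4_fy2022
print(f"Year-over-year growth: {multiple:.1f}x")  # ~2.9x, "almost 3x"

million_dollar_customers = 330
powered_by_share = 0.09  # 9% of $1M+ customers, per Investor Day 2022
print(f"Powered By $1M+ customers: ~{million_dollar_customers * powered_by_share:.0f}")  # ~30
```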
This is because Powered By participants inherently generate high utilization of Snowflake. In their case, the foundation of their service infrastructure is running on Snowflake’s platform. This is in contrast to the normal enterprise use cases around analytics and machine learning. As more companies choose to build their business on top of Snowflake, we will likely see this contribution to utilization grow faster. In a sense, the Powered By program elevates Snowflake to spending patterns on par with the hyperscalers (which are usually the largest line item in an IT budget).
Sponsored by Cestrian Capital Research
Cestrian Capital Research provides extensive investor education content, including a free stocks board focused on helping people become better investors, webinars covering market direction and deep dives on individual stocks in order to teach financial and technical analysis.
The Cestrian Tech Select newsletter delivers professional investment research on the technology sector, presented in an easy-to-use, down-to-earth style. Sign-up for the basic newsletter is free, with an option to subscribe for deeper coverage.
Software Stack Investing members can subscribe to the premium version of the newsletter with a 33% discount.
Cestrian Capital Research’s services are a great complement to Software Stack Investing, as they offer investor education and financial analysis that go beyond the scope of this blog. The Tech Select newsletter covers a broad range of technology companies with a deep focus on financial and chart analysis. They recently published an insightful review of Snowflake’s Q4 results with detailed financials and technical analysis.
Q4 Earnings Results
Now that we have reviewed Snowflake’s market opportunity, product strategy and competitive moat, let’s look at how these are manifesting in the most recent earnings results. The critical task for investors is to view the results through the lens of overall macro impact and the current cycle of software infrastructure spend.
It’s no surprise that Snowflake’s revenue generation is being impacted by overall scrutiny of IT budgets and delays in customer expansions to conserve cash. These factors are limiting consumption of Snowflake workloads by customers. Additionally, during 2020-2021, a wave of new digital natives accelerated demand as they grew their data sets and Snowflake usage without worrying too much about costs. In 2022 and 2023, Snowflake has been experiencing a pullback from the digital natives and steady, but slower, growth from more traditional enterprise customers.
While these macro forces are weighing on growth, the risk for Snowflake’s thesis would be if these forces are masking a broader impact from competition, technology shifts or product position. I don’t think they are, as new customer lands are proceeding at the same rate as before. On the Q4 earnings call, Snowflake’s CFO alluded to the two-sided nature of their consumption model. In times of economic pressure, enterprises can quickly reduce their Snowflake spend and realize savings. As that pressure subsides, however, they can just as quickly revert their spending growth. This could actually result in a reacceleration of Snowflake revenue going into 2024.
Revenue
After setting expectations for Q4 product revenue in the range of $535M-$540M, Snowflake delivered a nice beat with $589M for 54% annual growth. This represents a deceleration from 67% growth in Q3 and is up just 6.2% sequentially. The sequential growth slowdown is a concern as Snowflake normally achieves q/q growth above 10% and hit 12.1% sequential growth in Q3.
Looking forward to the new fiscal year, the deceleration continues, but does appear to be reaching a bottom. Product revenue is estimated to fall in the range of $568M-$573M in Q1 for 44%-45% annual growth. At the midpoint, this would represent just 2.7% growth sequentially, which is about the same magnitude as the sequential guide coming out of Q3. A similar beat in Q1 would result in about 6-7% sequential growth and 49% annual growth.
For the full year, Snowflake management had originally set a product revenue growth target of 47% coming out of the Q3 earnings report. As I mentioned, this optimistic view for the next fiscal year softened the impact of a weak Q4 guide. For FY2024, management set a product revenue target of $2.705B, representing annual growth of 39.5% over the actual product revenue of $1.939B delivered in FY2023. This is a deceleration of roughly 30 percentage points from the 70% annual growth in FY2023.
Interestingly, management must be expecting some sequential acceleration going into the second half of the year, as a 6.2% q/q growth rate annualized would be just 27% growth. To reach their 40% target for the year, we should see sequential growth rates pick back up to at least the 8-9% range in the second half of 2023. RPO growth provides some support for this view.
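The relationship between sequential and annual growth rates makes this easy to check:

```python
# Compound a quarterly growth rate to an annual rate, and invert.
def annualize(q_growth: float) -> float:
    """Annual growth implied by four quarters at the same sequential rate."""
    return (1 + q_growth) ** 4 - 1

def required_quarterly(annual_growth: float) -> float:
    """Steady sequential rate needed to hit a given annual growth rate."""
    return (1 + annual_growth) ** 0.25 - 1

print(f"6.2% q/q annualized: {annualize(0.062):.0%}")            # ~27%
print(f"q/q needed for 40%/yr: {required_quarterly(0.40):.1%}")  # ~8.8%
```

The ~8.8% figure is consistent with the 8-9% sequential range needed in the back half of the year.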
Total RPO was $3.661B at end of Q4, up 38.4% annually. It shot up nearly 22% sequentially, but this jump is expected, as sequential RPO growth a year ago was nearly 47%. Looking at current RPO, the annual growth rate is a bit higher at 46.3%. This is due to a larger percent of total RPO expected to close in the next 12 months (55% in Q4 FY2023 versus 52% a year ago). The greater contribution of current RPO is a consequence of enterprises being reluctant to make longer-term commitments in this environment.
Profitability
Snowflake finished Q4 with record adjusted FCF of $215M for a FCF margin of 37%. This compares to $102.1M in Q4 FY2022 for a margin of 27%. Snowflake increased their adjusted FCF by 110% year/year. While adjusted FCF doesn’t take SBC into account, that rate is much higher than the increase in SBC over the same time period. For the full year, the increase in adjusted FCF is even larger: Snowflake generated $520.4M for FY2023, versus $149.8M in the prior year, an increase of nearly 2.5x (247%).
Non-GAAP income from operations hit a respectable $32.8M or 6% operating margin. This compares to $18.0M for a 5% operating margin in Q4 of last year. For the full year of FY 2023, operating margin was 5%, compared to -3% in FY2022.
On a GAAP basis, operating loss was $239M for a GAAP operating margin of -41%. This includes $256.3M of SBC. Stock based compensation increased by 53% in Q4 over the prior year. For the full year, the SBC increase is lower at 32%. For all of FY2023, SBC represented 45.8% of revenue, as compared to 59.2% in FY2022. From my perspective, SBC is a cost of doing business in the software space. It is necessary to attract and retain top talent. As an investor, my primary concerns with SBC are to 1) see it gradually decrease over time and 2) understand the impact on share count. Otherwise, I don’t pay much attention to it, relative to actual cash generation. Stock options don’t cost money to issue.
To get a sense of dilution, Snowflake reported weighted average shares outstanding of 318.7M for the period of FY2023, versus 300.2M for FY2022. This represents an increase of about 6.2%. Besides masking the true state of profitability, this share count increase (dilution) represents the cost of SBC to shareholders. The announced share repurchase program of $2B in the Q4 report should help to counteract this effect. A rough calculation of the 18.5M increase in shares at $135/share would yield a cost of about $2.5B to address. For me, this illustrates the value of the share buyback.
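The rough dilution math works out as follows, using the share counts and approximate price cited above:

```python
# Estimate the shareholder cost of dilution from the change in share count.
shares_fy2023 = 318.7e6   # weighted average shares outstanding, FY2023
shares_fy2022 = 300.2e6   # weighted average shares outstanding, FY2022
share_price = 135.0       # approximate share price used in the text

dilution = shares_fy2023 / shares_fy2022 - 1
cost = (shares_fy2023 - shares_fy2022) * share_price
print(f"Dilution: {dilution:.1%}")            # ~6.2%
print(f"Cost to offset: ${cost / 1e9:.1f}B")  # ~$2.5B
```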
Looking forward, Snowflake expects strong profitability to continue. For the full year of fiscal 2024, they have set a preliminary adjusted FCF target of 25%. For Non-GAAP operating income, they expect an operating margin of 6%. Both of these are roughly in line with the results from FY2023. Using FCF, this represents a preliminary “Rule of 40” value of 65 (40% revenue growth + 25% FCF). If they deliver $2.8B in revenue, Snowflake would generate $700M of FCF.
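In numbers, using the guided figures:

```python
# Rule of 40 check and implied FCF for FY2024, using the guided figures.
revenue_growth = 0.40   # ~40% guided revenue growth
fcf_margin = 0.25       # 25% adjusted FCF margin target
revenue = 2.8e9         # ~$2.8B revenue, per the text

rule_of_40 = (revenue_growth + fcf_margin) * 100
implied_fcf = fcf_margin * revenue
print(f"Rule of 40 score: {rule_of_40:.0f}")      # 65
print(f"Implied FCF: ${implied_fcf / 1e6:.0f}M")  # $700M
```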
Investors will note that the adjusted FCF target of 25% represents the current value of the FY 2029 long term target. Achieving this several years ahead of target represents a significant accomplishment. Management will update the FY2029 goals at the next Investor Day in May and we might see the bar set even higher.
Customer Activity
We are observing an interesting divergence between revenue growth and new customer additions among a number of software infrastructure companies in recent quarterly reports. While annual and sequential revenue growth rates contract, the count of new customer additions remains roughly in line. For companies that report metrics for adoption of product modules, the same linearity applies. This pattern was noticeable with Datadog, MongoDB, Cloudflare, Crowdstrike, SentinelOne and more. Several of these companies even commented on their “record pipelines” coming into 2023.
Yet, realization of revenue from these new customers was muted, as evidenced by revenue growth rates. This may provide a silver lining to an otherwise dismal outlook. One interpretation of the data is that the secular trends of digital transformation and cloud migration are intact. Enterprises are continuing to embrace the need to automate their operations and migrate to digital channels to reach their customers and partners. If they weren’t, then new customer additions would be dropping off a cliff.
Yet, most companies are reporting roughly equal counts of new customer additions across the past four quarters. Datadog added 1,000 new customers in Q4, which was the same as Q3. They also reported consistent growth in adoption of multiple product subscriptions. MongoDB saw 1,700 new customers in Q4, just below the 1,800 – 2,000 they had been adding quarterly. Growth in their Direct Sales customers was right in line. Security companies talked about having a record new customer pipeline. Cloudflare accelerated total customer additions in Q4 and similarly commented on their strong inbound pipeline.
All of these companies reported decelerating revenue growth, attributing the headwind to slower customer expansion. Large deals received more scrutiny or were broken up into smaller commitments. In those businesses where consumption levers could be pulled (Datadog, MongoDB), there were likely even spend reductions, creating a headwind to growth. In all these cases, the slower (or reduced) customer expansion masked the benefit of any new customer contribution.
Snowflake reflected the same trends. The company added 536 new customers in Q4, representing the largest number of additions in the past 2 years and even accelerating sequentially from Q3’s rate. While the annual growth rate continues to decrease, it may be stabilizing in the low 30% range. Annualizing the Q4 addition rate implies 33% growth.
Growth in $1M customers exhibited similar linearity. Snowflake transitioned a record number of customers to $1M in product revenue in Q4.
The dollar-based net revenue retention rate was 158% in Q4. This has been dropping for the past four quarters. However, the CFO pointed out that the Q4 rate is the same as Snowflake experienced at IPO in September 2020. If we look at the same chart for Q4 FY2022, we see that the DBNRR rate had increased from 168% in Q4 FY2021 to the peak in Q4 FY2022. With that background, the deceleration over 2022 was likely more of a return to the mean.
Looking forward, Snowflake’s CFO acknowledged that the DBNRR rate should continue to come down, but will do so slowly and can remain high for some time. He doesn’t see it hitting 130% anytime soon.
Analyst: Even with some of the impacts you’re mentioning, the NRR is still holding strong at 158%. Not lost on us, but any change in how you’re thinking about target levels? Realize there’s variability that you said you expect those to remain above 130% for a long time. Just anything you’re seeing currently that could cause that metric to dip more meaningfully. Or anything you can add there is helpful.
CFO: We’re not forecasting it to dip to that level (130%) anytime soon. But clearly, as the numbers get bigger, it becomes harder. And that number is still going to be a very high number, and it really all depends upon the customers we land today and the ones that we landed over the last two years that will come into our cohort next year. But clearly, if you recall back in 2020, we actually had an acceleration in our net revenue retention rate.
I’m not saying that’s going to happen, but that is possible that that could happen as well, too. You look through 2022, our net revenue retention went up. And that’s the beautiful thing of a consumption model. Just as companies can really control their spend on Snowflake, when they open up their budgets more, they can ramp very quickly existing customers on Snowflake that could drive that up, but we’re not seeing a precipitous drop off longer term in the net revenue retention.
It will potentially come down longer term, but it’s going to still stay very high.
Snowflake Q4 FY2023 Earnings Call
Surprisingly, the CFO even hinted that the DBNRR rate could re-accelerate from 158%. Snowflake’s consumption billing model allows customers to quickly reduce their spend when needed by dialing back on usage, whether the frequency of analytics runs, migration of new workloads or ad hoc queries. That same flexibility on spend reduction can quickly reverse when pressure on enterprise IT budgets abates and companies look for growth opportunities again.
This also speaks to the elasticity of large customer spend. The CFO mentioned that Snowflake’s Global 2000 customers had $1.4M on average in TTM revenue in Q4, up from $1.3M in Q3. At their Investor Day in June 2022, they provided a chart showing the growth in this value going back two years. Over the past two years, the average spend for Global 2000 customers has increased by 2.6x ($540k in Q4 FY21 growing to $1.4M in Q4 FY23).
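Those figures imply a steep compound growth rate in per-customer spend. A quick sketch using the values quoted above:

```python
# Growth in average TTM product revenue per Global 2000 customer,
# using the figures cited in the text ($540k in Q4 FY21 to $1.4M in Q4 FY23).
start_spend = 540_000      # Q4 FY2021 average
end_spend = 1_400_000      # Q4 FY2023 average
years = 2
multiple = end_spend / start_spend                    # ~2.6x over two years
cagr = (end_spend / start_spend) ** (1 / years) - 1   # implied annual growth
print(f"Multiple: {multiple:.1f}x, implied annual growth: {cagr:.0%}")
```

In other words, average Global 2000 spend has been compounding at roughly 60% a year over that window.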
When a Global 2000 customer starts on Snowflake, their spend is very small, in the range of $50k-$100k. This increases over time, as they migrate more and more workloads to Snowflake. While the growth of spend by Global 2000 customers is slower than Snowflake experienced with the digital natives during the height of Covid, it will increase more consistently over a longer duration. Additionally, due to the gravity of data and the accumulating network effects, usage of Snowflake will be very sticky.
Of their $1M+ customers, the spend is $3.7M on average. This was flat q/q, but is continuing to grow. To hit their FY2029 revenue target for $10B, Snowflake leadership expects to reach about 1,400 $1M customers with each spending $5.5M on average. Given their progress over the last two years and considering the headwinds in 2023, I think this target is easily achievable.
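A rough decomposition of that FY2029 target is worth sketching. Note that attributing the remainder to sub-$1M customers is my inference, not company guidance:

```python
# Decomposing the FY2029 $10B product revenue target using management's
# stated ~1,400 $1M+ customers at ~$5.5M average spend.
# The "remainder" attribution to smaller customers is my inference.
target_revenue = 10_000_000_000
million_dollar_customers = 1_400
avg_spend = 5_500_000
from_large_customers = million_dollar_customers * avg_spend   # $7.7B
remainder = target_revenue - from_large_customers             # ~$2.3B
print(f"$1M+ cohort: ${from_large_customers/1e9:.1f}B, remainder: ${remainder/1e9:.1f}B")
```

The $1M+ cohort alone would cover about three-quarters of the target, with the balance presumably coming from the long tail of smaller customers.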
Stock Buyback
As part of the Q4 results, Snowflake management announced a $2B stock repurchase program. This program will be funded by Snowflake's working capital and will last through March 2025 (2 years). Management didn't commit to a timeframe for when purchases will be made. At the time of earnings, SNOW stock was trading at $154, still far below the $254 closing price on its first day of trading and its ATH of $400 in November 2021.
The net benefit for shareholders is to counteract dilution from SBC. I have no issue with this and consider it to be a reasonable use of capital. The stock price is low and management likely considers it a good buy at these levels. Snowflake has about $5B in cash and equivalents. As stated by the CFO on the earnings call and in subsequent analyst events, the $2B buyback commitment represents what Snowflake expects to generate in free cash flow over the next 2 years. They consider $5B to represent sufficient cash on hand, given where they are in the business.
In terms of other uses of the funds, Snowflake is willing to hire where they expect product opportunity. Given the macro environment, they don’t see a need to invest more in growing headcount beyond the 1,000 additional employees that they plan to add in 2023. They have been hiring aggressively up to this point, so a slowdown in hiring should not be disruptive.
Acquisitions
The other area where cash could be applied would be in acquisitions. Snowflake's strategy has been to acquire smaller companies with unique technology and staffing expertise. We could generally categorize their strategy as one of acquihire. They haven't made any acquisitions for the purpose of immediately adding significant new revenue streams. This has kept acquisition costs low, which also minimizes the amount of cash they need to maintain.
As we consider the path forward for Snowflake to realize their vision for the Data Cloud, recent acquisitions will help support it. In January, Snowflake announced the acquisition of the SnowConvert product from Mobilize.Net. SnowConvert provides a toolset that accelerates the migration from popular data warehouse platforms to Snowflake. These include Teradata, Oracle and SQL Server. Much of Snowflake’s new customer workloads involve migrations from these technologies. The SnowConvert toolkit makes these migrations easier and faster to execute.
Additionally, they offer a conversion tool for Spark jobs, which integrates through the Snowflake API and Snowpark. This is targeted directly at Databricks and Amazon EMR. As a Mobilize.Net blog post points out, the big advantage to converting Spark jobs to Snowpark is that the data is immediately available. With traditional Spark jobs, source data has to be loaded into the Spark instance. In Snowpark, the equivalent code runs directly on the data already stored in Snowflake.
This is a big deal, as it brings together analytics and machine learning code written in Java, Scala or Python with the cloud data platform. Effectively, this cuts other Spark implementations out of the loop. These solutions from SnowConvert deliver two benefits. First, they should speed up migrations from legacy data warehouses to the Snowflake Data Cloud. Second, with Snowpark’s new capabilities, it provides an alternative to Spark implementations for running analytics and machine learning jobs. That code in Java, Scala and now Python can be converted to run in Snowpark, bringing the execution closer to the data source in Snowflake.
While SnowConvert was probably the most significant acquisition (announced on their Investor Relations site), they have conducted two other acquisitions in 2023. These were both revealed on the Snowflake blog.
In early January, Snowflake announced their intent to acquire Myst.ai. Myst offers a platform that helps data science teams build and maintain accurate forecasting models for time series data. Time series data consists of a continuous stream of values for a particular measure. It is often seen in observability metrics, but can apply to any entity, like a temperature gauge, an item’s cost, a financial metric, etc. With the emergence of IoT, the number of data sources for time series sequences has increased significantly.
In order to make this data more useful to businesses, time series forecasting represents a branch of machine learning which tries to predict future values for a metric, based on history. This is a popular segment of data science with applications in manufacturing, distribution, health care and finance. As Snowflake focuses on advancing the Data Cloud Platform by adding machine learning extensibility, the Myst team will contribute their capabilities for this in the area of time series forecasting. Snowflake will provide further details around the integration in the future.
In February, Snowflake announced their intent to acquire LeapYear. As investors know, Snowflake has created significant competitive advantage by facilitating data sharing relationships between their customers. Data sharing through Snowflake and application of Clean Rooms provides customers with the confidence that their data set is tightly governed and secure. Through Clean Rooms, they can combine data sets with feeds from partners without revealing the raw data or trusting that the third-party destroys copies of it.
However, highly sensitive data with PII can be risky to share, even through the enhanced security and governance available from Clean Rooms. LeapYear addresses this challenge with Differential Privacy. While it consists of sophisticated mathematics, differential privacy ensures that no individual record can be derived from the output of the analysis. LeapYear is one of the first to offer a commercial-grade privacy-enhancing technology platform that uses differential privacy-based guarantees to enable large-scale, secure sensitive data sharing and monetization.
The technology and team from LeapYear will be integrated into Snowflake’s platform to enhance their data sharing and collaboration capabilities. With differential privacy, even the most sensitive data sets can be combined, aggregated and analyzed. This will have implications for Snowflake industry clouds in healthcare, advertising, finance and others.
Hyperscaler Relationships
Coinciding with the Q4 earnings report, Snowflake announced an expanded relationship with AWS. Snowflake and AWS have always enjoyed the tightest relationship and this renewal reinforces that further, spanning multiple years. Snowflake committed to a larger spend on AWS and both companies are contributing to joint go-to-market efforts.
As part of the press release, Snowflake shared some metrics that highlight the depth of their relationship with AWS. Snowflake and AWS share over 6,000 joint customers, with 84% of Snowflake’s customers running their deployments on AWS. The two companies have increased their co-selling goals by 5x since their original agreement in 2020.
As part of the announcement, the two companies highlighted several areas for their increased collaboration.
- Co-developing Joint Industry Solutions. This was new. Dedicated industry teams from each company will collaborate to build new industry-specific solutions for their biggest customers. These efforts align with the industry Data Clouds that Snowflake has launched, including Financial Services, Media & Advertising, Healthcare & Life Sciences, Retail and Telecom. Collectively, they plan to expand this to new industries in the future. I think this collaboration is the most interesting, as it creates real value-add for enterprise customers in these industries. As more industry participants join that Data Cloud, network effects create a competitive moat for the Snowflake/AWS solution.
- Deepening Product Integrations. The two companies are improving the depth of integration between systems, including machine learning, governance and streaming. Snowflake also takes advantage of AWS’ custom chips, including Graviton instances.
- Increasing Sales Collaboration. Both companies are adding sales resources to drive increased go-to-market collaboration globally. This effort is particularly relevant for large scale cloud migrations from on-premise. AWS presumably wins those migration deals over the other hyperscalers as a consequence of their close relationship with Snowflake, allowing them to offer a more expansive set of services.
- Expanding Marketing Strategies. In addition to co-selling, the two companies are investing in marketing efforts to drive awareness for joint industry solutions and product integrations. This includes having noticeable presence at each company’s customer events.
Of the three hyperscalers, AWS has been positioning themselves as the platform most open to outside software solutions. While they offer self-built products that span all developer needs, they also embrace solutions from third parties, even if these compete with their internal products. The rationale is that they want to be viewed as the most comprehensive offering for enterprises. By not limiting customers to a view of just their internally built products, enterprise customers have the most choice. Of course, Amazon still benefits as third party solutions consume compute and storage resources from AWS.
The big question is what this implies for the other two hyperscalers. Snowflake views GCP and Azure as both partners and competitors. GCP and Azure clearly prefer that enterprise customers adopt their whole suite of products. They will offer bundles with deep discounts in services that they consider strategic. While they allow third party products on their platform, they engage in co-selling relationships selectively.
What this translates into is a strong relationship between Snowflake and AWS, with decreasing collaboration with Azure and almost none with GCP. This explains why 84% of Snowflake’s customers run on AWS. That is a staggering statistic. On the surface, it might be construed as a limiting factor for Snowflake. I view it as the opposite. AWS is the largest hyperscaler by revenue and Snowflake has its deepest relationship with them. AWS sales uses Snowflake as a competitive weapon to win enterprise deals over Azure and GCP.
I think that the co-selling relationship with GCP and Azure should actually improve over time. The hyperscalers generate revenue from usage of Snowflake. Granted, they would prefer to generate more revenue by selling those Snowflake customers BigQuery and Synapse. But, they likely appreciate the seamless sales cycle enjoyed by AWS in collaborating with Snowflake and the natural customer preference for more choice. An option to use Snowflake also counteracts concerns over hyperscaler lock-in.
In terms of the relationship with the cloud vendors, I would say, the new AWS agreement is a great step forward in improving an already really good relationship with AWS, to begin with.
We had a $1.2 billion commit. Now, we have a $2.5 billion commit over the next five years, and it’s much better alignment go-to-market between the two. AWS, we’re still — Azure, we’re still two and a half years into that five-year contract. We will start discussing with Azure trying to get better terms.
I’m not just talking pricing. I’m talking go-to-market working together with one another, and there’s no change in GCP to date. I’m hopeful there could be something in GCP longer term. We will come to the end of our GCP contract in May of 2024, and we’re tracking to fully consume what we committed to with GCP, but we’re clearly running ahead with Azure and AWS, and that’s why we did an early renewal or a new contract with AWS.
Q4 FY2023 Snowflake Earnings Call
I think increased collaboration with Azure and GCP will be driven by Snowflake’s network effects. If Azure and GCP lose deals to AWS because they are trying to push their internal products, they may become more open to collaboration. AWS deliberately plays both sides – pitting internal teams against one another. This allows the market to decide. AWS benefits either way, versus the “all or nothing” approach taken by Azure and GCP.
And the reality is in large accounts AWS partners with us out of the gate because they want to see those customers land in AWS. And history has shown that Snowflake helps those customers land in AWS. And that’s good for AWS because they can sell a lot of other software services around Snowflake.
Morgan Stanley TMT Conference, March 2023
With AWS, investors often wonder whether Redshift competes with Snowflake. It does for smaller customers and simpler implementations. For large customer implementations, AWS generally partners with Snowflake out of the gate, because the large enterprise often already has a preference for Snowflake. Forcing them onto Redshift would risk souring the relationship. Most enterprise cloud migration deals for AWS are conducted in competition (or at least consideration) with the other hyperscalers, Azure and GCP. This incentivizes the AWS account team to present the best overall solution for the customer, rather than trying to shoehorn in their own data warehouse offering. Until Azure and GCP open up to Snowflake as an option, they risk alienating a subset of enterprise customers as well.
Investment Plan (and a word on AI)
It has been an adjustment this year to lower my expectations from the hypergrowth of 2020-2021 for all software infrastructure companies. As low interest rates during Covid fueled an acceleration of investment in cloud infrastructure, it was easy to assume the elevated growth train would continue at the same rate. Of course, 2022 and now 2023 have brought the fastest interest rate increases in decades, putting pressure on IT budgets. Enterprises and start-ups that had few checks on their ballooning cloud infrastructure bills are now scrutinizing them and finding ways to cut costs.
Some analysts are viewing this as the end of growth in cloud migration and digitalization of business processes. Given the rapid slowdown in IT spend, that is an easy conclusion to support. However, I don’t agree with this thesis. I was recently asked by an analyst why I think this. My one word answer was competition. Software enables better service, lower cost and enhanced consumer experiences. As soon as one company in an industry rolls out a new digitally-enabled service, they all have to.
Digital transformation and data are at the heart of software-driven competitive disruption. Currently, all companies in each sector are pausing IT investment and focusing inwardly at the same time. Under those circumstances, it’s easy to dial back competitive awareness. However, competitive pressure still exists and will likely resurface as macro headwinds normalize. I think this will drive a resurgence in demand for software infrastructure.
Reviewing the performance of most software companies in the most recent quarter, we see a similar divergence between short term and long term demand. Software providers are signing up new customers at the same rate as during the Covid rush. Existing customers are adopting new modules at a linear rate as well. The problem is that budget pressures and prior over-provisioning are allowing large existing customers to temper their near term spend and delay new contracts.
I view this as an understandable, but temporary, phenomenon. The slowdown is cyclical, not secular. The same forces that drove investment in software infrastructure prior to Covid still exist. While Covid caused a pull-forward in investment, we haven’t reached a point of saturation. Enterprises still have plenty of work ahead to improve business outcomes, automate processes, serve customers and collaborate with partners.
While I think a reacceleration in software spend would happen regardless, AI-enabled experiences will provide a new catalyst for software and cloud infrastructure. Putting aside all the hype, AI injects new productivity and effectiveness into all flavors of existing software. This significantly increases the ROI for software tools (SaaS of all kinds) and is spurring a flood of new applications that do the same. Just look at this sample list of AI-enabled applications from The Rundown. There are hundreds of new AI-infused digital experiences that offer more efficient and effective outcomes, and the list is constantly growing. Established companies are injecting AI into their software applications as well, in all categories of business processes.
How does this benefit software infrastructure companies? Well, the vast majority of AI-enabled software experiences are still delivered over the Internet or through a mobile app. That means they will need application security (NET), monitoring (DDOG), infrastructure provisioning (HCP), cloud hosting (hyperscalers) and more. ChatGPT’s new plug-ins offering was seeded with links to existing popular consumer Internet services in travel, shopping and dining. As OpenAI’s agent is used by consumers to offload their day-to-day tasks, these online services will experience increased usage.
If all of these new AI-infused applications are improving the ROI for enterprise software, then how will companies pay for them? Another one word answer – productivity. If the average knowledge worker can get more work done with AI-enabled software, then we won’t need as many workers. Or, at least, staffing growth can decrease in proportion to business growth. Those cost savings in staff can be invested in software. Instead of the average department budget for knowledge workers in a Global 2000 company being 90% labor and 10% software, it could shift to 50/50, or more.
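To make that budget shift concrete, here is a small illustration. The $10M department budget is a hypothetical number of my choosing; the 90/10 and 50/50 splits come from the passage above:

```python
# Illustrative department budget shift from 90% labor / 10% software
# toward 50/50, assuming the total budget stays flat.
# The $10M budget is a hypothetical figure for illustration.
budget = 10_000_000
software_before = 0.10 * budget        # $1M on software today
software_after = 0.50 * budget         # $5M under a 50/50 split
multiplier = software_after / software_before
print(f"Software spend grows {multiplier:.0f}x under a flat budget")  # 5x
```

Even a partial shift in that direction implies a multi-fold increase in software spend, funded by labor savings rather than budget growth.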
The largest ingredient by far to drive these new automated experiences will be data. By providing a single, secure, cloud-based data store for all of an enterprise’s data with built-in governance and safe data sharing, Snowflake is well positioned to feed AI models. While Snowpark should enable a large amount of data processing and machine learning workloads by itself, more sophisticated generative AI models require large sources of data.
This is why the system diagram for NVIDIA's new NeMo service running on the DGX Cloud has a big block of input labeled as "Enterprise Data". Snowflake can be the single source of data to feed enterprise AI models, keeping in mind that data security and governance capabilities have been core to Snowflake's platform. While an end run around Snowflake might be possible, those data controls are a big part of Snowflake's value proposition and are necessary to prevent leakage.
As I don’t see changes in Snowflake’s competitive position, market opportunity or customer adoption, I see no reason to reduce the allocation to SNOW in my personal portfolio. While the valuation remains high, I think Snowflake maintains the ability to generate durable revenue growth with improving cash flow margins as they march towards their $10B product revenue target.
Further Reading
- Our partners at Cestrian Capital Research published a review of Snowflake’s Q4 results as well, with detailed financials and technical analysis.
- Peer analyst Muji over at Hhhypergrowth published an update on Snowflake’s Q4 results. Additionally, he offers thorough coverage of past product announcements and customer events. These are very useful supplements to my coverage, usually providing additional insight on product development.
NOTE: This article does not represent investment advice and is solely the author’s opinion for managing his own investment portfolio. Readers are expected to perform their own due diligence before making investment decisions. Please see the Disclaimer for more detail.
Thank you very much Peter for yet another very informative article.
I would be really glad to have your thoughts on the current AI race between hyperscalers. Do you think that AWS is falling behind in that race? If yes, and if that leads to a loss of competitive position for AWS, doesn't this affect SNOW indirectly since 90% of its customers are using AWS? What about MSFT adding more AI capabilities to Synapse and/or GCP to BigQuery, which might make them more competitive against SNOW?
Thanks for the feedback. I think that AWS is keeping up on the AI capabilities that would be leveraged by businesses, versus the consumer-facing AI services emerging out of OpenAI. The risk might be that customers hosting their infrastructure on AWS don’t have access to desired ML/AI capabilities that are only available on one of the other hyperscalers. However, I think it will be difficult to contain a real technology advantage around AI in just one hyperscaler. I think that AI models will proliferate and become somewhat commoditized over time. The real value for enterprises will be in the ability to analyze their data, versus the consumer Internet cases that are emerging. Finally, I wouldn’t be surprised if Snowflake (or their partners) start rapidly adding more AI capabilities that enterprises can use. With that said, I would be more comfortable if Snowflake showed additional traction with hyperscalers outside of AWS. I think that Azure and maybe GCP will have to collaborate eventually, for the same reasons that AWS did.
… and thanks from me!
With Cloudflare’s R2 storage having no egress fee, and the growing partnership between Snowflake and AWS, is Snowflake competing with Cloudflare, especially in serving AI, and if so, what are the likely consequences?
Interesting question. They aren’t competing at this point, but you can see how offering inexpensive data storage and transit could cause both companies to pursue overlapping business use cases with companies (like AI) that have large data sets.
Thanks for the analysis Peter!
Great work as always and an enjoyable read.
Agree that we should see a reacceleration in IT spending once macro uncertainty bottoms out which I see happening in the next 3-6 months.
Hi Peter,
It is really in-depth information of Snowflake and thank you so much for sharing with us.
Question:
If Snowflake is successful in unifying transactional DBs, EDWs, and analytic DBs, how will this impact other transactional DBs like Oracle, MongoDB, and SQL Server, as well as other EDW and analytics databases?
Well, if Snowflake is successful in becoming the single Data Cloud, then the need for other transactional databases and EDW’s will be greatly diminished. I think there will be a middle ground, however. Snowflake will likely become the transactional data store for data applications (those built by the data organization). It will take a long time before Snowflake displaces relational databases backing the general consumer application. For MongoDB, of course, their primary data schema is document-oriented, which is different from Snowflake’s structure. So, I think there will be a separate market for high volume, consumer-facing application databases for a long time.
Hi Peter,
As an investor in Snowflake, how do you view Snowflake's position in the marketplace relative to Databricks?
I have written about the relationship between the two at length in prior posts on Snowflake. For now, I think they can co-exist, as they primarily focus on two different audiences and technical buyers. Over time, they will increasingly overlap, as each has a vision to manage a larger portion of an enterprise’s overall data set. There is an argument for efficiency, security and simplicity to have all enterprise data in one store. I think both companies will be targeting that over time. If that vision is achievable, though, then the TAM will be very large for both to pursue.
Thanks for the reply Peter!
Have a great weekend my friend!