How Buoyant Drives Open-Source-Led Growth with Linkerd
William Morgan is the CEO of Buoyant, creators of Linkerd, the world's fastest, lightest service mesh.

Buoyant was founded in 2015 with the mission of making the fundamental tools for software reliability and security accessible to every engineer in the world. Buoyant pioneered the service mesh category with Linkerd, but didn’t stop there. The team donated the project to the Cloud Native Computing Foundation (CNCF), where it went on to achieve graduated status.
Challenge: Trying to Answer “Who is Using Linkerd?”
Today, Buoyant’s software powers critical production infrastructure for leading organizations around the world. But, as with most open source projects, it was unclear which organizations were actually using Linkerd.
That question posed an important problem for Buoyant: knowing which companies use Linkerd would let the team better support its community and identify who would most benefit from commercial products like Buoyant Cloud.
Solution: Fast Delivery of Actionable Information
Scarf provided a solution for Buoyant. Rather than continuing to distribute their container images from a domain they don’t control or have observability into, they use Scarf to make their containers available from their own `cr.l5d.io` domain. Scarf then provides best-in-class visibility into all container downloads from this domain, helping Buoyant understand which companies download and use Linkerd and identify which organizations are potential customers of Buoyant Cloud.
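From a user’s perspective, the Scarf-backed registry behaves like any other container registry: pulling a Linkerd image from the `cr.l5d.io` domain is an ordinary pull, with Scarf fronting the domain and recording download metadata. A minimal sketch (the image path and version tag below are illustrative examples, not a specific recommendation):

```shell
# Pull a Linkerd image through Buoyant's own cr.l5d.io domain;
# Scarf fronts this domain and records download metadata
# before serving the image from the backing registry.
# (Image path and tag shown here are illustrative.)
docker pull cr.l5d.io/linkerd/proxy:stable-2.14.10
```

Because the domain belongs to Buoyant, the team retains control of the distribution endpoint while Scarf handles the observability.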
To further enhance their understanding of how companies were discovering and using Linkerd, Buoyant added Scarf’s cookie-free analytics pixel to their website and documentation. This integration gave Buoyant insight into high-intent actions, such as reading about pricing or support, as well as sustained usage of the software. The combination of data points provides a stronger signal than either metric alone, giving Buoyant a more accurate picture of how companies discover and use Linkerd, and of its proliferation within each organization. The resulting insights help the team size up leads and prioritize its bandwidth toward the best opportunities.
Workflows for Marketing and Sales
Buoyant leverages Scarf’s real-time data insights to power its account-based marketing (ABM) efforts. They use Scarf to identify prospects that are already qualified by their open-source usage, which they then target with personalized digital ads promoting webinars, ebooks, and other educational resources about Linkerd.
By combining usage data with Buoyant’s ideal customer profiles, Scarf builds on their ABM efforts by providing contact information (email, LinkedIn, etc.) for the most promising leads at each target company, automatically filtering out unsuitable matches at both the company and contact level. Scarf can incorporate complex customer profile criteria, generate relevant contacts, and accurately prioritize target companies. Moreover, Scarf is robust to evolving criteria, ensuring that Buoyant can adapt their approach as their ideal customer profile changes over time. Buoyant uses this continuous stream of curated contacts to execute optimally timed, targeted outreach campaigns that drive sales.

“Scarf gave us concrete information about which companies were adopting Linkerd, without us having to do anything difficult. We get weekly reports about new companies using Linkerd, and, thanks to Scarf’s new lead generation product, contact information that’s directly relevant to our sales and marketing teams.”
Result
When an open source project has significant traction, its reach and impact are often much greater than GitHub stars or even raw downloads might suggest. As William found: “There are a huge number of companies using Linkerd that we were not aware of!” Scarf helped reinforce that the people Buoyant saw in its community forums, such as the Linkerd Slack, were just the tip of the iceberg.

“Scarf allowed us to get concrete data about the many organizations around the world that are adopting Linkerd that we actually had no idea were out there, and this data has been critical to building a modern, successful business based on open source.”
Throughout the technology industry, companies are increasingly choosing to build their core software products as open source, as the power of a community-centric approach becomes clear. Of course, no choice is without its tradeoffs.
William wrapped it up succinctly: “Building a modern open source business is tricky because you have to balance the desires of the community (who largely don’t want to be sold anything) and the fact that in order to keep funding the open source project, you need to sell something to someone. I’d recommend Scarf to anyone who is tasked with finding that balance today.”