
SaladCast Episode 12: Our Product Roadmap With Daniel Sarfati

Salad Technologies

Welcome to SaladCast! In this podcast series, we introduce you to Salad Chefs from all corners of the Infinite Kitchen. We hope you’ll join us as we get to know members of our community, indie developers, and teammates from our very own Salad staff.


In this episode, the SaladCast staff series continues with Bob at the helm! Now that trustless mechanisms have given Salad the chops to take on virtually any computing task, Product Director Daniel Sarfati crashes the ‘Cast to lay out our roadmap to diversified workloads. Learn how users in the Salad Chefs Discord have helped us develop the Salad app, grok tech at the ephemeral edge, and hear all the things the Kitchen can do that data centers just can’t.

Watch the full episode at the SaladChefs YouTube channel.


Episode Highlights

Highlights content has been edited and slightly reordered for clarity.

You’ve been here for 90% of Salad’s history. What first attracted you to Salad?

Before coming to Salad, I had worked in the aerospace industry and done some experimental “side hustles” in gaming. Our teams were tackling complex data problems that required coordination between distributed servers, like plotting aerial traffic routes or linking social gaming networks. The question was always, “How do you distribute these massive data processing workloads?” Taking that experience and applying it to consumer PCs was really exciting.

That’s a problem that still isn’t solved at the enterprise level. And as the need for compute grows, there’s no way to solve it just by building more data centers. There has to be a huge shift in the way we think about and use computers to handle the scale of the data that needs processing.

We’re obviously focused on solving that supply problem. How do you shepherd a novel value proposition like ours to market?

Salad had this unique way of “hacking” the two-sided market problem. Other people who talk about this focus on getting data scientists and getting users online. We just focused on the latter. We always knew we could hire data scientists down the line, but we didn’t need them to grow the market. Our goal is to make Salad easy enough for anybody to use—which is why we abstract away detailed stats like your hashrate. We’re building an intelligent platform that saves users all that configuration. In a way, we’re doing twice the work of a traditional cryptominer.

Is there a conflict between making something easy and making it individually optimized?

There’s been a natural evolution of the product. People who have been around for a while will know that we used to have flat earning rates. As long as Salad was Chopping on your machine, you were getting X dollars per hour, no matter what. So that’s come a long way. The big improvements we’re making in the near future will get us closer to that ideal state where Salad is truly optimized for every user’s machine to get them the most value. Today, we’re taking these cryptomining workloads and figuring out the best way to run them. When we start to move into different workloads in the future, we’ll be doing the same thing—having a deep understanding of the problem and figuring out what to change on individual users’ PCs.

What was the thinking behind fixed rates?

When we first started, we had only a few dozen users on the app at once. Back then, we were very focused on refining the user experience. Fixed earning rates were a business risk for us that gave us the leeway to adjust things in the background without changing the user experience. We saw it as an opportunity to move quickly, get something in front of users, grow the community, and create those mechanisms for getting feedback. That work took us to where we are today—verifying every single coin that’s mined and attributing every single share to a user.

How has the community helped to grow the Kitchen?

Having access to 35,000 people in Discord is incredible. Our community is so engaged that when I go into the lobby channel on our server and start typing, I see dozens of messages saying, “Daniel is typing! What’s Daniel going to say?” They’re so engaged with us that whenever we have questions or prototypes, we can have twenty users work through a design and help us find the stumbling blocks before it even gets into the hands of the engineering team. That’s huge for us, and that’s why our focus has always been building our community and enhancing the user experience.

All the while, blockchain workloads have helped us build, fund, and maintain the network.

Right. Crypto is such a powerful product because it is completely trustless. Any machine can come onto the network and start earning money from day one. If they try to act fraudulently, the network will automatically reject that action. It was extremely beneficial to be able to leverage those existing systems because we now have a baseline for establishing the new requirements of diversified workloads. With SaladCloud, we can support much more than blockchain mining—though it could be the first workload for a new user. You come in and start mining on Salad. After a certain number of hours, we’ll have benchmark data on that PC, which gives us room to say, “This PC is perfectly suited for AI training” or for some other workload.
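
To make that idea concrete, here is a purely hypothetical sketch of how benchmark data from a node might map to candidate workloads. The NodeBenchmark fields, thresholds, and suggest_workloads helper are illustrative assumptions for this post, not SaladCloud’s actual matching logic.

```python
# Hypothetical sketch: map a node's benchmark results to workload families.
# Field names and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class NodeBenchmark:
    gpu_vram_gb: float    # measured GPU memory
    hashrate_mhs: float   # observed mining hashrate
    uptime_hours: float   # how long the node has been reliably online

def suggest_workloads(bench: NodeBenchmark) -> list[str]:
    # Mining is the baseline workload every new node can run from day one.
    suggestions = ["blockchain mining"]
    if bench.uptime_hours >= 100 and bench.gpu_vram_gb >= 8:
        suggestions.append("AI training")
    if bench.uptime_hours >= 24:
        suggestions.append("rendering / batch compute")
    return suggestions

print(suggest_workloads(NodeBenchmark(gpu_vram_gb=12, hashrate_mhs=60, uptime_hours=150)))
```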

So trustless mechanisms were actually key to building out our trust rating system. Where do we go from there?

With that solid foundation in place, we’re going to unveil dynamic pricing models pretty soon so that, as the markets fluctuate, we’ll change the miner you’re running and optimize it for your individual PC like never before. Those unlocks will enable new workloads with completely different use cases. Some of them can be run 24/7. If I decide to run my machine in the middle of the day or at night, there will be workloads available. But some of the workloads we’re considering are dependent on the customer on the cloud side—data science and development tasks. So those require not only different usage patterns but also different ways of monetizing your hardware. Your PC isn’t just a GPU. It’s got a hard drive, it’s got bandwidth—even just having a unique, residential IP address is something you could share to enable proxy streaming.

We also have the critical node volume to perform load testing or penetration testing. That’s something big companies do all the time. A huge provider like IBM wants to constantly check its cloud for vulnerabilities before someone commits a malicious attack. If all of your test attacks come from known sources, they’re very easy to block off, which is exactly why a realistic simulation needs distributed origins. Salad could be hundreds of thousands of machines distributed across a state, a country, a region, or a hemisphere, simulating an attack to ensure your system’s integrity.

BOB: Simulated attacks, of course.

Yeah. Simulated testing of that kind has become a standard part of software development, especially in today’s world, where anything can be exploited and everything can be hacked. Most companies do it on a regular basis: “I just pushed out a new build of my app. Now I want to see what happens when I get 10,000 requests per second.” That’s just one example of the new use cases that emerge within the Salad network. Users can start to participate in white-hat testing and provide increased security for different websites or applications. Salad is so unique that we’re in a totally different game from a traditional cloud provider. They can’t provide that sort of infrastructure. At this stage, we’re finding new use cases that can only work on Salad.
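
As a rough illustration of that “10,000 requests per second” scenario, here is a minimal Python sketch of a single node firing a burst of concurrent requests at a test endpoint and reporting how many succeeded. The TARGET_URL, request count, and use of aiohttp are assumptions for the example; this isn’t Salad’s actual tooling, and in a distributed test many nodes would each run something like this so the traffic originates from many residential IPs.

```python
# Minimal single-node load-test sketch (illustrative only).
import asyncio
import aiohttp

TARGET_URL = "https://example.com/health"  # hypothetical endpoint under test
REQUESTS = 1000                            # requests in this node's burst

async def hit(session: aiohttp.ClientSession) -> bool:
    """Send one GET request and report whether it came back OK."""
    try:
        async with session.get(TARGET_URL, timeout=aiohttp.ClientTimeout(total=5)) as resp:
            return resp.status == 200
    except (aiohttp.ClientError, asyncio.TimeoutError):
        return False

async def main() -> None:
    async with aiohttp.ClientSession() as session:
        # Fire the whole burst concurrently and collect the results.
        results = await asyncio.gather(*(hit(session) for _ in range(REQUESTS)))
    print(f"{sum(results)}/{REQUESTS} requests succeeded")

if __name__ == "__main__":
    asyncio.run(main())
```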

Which of Salad’s diversified workloads are you most excited about?

BOB: We currently monetize compute cycles. When you consider the unit economics, third parties can deploy a workload on SaladCloud for a fraction of the cost of spot instances from the big cloud providers. As we look to monetize resources like bandwidth—which costs most people around $90 monthly—therein lies a whole lot more value. Soon, we’re going to give users access to that à la carte menu of opt-in workloads. You and I are talking to big companies about edge computing, peer-to-peer (P2P) gaming, and fully encrypted AI workloads. Which are you most excited about?

Data centers are better suited for applications that can afford to send a packet of data across a long distance; the speed of light hasn’t been solved yet. But that doesn’t mean we’re a smaller market than the traditional cloud. I get most excited about the things SaladCloud can do that traditional data centers can’t. We haven’t determined every use case just yet, but we have some very interesting ideas in that P2P gaming or edge computing space.

If I can tether a low-powered mobile device to a high-powered PC nearby, what else can I do? That’s huge for cloud gaming or something like Steam Link, where you can stream games from your library from room to room. But what if you’re processing video and adding effects in real time? Or doing something with VR while you’re recording a live video? Instead of waiting for that video stream to travel all the way to a central server, SaladCloud can enable closer, lower-latency processing.

Think about how different the world is every time there’s a Pokémon Go update. Suddenly, you see people milling around a park, and they’re all playing Pokémon Go. That’s having an impact on the real world. Now, if you had servers connected to those Pokémon Go devices and doing things in real time for those users, that’s a use case that doesn’t exist today.

How will Salad power the Metaverse?

BOB: You’re getting dangerously close to describing how we’re going to power the Metaverse. Considering all the compute resources that’s going to require, how will Salad support that ecosystem in the future?

We’re in a place where we can train massive AI networks through federated learning. I’m oversimplifying, but it’s comparable to how Swype keyboards work. That’s not sending every single word back to a central server—because it would be a privacy and security nightmare for your passwords and other sensitive information. Your phone does the local processing, sends a snapshot of that information, and infers things about the input gesture to improve its knowledge base.

When you aggregate that across hundreds of thousands of users doing similar actions, you can allow individual users to contribute to something greater. You can build a single AI model for an entire network without giving away any information. That is a type of AI training that didn’t exist before we had high-powered devices sitting in our pockets. Now imagine scaling that to the orders-of-magnitude difference between a mobile phone and a high-end gaming PC. What are the use cases there? This applies not only to the gamer’s PC that’s locally connected to Salad’s P2P network, but also to how the phone and that PC work together.
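
As a rough sketch of the federated-learning idea described above, the Python below simulates a handful of devices that each train a tiny linear model on their own private data and send back only their updated weights, which a server averages into one shared model. The toy model, random data, and learning rate are assumptions for illustration only, not Salad’s implementation.

```python
# Minimal federated-averaging sketch (illustrative only): raw data never
# leaves a device; the server only ever sees weight updates.
import numpy as np

rng = np.random.default_rng(0)

def local_update(global_weights, local_data, local_labels, lr=0.1):
    """One gradient step on a device's private data (linear model, squared loss)."""
    preds = local_data @ global_weights
    grad = local_data.T @ (preds - local_labels) / len(local_labels)
    return global_weights - lr * grad

# Simulate a few devices, each holding private data the server never sees.
num_devices, num_features = 5, 3
global_weights = np.zeros(num_features)
devices = [(rng.normal(size=(20, num_features)), rng.normal(size=20))
           for _ in range(num_devices)]

for _ in range(10):
    # Each device trains locally and returns only its updated weights.
    updates = [local_update(global_weights, x, y) for x, y in devices]
    # The server aggregates by averaging the updates into one shared model.
    global_weights = np.mean(updates, axis=0)

print("aggregated model weights:", global_weights)
```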

And those are new use cases that no one is thinking about yet. If I can do 10% of the work on my phone but 90% of the work on a device that’s close, that will unlock new capabilities for both the end user and the companies running on SaladCloud. We won’t even be the ones who figure out what all those use cases are. When the SaladCloud network becomes as easy to use for developers or other startups as AWS or Microsoft Azure, they’re going to leverage, use, and build applications on top of it in ways we can’t imagine.

More SaladCast Coming Soon

Like this episode? Stay tuned for a continuing series of interviews featuring Bob and other faces from Kitchen HQ in the weeks ahead. These upcoming episodes promise an open-source look at the Salad recipe. In the meantime, browse our full SaladCast episode catalog.

