Looking for more podcasts? Tune in to the Salesforce Developer podcast to hear short and insightful stories for developers, from developers.
63. Streaming Music to Livestreamers
Hosted by Julián Duque, with guest Nate Beck.
Over the past few years, the popularity of livestreaming has taken off immensely. All over the world, viewers tune in to services like Twitch to watch people do everything from play video games to knitting to programming. Often, these streams are accompanied by music; but due to licensing issues, streamers can't play any song they want, or else they'll be hit by fees. Pretzel Rocks aims to fix that. By making it easier to access licensed music, viewers and streamers can enjoy the same tunes simultaneously. Julián Duque discusses the technical and legal challenges of the platform with Nate Beck, the service's founder and principal architect.
Julián Duque, a developer advocate at Heroku, interviews Nate Beck, the principal architect and founder of Pretzel Tech. Pretzel Tech is a company which has built Pretzel Rocks, a service that allows livestreamers to safely use licensed music. They do so by wrangling the needs of three different customers: broadcasters, who want to play fun music; record labels, which hold the rights to artists' music; and viewers, who want to play the same music at the same time as the broadcaster they are watching.
The technical ability to stream music is not terribly difficult; the challenge lies in fanning it out to thousands of listeners all at once. When a streamer logs in, 20,000 users might get notified and jump onto the stream within seconds. Pretzel deals with these network spikes in several ways. Although their backend is a basic Rails app hosted on Heroku, they use Lambda to handle the broadcast, as they don't have a constant stream of traffic requiring many dynos. They use a CDN with plenty of POPs around the world to host the music, ensuring that it's fast and stable for users. Other difficulties include dealing with the music industry. For a single popular song, Nate needs to coordinate with 28 different companies to agree on a licensing rate. The industry also uses a standard called DDEX, an XML format that tracks metadata for artists, albums, and releases, which requires a custom pipeline to parse.
Aside from that, Nate has nothing but praise for Twitch's API platform. The company is very community-focused, with a public roadmap and excellent documentation for integrators. Its users often have plenty of positive feedback for Pretzel and other extensions. Pretzel's main focus right now is on ensuring financial success for the artists whose music is broadcast.
Links from this episode
- Pretzel Rocks provides music for livestreamers
- 99 Lives is a record label with a catalog of stream-safe music
- Related: Pretzel Tech Handles Extreme Peaks in Demand with a Multi-Platform Architecture Centered on Heroku
Julián: Hello, hello, welcome to Code[ish]. My name is Julián Duque. I'm a developer advocate here at Heroku. And today we are going to be talking about something that I'm really interested in.
Julián: Today we have Nate Beck. He's the principal architect and founder of Pretzel Tech. Pretzel Tech is a company that has a product known as Pretzel Rocks, which enables livestreamers to use music that is licensed, and they can use it safely on the streams.
Julián: This adds a nice atmosphere for all of the different livestreamers. It's used by live coders like me and other people who usually stream while gaming. Nate, how are you doing?
Nate: Hey, good to be here. I am doing well.
Julián: Excellent. I hope I described well what you do, but it seems you know way more about Pretzel Tech and Pretzel Rocks, so I will let you introduce yourself and tell me a little bit more about what you're doing.
Nate: I have been a software developer for over a decade, and I've done all kinds of stuff, from backend development to front-end architecture. I actually come from the Flash community way back in the day: I was a Flash front-end and Flash game developer many years ago. So that was kind of my first inkling of a career.
Nate: And then when that all came crashing down in 2011, I switched over to more back end, Ruby on Rails, AWS, and Python, that kind of stuff.
Julián: Yes. As you mentioned, choosing the music for livestreams is really, really hard. I used to have a Twitch channel where I did live coding. At the beginning, I had an issue: I was using popular artists' music on my first episodes. Obviously my audience loved the music I was picking, but then after I finished my recordings and published my videos on YouTube, I started getting complaints, and my episodes started to get muted. Tell me more about Pretzel Rocks: what problem are you solving, and how are you solving it?
Nate: Music licensing is incredibly complex. And fun little story about me is I was working at a gaming startup back in 2011 or so. And when I was pulling like 10, 16 hour days, the new version of Starcraft came out, the new expansion. And I was like, I really want to go play that game, but I just don't have the time to do it. And that's when I really started to find livestreamers and livestreaming, who were playing that content that I wanted to play.
Nate: I'd have them up while I was working. That's when I fell in love with this livestream content creator community. And being a person who works in technology, I thought, I should be able to use my skills to help this community, the people that I really like and enjoy spending time with, and solve problems for them using technology.
Nate: And so I sought out: okay, what are the things that I can do? There are chat bots, there are all kinds of things. But man, what is something that no one's really approaching or trying to crack yet? Music was the answer to that. That was around five years ago, and I learn more about music every single day. It is one of the hardest things to license appropriately.
Nate: And so, lots of phone calls with attorneys. I think we've worked with over seven or eight different attorneys, because even within music or entertainment law, there are different specialties. We have specialists who work in the publishing side of music, and then we have specialists who work in the masters and the record label side of the house.
Nate: The idea is exactly as you described: the standard broadcaster or content creator isn't going to have the time, nor want, to do all of this legwork to get these artists on board, and even explain what Twitch or livestreaming is. So we do that work and pass it on to the broadcasters. That's essentially the product of Pretzel in a nutshell.
Nate: One of the other business units inside of Pretzel is a record label called 99 Lives. That record label owns all of the rights for its music, so it has the ability to license it however it sees fit. And 99 Lives works directly with the artists who are making the music to sign that music.
Nate: The problem with that is, even though 99 Lives has 650 tracks in its catalog, a broadcaster is going to go through that entire catalog. A broadcaster is going to consume eight hours of music a day on their broadcast, because their use case is so different.
Nate: If you're live coding on a stream, or playing a video game on a stream with music in the background, and you're going for six or eight hours, you need so much music: up to 40 or 50 hours a week of music content that you can include in your own content.
Nate: The needs for livestreaming are very different from the needs of someone making a YouTube video, for example, who finds the right song they want, puts it in Adobe Premiere, edits it together, and pushes it up to YouTube. They're only using 5 or 10 minutes' worth of music, as opposed to 40 hours in a week.
Nate: And so 99 Lives is really good for that model, the model of produced content, whereas livestreamers need more and more music. Pretzel already has a catalog of over 10,000 tracks, and we're getting ready to ramp that up to hundreds of thousands of tracks, because what we're trying to do is find the musicians out there who want content creators to be able to use their music.
Nate: And so, just like you said, I played the music that I liked to play, from Spotify or any of those things. The grand vision of Pretzel is that those artists you already like can see the benefit of allowing broadcasters to use their music.
Nate: And so that's the grand vision. We want to make it easy to license the music that you want. There's a big chasm we have to get over with traditional music to get there, so it's baby steps as we work through it.
Julián: That sounds beautiful. And I guess it has a big database of different genres, and you can filter and pick from that. Is that correct?
Nate: Yeah, it's mostly EDM right now, but that's because EDM is typically much easier to license than other music. The reason is that EDM is usually a single producer working on their own computer, because you can use your own Mac laptop now and make music in an afternoon.
Nate: As opposed to a rock band with five members who are on the road: it's just harder to get them all to agree to license, or to try this thing, or to do any of that stuff. And so we have a lot of EDM, but we do have some really fantastic rock.
Nate: Funny enough, we actually have rock and EDM pretty well covered. But now we have people asking for classical and jazz and country, of all things, for us to expand into. And so we're working on that.
Julián: Let's get into the more technical details about how you are solving this problem. I bet there are a lot of people using Pretzel Tech, and not only the streamers, but also the people who are watching the streams. There are popular streamers who have thousands of people watching them. So what are the load and system demands of the Pretzel Tech platform?
Nate: Yeah, you bring up a very valid point. When we're talking to investors or other folks like that, we have to differentiate between our customers: essentially, Pretzel has three types of customers.
Nate: We have the broadcasters.
Nate: We have the rights holders of the music.
Nate: And then we also have the viewer of the broadcaster. So that's that extra kind of step.
Nate: And so all their needs are slightly different. When you talk about the needs of the broadcaster, what are we talking about doing? Well, we have a music player, kind of like a traditional music player, that streams audio from our CDN, and they can select their song, select their playlist.
Nate: Just like you said, if I want a certain mood, I can select that station. I want the chill station or the ambient station because I'm going to be coding. And then Pretzel actually has a filter that will filter out vocals. So if you're like, I want to be hyped, but I don't want a vocalist singing, I don't want to be talking over someone singing.
Nate: You can actually turn off vocals with a filter in the system. So that's where Pretzel kind of focuses: nice features specifically for broadcasters. The nice thing about providing just that streaming system to broadcasters is that it's your traditional kind of streaming service. That's been proven, right?
Nate: Streaming services that have come out in the last decade have really run rampant. So the technology to stream music to a user for a music player, that's all pretty standard. And so your load is pretty linear. You've got a certain amount of concurrent people streaming, and then you have your peaks and valleys of the day, and it's pretty predictable traffic. It's not that bad.
Nate: Where we really run into some interesting technological challenges is when we deal with that other group, the viewers. So one broadcaster is one audio stream coming in to them, and then that audio stream gets composited into their Twitch stream, which is then multiplied and fanned out, essentially, to all of the viewers of that stream.
Nate: So like you said, there might be a streamer who is currently streaming to 20,000 viewers. That's one audio stream from our service, being composited into a video sent to Twitch, and then fanned out from Twitch to 20,000. So the audio is still being heard by 20,000 people, but we're only serving one audio stream. And so it's kind of nice that we can reduce the load we have to deal with to handle that.
Nate: However, we have features in the application that allow the viewers, those 20,000 people, to interact with our service. A perfect example of this is: what song is currently playing? Where do I go get it? I like this song, can I get it on Spotify? Where can I listen to it elsewhere?
Nate: Which is one of the big reasons why artists license their music to us is because we can drive traffic to their other sources of revenue, like other DSPs that are playing their music. And so that is one of the things that we provide.
Nate: But it creates this really interesting traffic pattern that is really hard to predict. And so if you have a broadcaster that's live broadcasting to 20,000 people, and then he says something about the song, for example, "Oh, this song is really good." Those users might then interact with our Twitch extension, or go search through our catalog or do any of those things. And we might see a massive spike of data.
Nate: The other thing that can happen as well is Twitch extensions, which are basically an iframe that's embedded inside of Twitch's own page. When a broadcaster goes live, those extensions all come online. And broadcasters can edit those extensions on the fly.
Nate: So a broadcaster who's broadcasting to 20,000 people might be like, "Oh, I don't have my Pretzel extension turned on." And they go into their editor in Twitch and add it. All of a sudden, all 20,000 of those users, all at the same time, load up this client that hits our back end, all within the same hundred milliseconds. And so you go from zero traffic to 20,000 requests in a matter of 100 to 300 milliseconds. That's how it happens, and so-
Julián: What a spike.
Nate: Yeah, there's a lot of weird stuff that you run into. And so we've hit some really interesting challenges in trying to deal with that.
Julián: So yeah, those sound like very, very great challenges, especially when you're dealing with the whole audience, the people who are watching the streams. I'm also a software developer and I work on production environments, and scaling is, I would say, one of the most difficult things when you're working with a production-grade application.
Julián: So how are you solving these issues? Tell me a little bit more about, for example, your architecture. How are you dealing with this amount of traffic, and what types of tools do you use? Now let's get way more technical.
Nate: The backend is all written in Ruby on Rails, so we have a pretty standard Ruby on Rails deployment. We mostly use API-only mode, so we're not generating pages, because most of our clients, the front-end music players and even the Twitch extension, are just standard React-based applications on the front end communicating with our back end.
Nate: So we don't composite pages or anything in Rails. We're just using it for ActiveModel and the ability to have our ORM, so we can talk to our data model and set a lot of that up.
Nate: Our main API currently runs on Heroku, with Rails and Postgres. And then we actually have a significant amount of Redis that we use for caching, as well as for some other interesting problems we have to solve.
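For readers unfamiliar with API-only Rails, the setup Nate describes is essentially one configuration flag: the view layer and cookie middleware are stripped out, and controllers serve JSON. This is a generic sketch of that configuration (the module name is hypothetical), not Pretzel's actual code:

```ruby
# config/application.rb (sketch; "PretzelApi" is a hypothetical module name)
require_relative "boot"
require "rails/all"

module PretzelApi
  class Application < Rails::Application
    # API-only mode: no view generation, slimmer middleware stack.
    # Controllers inherit from ActionController::API instead of
    # ActionController::Base, returning JSON to React clients.
    config.api_only = true
  end
end
```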
Nate: So for example, we have a concept of what's called a play token. A play token is the song that is playing: this user, playing this song, at this time. We use this for tracking things like our revenue: how do we pay out fairly per play? How do we track our plays and all those things? So we have these tokens that we generate.
Nate: And we generally generate those at a rate of three to six tokens per second through our peaks and valleys during the day. So there's constantly new music; there are new players listening to new music every second.
Nate: But selecting the next song can be computationally expensive. For example, when we generate a play token, we need information about the current user we're generating it for. So we keep a history of what songs they've listened to and what filters they have in place; we talked about that instrumental-only filter, so we need to filter out things that are not instrumental.
Nate: And then also, each individual user has the ability to blocklist tracks, so we need to make sure those aren't being selected. So we have this thing that could touch many tables. What we end up doing is use Redis very heavily in our song selection logic. We temporarily cache the state of each person's account so that, using a lot of the built-in Redis operations, we can very quickly select the next song we want to play.
Nate: Eventually we're going to fill that in with a recommendation engine, pre-generated queues, and a whole bunch of stuff that the traditional DSPs, streaming platforms like Spotify, do. But yeah, we're not there yet. We're a small team. Those are some of the interesting pieces of the Rails app that currently runs on Heroku.
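The per-user selection Nate describes maps naturally onto Redis set operations: subtract the blocklist, the listening history, and any filtered-out tracks from the station's track set, then pick a random member (an SDIFF plus SRANDMEMBER round trip). Here is a minimal sketch of that logic using plain Ruby Sets to stand in for Redis sets; the method name and all track IDs are hypothetical, not Pretzel's actual code:

```ruby
require "set"

# Sketch of Pretzel-style song selection. In production this would be
# Redis sets and an SDIFF + SRANDMEMBER round trip; plain Ruby Sets
# stand in here so the logic is visible. All names are hypothetical.
def select_next_track(station_tracks:, vocal_tracks:, blocklist:, history:, instrumental_only: false)
  candidates = Set.new(station_tracks)
  candidates -= vocal_tracks if instrumental_only  # honor the instrumental-only filter
  candidates -= blocklist                          # per-user blocklisted tracks
  candidates -= history                            # avoid immediate repeats
  candidates.to_a.sample                           # SRANDMEMBER equivalent
end

track = select_next_track(
  station_tracks: %w[t1 t2 t3 t4],
  vocal_tracks: Set.new(%w[t2]),
  blocklist: Set.new(%w[t3]),
  history: Set.new(%w[t4]),
  instrumental_only: true
)
# With these inputs, only "t1" survives all three subtractions.
```

Keeping each user's blocklist and history as Redis sets is what makes this cheap: the whole decision is a couple of set operations instead of a multi-table SQL join.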
Julián: Oh nice. I guess you're also using Heroku Redis and Heroku Postgres.
Nate: Yeah, yep. I am not an ops person, so monitoring servers, PagerDuty-type stuff ... I still get notified if our site goes down and whatnot. But babysitting a server is just not something I want to spend my time dealing with. And so Heroku has been great for that, because I don't have to worry.
Nate: And I've been using Heroku for years and years, and it just runs. I do get notices like, "Oh, your Redis needs to go down for maintenance," and we'll deal with that. Or, "Hey, your Postgres went down." And when I get those emails, I think to myself, "Oh, good, Heroku took care of it." I don't really need to worry about that.
Nate: That's one of the reasons I run on Heroku. There are other things we do, though, that are not really well suited for Heroku. We use Netlify for all of our front-end stuff; it's a CDN that also does a bunch of pre-building, pulls together a bunch of our React applications, and serves those applications.
Nate: However, one of the big problems we run into, and this is not necessarily a Heroku thing as much as a Rails thing, is how you deal with those big, massive spikes of traffic. Even with Heroku auto-scaling or any of those types of things, it's really difficult to scale Rails. If I were to just spin up 100 dynos, I'd then bottleneck my Postgres, or ... there are all kinds of things you're going to run into.
Nate: And there are really good ways to scale Rails, with sharding and all kinds of other things you can do. But that's for when you have a constant heavy load; that's when you want to focus on that kind of stuff. With intermittent, really spiky traffic, that's not ideal. Why would you spend thousands of dollars a month on infrastructure that only sees its peak utilization once a week or so? It's just not worth it.
Nate: For that, we actually use Amazon Web Services. We use it for all of the viewer-facing stuff, as opposed to the broadcaster-facing stuff interacting with the music service. So our search, our song request service, our what's-currently-playing: that is all served by an API Gateway, Lambda, and DynamoDB system running on AWS, so that we can actually slam that thing with a sledgehammer when needs be.
Nate: And so it caches that extra data over in what's almost an infrastructure caching layer that we use to deal with those users, separate from the main Heroku API stack.
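A viewer-facing endpoint like "what's currently playing" can be a small Ruby function behind API Gateway, exactly the shape Nate describes. This sketch replaces the DynamoDB read with an injectable lookup so it stays self-contained; the channel name, table layout, and field names are assumptions, not Pretzel's actual schema:

```ruby
require "json"

# Sketch of a viewer-facing "now playing" Lambda (Ruby runtime).
# In production the lookup would be a DynamoDB GetItem keyed by channel;
# here an injectable lambda stands in so the handler runs offline.
NOW_PLAYING = lambda do |channel|
  # Stand-in for: dynamodb.get_item(table_name: "now_playing", key: { "channel" => channel })
  { "somestreamer" => { "track" => "Hypothetical Song", "artist" => "Some Artist" } }[channel]
end

def handler(event:, context: nil, lookup: NOW_PLAYING)
  channel = event.dig("pathParameters", "channel")
  item = channel && lookup.call(channel)
  if item
    { statusCode: 200, body: JSON.generate(item) }
  else
    { statusCode: 404, body: JSON.generate(error: "unknown channel") }
  end
end
```

Because each invocation is stateless and backed by DynamoDB, 20,000 viewers hitting it inside a few hundred milliseconds just fans out across concurrent Lambda executions instead of queuing on dynos.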
Julián: Getting the benefit of the whole elastic platform that Lambda provides.
Nate: And we have hit it. Lambda has a soft limit; by default, when you set up a Lambda, you actually get 20,000 concurrent executions. And we've hit that with Twitch, just people turning on our extensions. I think the first time we did it, two very large broadcasters on Twitch went live at the same time, and we nailed that 20,000 concurrent limit. In fact, we got a phone call from AWS trying to verify whether our traffic was legitimate, or whether we were trying to DDoS them.
Nate: But yeah, that's where our stack pulls together. And then we also have a CDN. We've gone through a couple of different CDN providers; we currently use a company called StackPath. They're great because Twitch is global: we can't just get away with having one or two POP locations in the United States. So we end up using StackPath because they have 50 or 100 different POPs that we use all around the world.
Nate: And then there's one of our biggest expenses: bandwidth transfer, the actual price of moving data from point A to point B. That's the actual music and images we send for the player. Again, some of this goes to the viewers, but we don't send music data to the viewers.
Nate: But we're moving around 30 terabytes a month of just raw music transfer. And of course, there's no reason to run that through Heroku or anything else. Even if it were through Heroku, you'd still have to be pulling from S3, and so you run into some really weird things.
Nate: The standard rate for the lowest tier of AWS bandwidth transfer is eight cents a gigabyte. We're getting a lot lower than that, about half, with StackPath. But we still have to pay that eight cents just to get the files from S3 to StackPath.
Nate: And so if you're not careful, you can actually double-bill yourself. Say we were serving a file through Heroku: even then, you'd still have to pay for the bandwidth coming out of S3 to the internet, and that's eight cents a gig.
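The double-billing math Nate warns about is easy to sketch. Using the rough figures from the conversation (30 TB a month, $0.08/GB S3 egress, roughly half that through StackPath) plus a hypothetical cache-hit rate, the difference between serving everything from origin and serving through a CDN looks like this:

```ruby
# Back-of-the-envelope CDN cost sketch using figures from the conversation.
# The 95% cache-hit rate is a hypothetical assumption, not a Pretzel number.
MONTHLY_GB     = 30_000   # ~30 TB of music transfer per month
S3_EGRESS      = 0.08     # $/GB, S3 -> internet (lowest tier)
CDN_RATE       = 0.04     # $/GB, roughly half the S3 rate via StackPath
CACHE_HIT_RATE = 0.95     # hypothetical: only cache misses pull from origin

# Serving straight from origin: every gigabyte pays the S3 egress rate.
origin_only = MONTHLY_GB * S3_EGRESS

# Serving through the CDN: every gigabyte pays the CDN rate, and only
# cache misses additionally pay S3 egress (the "double billing" slice).
with_cdn = MONTHLY_GB * CDN_RATE + MONTHLY_GB * (1 - CACHE_HIT_RATE) * S3_EGRESS

puts format("origin only: $%.2f", origin_only)   # $2400.00
puts format("with CDN:    $%.2f", with_cdn)      # $1320.00
```

The S3-to-CDN leg is unavoidable, but a warm cache amortizes it across many viewer requests, which is also why a busy Australian POP makes the Australian experience cheaper and faster at the same time.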
Nate: So it's very easy to accidentally double-bill yourself. One of the big expenses is just figuring out how to efficiently get stuff from point A to point B. And I think one of our hardest cases is Australia; there's a very vibrant and active livestream community in Australia.
Nate: Getting our data to Australia and into the Australian POP is sometimes a challenge, and we want to make sure their experience is really good. They'll come and complain, but I think we've done pretty well getting that put together. The more users we have in Australia, the better the service ends up being, because everything gets hot in the cache in that CDN POP.
Nate: Just making sure all of this is running and copacetic is where I spend a lot of time. Tuning was a really big thing: about 10 months ago, we had a huge burst in growth, and we didn't have a lot of caching happening in our Rails app. So I had to spend about a month going through and tuning the application, because we were running into scaling issues just on the API, getting it to scale up and handle the amount of load hitting the Rails API.
Nate: We used Scout, which was fantastic for figuring out where the inefficiencies in our application were, and then we went through, tracked those down, and fixed them.
Julián: I mean, it applies to one of the pieces of advice I always try to give people when I'm doing consulting: you need to use the right tool for the job. If you try to use only one single thing to solve all types of problems, you're going to have issues. It seems you have a very well-distributed set of tools and services to solve the problems you're facing.
Nate: Another factor for us is cost. Pretzel is completely bootstrapped. It started as a side project for me and has now turned into my full-time job, which is fantastic. But it means we're still doing things like making sure our worker queue is doubling up, or using multiple processes or multithreading, so that we can get the best bang for our buck out of our dynos on Heroku.
Nate: Those are the things that we're still doing to cut a few hundred dollars a month off of our bill. And it's important for us to do that because we're still really really trying to make this work.
Julián: Besides the limits that Lambda imposes, you told me you are reaching 20K concurrent users. What other challenges or limitations are you having with your current platform?
Nate: Search is probably one of our biggest issues right now. We're working with Algolia to help us with our search and indexing. It's kind of pricey, but it ends up being actually quite good for our use case. So here's an example search use case that we're trying to solve.
Nate: You have a viewer who wants to request a song on a broadcaster's channel. They're going to pay, you know, $1 or something to request a song, and they need to be able to search through the Pretzel catalog and find the track they're trying to request. So we need to let them type in and search.
Nate: But we also have to apply certain rules. There's a rule that a viewer cannot request a song that a broadcaster has blocklisted. Or let's say the broadcaster has certain filters in place, like that instrumental-only filter we talked about: if you're trying to request a song that has vocals, but they're not allowing vocals right now, it's going to be grayed out so you can't select that song at the moment.
Nate: So we're talking about search, but at that viewer layer, which needs to be able to scale. We get the search results back, and then we have to apply them against the filters of the particular broadcaster whose channel is being searched right now.
Nate: One of the ways we get around this, though, is that we don't apply that filtering in the search. The results still come back saying, this track matches your search query, but it gets filtered out after the fact. We still want to render that track as being available, but have it disabled from selection in the interface.
Nate: And so that's actually one of the tricks we've been using: we don't need to apply the filter on the Algolia search itself, so we still get to use the really high-speed Algolia searching. But then our API returns another data set: based on those search results, here is the data you need to know whether these are acceptable selections for the viewer. And we merge those together.
Nate: But yeah, search has been a saga to make sure that we do search really well.
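The merge step Nate describes can be as simple as annotating each raw search hit with a selectability flag after the fact, so the index stays fast and generic while the per-broadcaster rules live in the API. This sketch uses plain Ruby hashes in place of real Algolia hits; all field names and track data are hypothetical:

```ruby
# Sketch of Pretzel-style post-search filtering: the search index returns
# raw hits, then the API annotates each one with whether the viewer may
# request it on this broadcaster's channel. Field names are hypothetical.
def annotate_hits(hits, blocklist:, instrumental_only: false)
  hits.map do |hit|
    blocked  = blocklist.include?(hit[:id])
    filtered = instrumental_only && hit[:has_vocals]
    # Keep the hit visible in the UI, but mark it unselectable.
    hit.merge(selectable: !(blocked || filtered))
  end
end

hits = [
  { id: "t1", title: "Chill Instrumental", has_vocals: false },
  { id: "t2", title: "Pop Anthem",         has_vocals: true  },
  { id: "t3", title: "Blocked Banger",     has_vocals: false },
]
annotated = annotate_hits(hits, blocklist: ["t3"], instrumental_only: true)
# t1 stays selectable; t2 fails the instrumental-only filter; t3 is blocklisted.
```

Because the broadcaster's rules never touch the search query, every viewer on a channel can share the same cached index results, and only this cheap annotation pass is per-channel.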
Julián: This is impressive. Tell me more about the future of Pretzel. What are you thinking about, not only on the product side of things, but more specifically about the technical challenges you're having and the current architecture? Where are you going with the product?
Nate: One of the big things is that we just recently made a migration, because we have quite a few front-end clients. There's a front-end client that is the web app; there's the Electron app that wraps the web app, so that's mostly the same app. But then there's also Streamlabs OBS, which is broadcast software, so we actually have a Streamlabs OBS app. That's a whole separate client over there.
Julián: I was going to ask about that, because I was streaming on Mac and then Linux, and I had a lot of issues dealing with OBS and setting things up. Then I found Streamlabs on Windows, and I'm now moving everything to Windows as my main OS for streaming.
Julián: Tell me a little bit more about the clients. I'm excited that you support Streamlabs, so I will be able to use it.
Nate: Streamlabs has been a fantastic partner for us over the last year. They have what they call the Streamlabs App Store, which is your traditional kind of app store, but our app runs inside their OBS software. So you can actually install the Pretzel Rocks Streamlabs app into OBS or into Streamlabs OBS and use it directly in there. That gives you the music and everything else.
Nate: Ultimately, you could use it there, or you can use the standalone Pretzel desktop application and route the audio output from that application into an input in OBS anyway. That's how the whole system works whenever you're broadcasting; that's your typical broadcast setup. And so yeah, we have quite a few improvements planned.
Nate: As for other clients, we have the Twitch extension, which is a separate extension, and www.pretzel.rocks, which is our homepage for account management and all those things. That's a separate application. So yeah, we have a whole bunch of applications in use.
Nate: And because we bootstrapped it, we didn't really architect them all to be interchangeable or reusable. So sometime this year, we're basically going to need to take a bit of time and pay down some technical debt. Right now we have three or four different ways that we facilitate search, and we want to standardize them all through one main service.
Nate: That would be our search service, with the flexibility to do what we need to do, and we'd move to a much more microservice kind of architecture, especially because we now know our data access patterns so much better than we did two years ago.
Nate: We know what our usage patterns are, and we know where our pain points are. So one of the things we're probably going to end up doing is moving this play token service into a Dynamo table, just because, over the last two years ... man, that table is probably over 60 million records now.
Nate: So we're on Heroku Postgres, but we have a table that's getting three to six writes per second, all day, every day, and then reads on that table as well. On a single table with 60 million records that's going to keep growing, and growing faster as we grow, anytime we try to do a join or anything on that table, it can bring the system to a halt.
Nate: Early on, that was one of the growing pains I ran into. One of the things we have to do is pay our artists every month: 70% of the revenue that comes in on subscriptions goes back out to the rights holders themselves, which is fantastic.
Nate: But we need to know how much each person gets, and so we need to pull that play token data. There was a point in the past when everything was in the production Heroku Postgres, and I ended up locking up the service so nobody could actually use it, just because I was running a heavy analytics query against it.
Nate: Those are some of the pain points that we experienced while we were growing. And so yeah, now we have an analytics follower, a read-only follower of production that's completely separate from the system, so we can run an analytics workload without affecting production.
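The follower setup Nate describes can be as simple as pointing analytics jobs at a different connection string. A minimal Ruby sketch, assuming a hypothetical env var name like `ANALYTICS_FOLLOWER_URL` (on Heroku the follower's URL actually appears under an add-on-specific config var):

```ruby
# Minimal sketch: route analytics work to a read-only follower so heavy
# queries never lock up the primary. The env var name is an assumption;
# on Heroku, a follower's URL appears as an add-on-specific config var.
def analytics_database_url(env)
  # Prefer the read-only follower; fall back to the primary if no follower.
  env.fetch("ANALYTICS_FOLLOWER_URL") { env.fetch("DATABASE_URL") }
end

analytics_database_url(
  "ANALYTICS_FOLLOWER_URL" => "postgres://follower.example/app",
  "DATABASE_URL"           => "postgres://primary.example/app"
)
# => "postgres://follower.example/app"
```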
Nate: There's a lot of different places to take it. Right now we're in a hybrid: some of the clients are using our original, handwritten API, and others are using GraphQL. We've moved most things to GraphQL; the Streamlabs OBS app is all GraphQL, and we really, really like how that works. So we need to go back to our other clients and upgrade them to use the new GraphQL endpoint.
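For flavor, a query against such a GraphQL endpoint might look like the sketch below; the field names are invented for illustration and aren't Pretzel's actual schema.

```ruby
# Hypothetical GraphQL query a client (e.g. the Streamlabs OBS app) might
# send to fetch the currently playing track. Field names are assumptions.
NOW_PLAYING_QUERY = <<~GRAPHQL
  query NowPlaying($channelId: ID!) {
    channel(id: $channelId) {
      nowPlaying {
        title
        artist
        album
      }
    }
  }
GRAPHQL

# A client would POST this document with variables like { "channelId": "42" }
# to the single GraphQL endpoint, instead of hitting bespoke REST routes.
```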
Nate: And so for us, the big thing is moving the really critical paths out of Heroku, or putting them on stuff that can actually deal with the critical paths, that isn't going to fall over when you, you know, hit it really hard, and that can scale.
Nate: But I'm a big fan of Rails. I'm a big fan of Ruby, and I love ActiveRecord; Postgres always makes sense to me. I'm a big fan of relational databases when they're being used for the right job. When I have a data model that is "here is a track, here is an album, here are these other things," that fits. But for some of the use cases, I'm going to go to a NoSQL-style key-value data structure; play tokens are the perfect use case for that.
Nate: But now we're at the point where it's like, "Okay, yeah, we have debt." Keeping this data relational is detrimental. But that doesn't mean we have to move everything over wholesale, which seems to be what a lot of the industry tries to do: "Well, this isn't working for this use case, so let's move everything to this other thing," which also has its downsides. And so we really try to be as pragmatic as possible in figuring out how to build.
Julián: That's what I call hype driven development, pretty much. "New technology! Let's move everything, because this is going to solve all my problems."
Nate: Exactly. And so-
Julián: Even though it's not the best.
Nate: Yeah. In terms of features, right now I'd say one of our most lacking features is just the ability to browse through the Pretzel catalog and see what's in there, because we have a lot of really good music.
Nate: And right now, if you want to sample the Pretzel catalog, you just kind of have to open the app, let it run, and listen to music. And people don't have a lot of time to do that. Originally, the Pretzel desktop app was a very, very small form factor application.
Nate: And the reason we did that is because we identified that when broadcasters are broadcasting, they don't want a big, huge window in the way; they want something small and out of the way that has the functionality they need, where they can make the changes they need to at a glance. So we call that our streamer mode.
Nate: But the other use case is when they're not broadcasting, when they actually want to go through and browse the music, select it, and all those other things. We call that library mode, and that's actually what the Streamlabs app has.
Nate: Streamlabs is only library mode; there isn't a streamer mode. And so once we get those all unified, we also want to really, really focus on making our library mode publicly available. So anyone going to the Pretzel Rocks website can go through the catalog, sample it, see what's in there, like things, dislike things, and really engage with it, because there's a lot of really good stuff in the catalog.
Nate: And that catalog is just going to grow bigger and bigger and bigger. And so we want to enable that to happen. And right now, our system is really well situated for that one broadcaster use case. And so we want to extend it so that it serves the greater customer base a lot better.
Julián: Nice. So the future looks bright.
Nate: Yeah, I'm pretty excited about it. And then I think the one other thing that is crazy is that the music industry uses a standard called DDEX, D-D-E-X. It's an XML-based standard for communicating metadata about music and releases and all of that. And so over the last two weeks, that's what I've been working on: getting our DDEX ingestion pipeline working.
Nate: And getting that to work has been a whole thing in and of itself. But it's the necessary part of expanding the catalog larger and larger: setting up an easier way to get music into the catalog and into the system, and scaling it up, without it being a big headache for rights holders.
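As a rough sketch of what one ingestion step looks like, here's a simplified Ruby example pulling release metadata out of a DDEX-style XML snippet. Real DDEX ERN messages are far richer, and this element layout is only loosely modeled on the standard; treat it as an illustration, not a real ERN parser.

```ruby
# Heavily simplified sketch of a DDEX-style ingestion step in Ruby.
# Real DDEX ERN messages are far richer; this element layout is only
# loosely modeled on the standard, as an illustration.
require "rexml/document"

xml = <<~XML
  <NewReleaseMessage>
    <ReleaseList>
      <Release>
        <ReferenceTitle><TitleText>Good Vibes</TitleText></ReferenceTitle>
        <DisplayArtistName>Some Artist</DisplayArtistName>
      </Release>
    </ReleaseList>
  </NewReleaseMessage>
XML

doc = REXML::Document.new(xml)
# Walk every Release element and extract the fields the catalog needs.
releases = doc.get_elements("//Release").map do |release|
  {
    title:  release.elements["ReferenceTitle/TitleText"].text,
    artist: release.elements["DisplayArtistName"].text
  }
end
# releases => [{ title: "Good Vibes", artist: "Some Artist" }]
```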
Nate: That's actually a pain point right now. If you are a rights holder and you want to get your music on Pretzel, you have to send us a CSV along with all of your lossless audio files and your images, and then we go back and forth trying to get it imported. There's a whole process.
Nate: As opposed to what traditionally happens now with streaming music platforms, where you go to a distributor like TuneCore, CD Baby, or DistroKid, you give them your music, and then they distribute it.
Nate: Pretzel has grown enough now that the distributors are willing to work with us, and so we are now in the process of integrating directly with them. But with any sort of standards-based thing, you're trying to make sure you're implementing the standard in the way it's meant to be implemented, and then there are still differences of opinion within the standard.
Nate: And so that leads to some really interesting issues that you run into. That's been my ... I don't want to call it a waking nightmare, that's not right. But that's been my headache for the past two weeks, getting that all set up.
Julián: Speaking of nightmares, and wrapping up this episode: what keeps you up at night?
Nate: Well, literally last night it was DDEX. But yeah, one of the biggest things is just that there's so much work still to be done here. We have a product, and I'm really proud of what we've built. I'm excited to keep building and growing this thing. But man, there's so much more work to be done.
Nate: There's analytics for the rights holders so they can see where their music's being used; that's a huge thing. And then there's making sure that the experience is really good. One of the big things that we're focusing on this year is how we can enable broadcasters to make more money. And so one of the things that is really unique to how Pretzel works, especially on Twitch: remember we talked about that song request feature?
Nate: Well, the way that works is the broadcaster has it enabled, a viewer pays $1 to request a song, the broadcaster gets 80 cents, 14 cents of that goes to the rights holder, and then six cents comes to Pretzel. That's how the split works.
Nate: So we have facilitated a system where a person can pay for a request and the broadcaster makes 80 cents on it. We have broadcasters who are able to actually increase their revenue for their channel just by accepting song requests.
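The split Nate describes (of each $1.00 request: 80 cents to the broadcaster, 14 to the rights holder, 6 to Pretzel) can be written out in a few lines of Ruby; the function name is made up for illustration.

```ruby
# The $1.00 song-request split described above, computed in cents.
# The helper name is made up for illustration.
def split_request(amount_cents)
  {
    broadcaster:   amount_cents * 80 / 100,  # 80% to the broadcaster
    rights_holder: amount_cents * 14 / 100,  # 14% to the rights holder
    pretzel:       amount_cents * 6 / 100    #  6% to Pretzel
  }
end

split_request(100)
# => { broadcaster: 80, rights_holder: 14, pretzel: 6 }
```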
Nate: It hasn't really taken off as much as I want it to. We have a couple of outliers among our broadcasters; there's one broadcaster who made over $10,000 in the last year in song requests alone.
Nate: Yeah. So how can we take his success and replicate it all the way down? I think that's where catalog familiarity is a big thing, and setting those things up. How do I extend that? How do I enable broadcasters to make more money and make sure that the rights holders are being fairly compensated for their stuff being used?
Nate: Those are the things that keep me up at night. I see that vision, because the grand vision of what we want to do with Pretzel is all of this traditional music, the music that you want to play: we want to make that music available for this use case. And it's really hard to do that.
Nate: Take Drake, for example. Drake's an interesting case because he went on stream with Ninja, and they had 600,000 concurrents watching them. What's crazy is that Ninja can't actually play Drake's music on stream, even though he knows Drake, talks to Drake, and plays video games against Drake. He can't play that music on stream. "God's Plan" by Drake has 19 publishers associated with it, and nine record labels.
Nate: So that means you have to get 28 different companies to agree on a licensing rate. You have to get them to even respond to you, then you have to get them to agree on a rate, and then you have to make sure that everyone gets paid. And if one person gets a higher licensing rate, they all want that same rate. And it's all done through email and phone calls; it's all a manual process.
Nate: Most people don't understand, that's how hard it is to license music. And so you could do it, you could go get that license and play it on your stream, it's just going to cost you 50 or $100,000, for the use case, to do the work just to get the rights to play that song.
Nate: And you're not going to make that money back. So that's the goal. The grand vision is to get to the place where it makes sense, because there's a lot of power in influencers, a lot of power in that multiplication effect. Last year we had 30 million plays.
Nate: So, 30 million plays through the player. But of those, I think close to 15 million were broadcasted plays; about half was just people listening in the player, not actually broadcasting. And those 15 million plays that were broadcast on Twitch and Mixer generated 431 million impressions.
Nate: So that 15 million turned into 431 million people hearing that music. And one of the things that Pretzel does is we have a link that drives traffic to the artist's Spotify page for that song, and we generate a quarter of a million clicks every single month driving traffic there. So we're trying to prove that there's something here. And that's what keeps me up at night.
Nate: How do I do that? How do I prove this? You know, all the tech stuff we can figure out how to solve; that's what we do as tech people. We see problems, we figure out how to solve them. The DDEX stuff? I'll figure that out. But what keeps me up at night is absolutely: how do I build this thing? How do I take this to the next level? That's it, at least for me.
Julián: So to finalize, do you have any advice for developers who want to build for these types of problems, dealing with a lot of bandwidth or scaling?
Nate: Yeah. If people want to target Twitch, which I highly recommend, it's a really great community; the broadcasters are great, the viewers are great. Twitch is very community-focused. You have people who can make a living playing the video games they love, and it's fantastic to see that.
Nate: One of the things I would recommend from a tech standpoint, if you're targeting Twitch: it's a lot different now than it was even a year or two ago. Twitch has made a lot of great strides to enable broadcasters. One of the big things is with Twitch extensions when they first came out.
Nate: Pretzel has been in basically the alphas and betas of all of the new products that Twitch has come out with; we're pretty well partnered with them on a lot of this. And so when extensions came out, we had to build our own infrastructure. We're talking about that Lambda backend and a lot of those things.
Nate: One of the things they have is state management. So, what song is currently playing on this channel? When the extension loads, we need to pull that data down into the front end so we can show it to the viewer. Well, when extensions first came out, you had to roll all of that yourself. But now Twitch has set up their own configuration backend that has a persistent data store.
Nate: It's not huge, but you can put some of that data there and leverage it. Twitch knows this is a problem; they don't want everybody who has some success to come out and say, "Oh, man, a huge streamer used this Twitch extension, and then all of a sudden I have a $2,000 AWS bill."
Nate: And so they have these new services that they've put out that allow you to offload a lot of that effort onto them. So read the documentation on their site. They're constantly doing that; they even have a public-facing roadmap about what they're adding. So yeah, it's a great industry to get into. I like the Twitch community very much, and I highly recommend it.
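As a sketch of the offloading Nate mentions: Twitch's extension Configuration Service stores small JSON blobs per channel, so an extension can read state like the current song without the developer running their own backend. The helper below only shapes a payload; the segment name "broadcaster" matches Twitch's documented segments, but the content shape and the helper name are assumptions.

```ruby
# Hypothetical sketch: shaping a "now playing" update for Twitch's extension
# Configuration Service, which stores small per-channel JSON blobs that the
# extension front end can read without a custom backend. The segment name
# "broadcaster" matches Twitch's documented segments; the content shape and
# helper name are assumptions.
require "json"

def now_playing_segment(channel_id, track)
  {
    segment: "broadcaster",
    broadcaster_id: channel_id,
    # The service stores opaque strings, so the state is serialized as JSON.
    content: JSON.generate(title: track[:title], artist: track[:artist])
  }
end

now_playing_segment("44322889", { title: "Good Vibes", artist: "Some Artist" })
```

The payload would then be sent to Twitch's configuration endpoint, which is exactly the effort (state storage, fan-out to viewers) that used to require your own Lambda-backed infrastructure.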
Julián: Nate, thank you, thank you very much for this very inspiring, and deeply technical, story. I can't wait to get back to my live coding sessions and give it a try. It will definitely solve a bunch of my problems picking the music for my streams.
Julián: And I hope all the people out there listening to us take a look at Pretzel and see if it's technology you can use for your streams. And hey, it's running here on Heroku.
Julián: Nate, thank you again for this inspiring story and for joining us.
Nate: Thanks for having me. It was great. I'm looking forward to dropping into your stream.
A podcast brought to you by the developer advocate team at Heroku, exploring code, technology, tools, tips, and the life of the developer.