Welcome to the podcast. Today, we're joined by Chris Allen, CEO of Red5, and Brett Fasullo, SVP of Sales. Red5 is known for ultra-low-latency streaming that powers interactive video experiences. Chris and Brett, welcome. How are you guys doing?

Great, man. Thanks for having us on the podcast.

Thank you.

To kick off, how do you define innovative broadcast workflows with real-time streaming?

Yeah, innovative is kind of a loaded word, right? Because what counts as innovation for some people is pretty basic for others. For me, a lot of the people leveraging real-time video in these broadcast workflows are doing that REMI (remote integration) thing: you've got remote contributors who can't be in the studio but need to be part of it. Real-time video streaming is absolutely essential for that, so they can stay in sync with the comms and everything else. So the remote contributor to the broadcast is a big part of it. But even at the studio, on the network, a lot of the older workflows involved high-latency streams, like using HLS to monitor what's going on, and that adds many seconds of delay. And that matters if you're trying to do commercial cuts during a live sports game. One of the professional baseball groups we work with, with Zixi and Red5 as part of that, is using this for commercial slating. They're watching the live games, and if there's a walk-off home run, they need to trigger a certain type of commercial for that, versus something that runs between innings. Those things are dynamic: they happen during the game, they're not always at a set time, and they need to be pretty precise about when they're going to cut. Before they had us in the mix, it was always a bit of guesswork. You'd be like, okay.
We're going to cut now. But they were cutting off commentators as they went; it was nasty guesswork. So I don't know if that's innovative, but it's very practical, and it's a good use of real-time streaming in those kinds of workflows.

Yeah, it's interesting, because a lot of broadcasters used to do this all on-prem; people would be listening in a truck. But now they're moving that fully remote, and your technology is really helping with that. Are you seeing a lot of customers moving from hardware-based production to more remote production with your technology?

Yeah, absolutely. And the movement to cloud deployments is a big part of that. Even when it's not necessarily cloud, they may be hosting it themselves, or on a provider like Lumen using edge computing. It's still important that it's accessible outside the traditional studio or broadcast group, and made accessible to people. And obviously, the latency has to be low enough that they can't really perceive they're not in the studio or the production truck.

And then, how many events can you handle? We're talking about that major group, but what about our other shared customer, FloSports, where they've got thirty thousand events a year? You're not exactly going to do on-site production with everybody there all the time.

You bring up a good point, Brett. A lot of it's a cost thing. For larger groups or big corporations that already have studios, this is more about the remote-worker scenario, REMI-type production.
But second- and third-tier groups are looking to get the same experience without having that huge production studio, and the tools that real-time video provides with Red5 make that really possible.

Yeah. With new technologies like this, there's sometimes a lot of resistance to the transformation, and challenges in the setup. Are you seeing any of that with your current customers, or are you making it super simple for them to get started?

I think there's always resistance from the old-school broadcast guys. They're like, oh no, this is kind of a toy, we're not going to use that, I don't believe it'll work. So we've definitely had that kind of resistance. I don't want to call out any specific names here, but one motorsport group we're dealing with has the pretty traditional trucks and all of that stuff. And we're like, okay, but at the same time, we need to get graphic overlays of what's happening with the vehicles in sync with the live video from the driver cams, time-sync that, and get it back into the production workflows. You can't do that with the old tools. First of all, an SDI cable is not going to connect to a car that's speeding around; it's not happening. So all of these workflows are not possible without the new technology, and you still get resistance from the old-school approach.

Yeah, I agree with Chris on that. Getting outside the major sports, when you start to look at the tier twos, we have another really exciting partner in Team Track, and they're building a new facility. And we're seeing a lot of the changes happening.
So, you mentioned the eighteen-wheelers before. You're not seeing those in remote productions, but even for on-site productions, you have to be plug-and-play ready. New tech, especially in production, is replacing a lot of these outdated technologies at a very fast clip, so you can have groups coming into a multipurpose facility. It also eliminates a lot of unnecessary equipment in the production room. We're seeing a lot of those change-outs, and you're not going to be able to do it without some of this newer tech supporting real time.

You know, another interesting aspect of this is marrying new tech with old approaches. A lot of it has basically been put together with duct tape; it's nasty. We're seeing a lot of uses because we have this new TrueTime Meetings product that we're rolling out pretty soon in our TrueTime Studio, and it's allowing remote commentators and guests to join shows. What they've had to do before our solution is basically set up a Zoom meeting, with a laptop in the studio just running the Zoom meeting so they can get ISOs of each of the individual video contributors. It's a nasty workflow. Instead, we're providing: hey, just send them this URL. It works in a browser. They join the meeting, they're able to have a conversation, and it's very intuitive. And that stream is just part of the Red5 system, which we can then hand off to Zixi or whatever platform they're using to manage their streams. They've got their ISOs already; it's already set up. It's a very seamless workflow, and you don't need these extra laptops sitting around just to be a participant in a Zoom meeting to grab that video.
You know? So I think it's a matter of simplifying a lot of this stuff too, right?

Yeah. So you guys are really famous for sub-250-millisecond interactive streaming. You've talked about production use cases, where you're ad-inserting and doing live commentary. What other use cases do you foresee for your extreme-low-latency solution?

I mean, we're seeing tons of this, right? I think one thing is that the lines are blurring between a broadcast scenario and the fan experience. If we're still talking sports, we've got another customer, The Famous Group, a great partner of ours. They have this Vixi Live product. We're using people's cell phones in the arenas to put selfie cams up on the billboards in the arena. But now imagine the same scenario where those people are able to become part of the broadcast. People at home cheering, all of that can be streams that become part of the broadcast. It's user-contributed, but user-contributed content going out as part of the broadcast. And I can see the same thing happening with news, where you've got people on-site at something that's unfolding, using their phones to stream it in. Imagine a CNN app or something like that: people can contribute, those feeds can become part of the live broadcast, and the professional journalists and commentators are getting those feeds in and able to ask questions of remote viewers. So I think the lines are blurring. I don't think it's going to stay traditional, where everything happens in the studio and it's only professional camera operators.
And one other thing I would add on other use cases: we do a lot of work in surveillance and intelligence, and two things come up against that two hundred and fifty milliseconds. That bar is still high; the tech team has been spending a lot of time trying to shave off every millisecond possible, because now you're not only interacting with the stream. When we're talking about drones and things like that, you have to have really quick maneuvering: press a button, left, right. So that's one piece of it. The other side is two-way communication. If you have people in the field in mission-critical situations, and that can be weather or other emergency-type situations, you don't have time for course correction. If somebody is in a mission-critical situation, you have to have that two-way communication all the time. Those are perfect applications for us too.

Yeah, Brett, you bring up an interesting point about two-way communication. Typically, broadcasting has always been one-way: you watch what's coming out of the stadium. Do you foresee potential new avenues of business where you can bring in someone at home helping to cheer on their team and show them on the jumbotron?

Oh, a hundred percent. I mean, this is happening already. The Famous Group is doing this with the Vixi Home product, and they're able to pull that into the billboards in the arena, but it can also become part of the broadcast, so viewers at home are seeing themselves, which is a pretty cool effect. And then bringing in that talk-radio thing: hey, let's take the next caller, except now it's video chat on phones. If you keep the latency down on the broadcast stream coming back, it makes that possible. And then, yeah, there's also sports betting.
That's another aspect coming in where the latency is really critical. If you want to do microbetting on the next play, you'd better believe it has to be super low latency; otherwise, people are going to game the system like crazy.

Then to add one more piece: again, it comes back to the latencies and the communication. On engagement, we've been talking about engagement for almost the past five years, right? Advertising, AI. Chris, you've been spending a lot of time on this, but there are two very important pieces, especially with server-side ad insertion (SSAI), and really how fast things can move. And I was hoping you'd be able to elaborate on AI, its importance, and what's being triggered in these live streams now. We've been working a lot on that on the tech side.

Yeah. So, real-time video comes in, and then you're able to manipulate that stream on detection. In the broadcast industry, the biggest one is swear-word bleeping, or blurring out things in the video that shouldn't be there. If somebody flips the bird at a sports game, they want to blur that out. So we're actually doing a lot of work pulling raw frames out of the video, handing them off to an AI model which can detect certain things, blurring them out, re-encoding the video back into our system, and making that part of the live broadcast stream, all on the fly. And then there's the ability to do live transcriptions. I know there's some legislation happening, in the US anyway, to force that issue and make streams more accessible for everybody, particularly people with hearing disabilities.
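As an aside, the frame-moderation loop Chris describes (pull raw frames, hand them to a detection model, blur whatever it flags, then re-encode) can be sketched in a few lines. This is a minimal illustration, not Red5's implementation: `detect_regions` is a hypothetical stand-in for a real AI model, and frames are tiny 2D grayscale arrays rather than decoded video.

```python
# Hypothetical sketch of a per-frame moderation step: blur every
# region that a detector flags, then return the frame for re-encoding.

def box_blur(frame, x0, y0, x1, y1):
    """Flatten a rectangular region to its mean value (a crude
    stand-in for a real blur kernel)."""
    region = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    mean = sum(region) // len(region)
    out = [row[:] for row in frame]  # copy so the input frame is untouched
    for y in range(y0, y1):
        for x in range(x0, x1):
            out[y][x] = mean
    return out

def moderate_frame(frame, detect_regions):
    """Run the detector and blur every flagged (x0, y0, x1, y1) box."""
    for (x0, y0, x1, y1) in detect_regions(frame):
        frame = box_blur(frame, x0, y0, x1, y1)
    return frame

# Example with a stub detector that always flags the top-left 2x2 corner.
frame = [[0, 100, 50], [200, 100, 50], [10, 20, 30]]
clean = moderate_frame(frame, lambda f: [(0, 0, 2, 2)])
```

In a real pipeline the detector call would be the expensive step, which is why the frame extraction itself has to be as close to zero-latency as possible.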
And being able to do that in real time is now possible, because these language models are so good, and so fast. We're actually working with an NVIDIA Parakeet model right now which can do this, and it's instant: it transcribes exactly what you're saying as you're saying it. It's very cool.

On that point he made: in Europe it's already a requirement, and I think it's being rolled out now, so the US is following on that. But also on the AI end, there are the advancements we've been working on in surveillance, being able to detect anomalies and things like that. There are so many use cases for AI, and it's just the beginning. It's fun.

Yeah, it's amazing you're still able to do all that within two hundred and fifty milliseconds of interactivity. It's crazy. Where do you see this type of production going in the next few years?

Man, that's a hard thing to predict, right?

Well, think about your daughter and my son. They're watching more clips. Live is still everything, but at least on my side, I see it going in the direction of young kids' attention spans: they want to see what they want when they want it. They don't want delays, they don't want spoilers, all the things we've been battling.

I think that's a kind of obvious trend; it's a continuation. The thing that's probably less obvious, and is going to become more critical, is that as fake video gets to the point where you just can't tell, the in-real-life validity of live video, where you actually can tell, is going to matter, and the latency of it is going to become critical because it's harder to fake. There's going to be a kind of authenticity to it.
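As a side note, the live-transcription workflow Chris mentioned usually ends with the timed words from the speech model being packaged into caption cues. Here is a minimal sketch of just that formatting step, assuming a hypothetical ASR output of (word, start, end) tuples; this is not the Parakeet API, only an illustration targeting WebVTT-style timestamps.

```python
# Hypothetical sketch: turn timed words from an ASR model into a
# WebVTT-style caption cue. The (word, start_sec, end_sec) tuples are
# an assumed input shape, not actual NVIDIA Parakeet output.

def vtt_timestamp(seconds: float) -> str:
    """Format seconds as an HH:MM:SS.mmm WebVTT timestamp."""
    h = int(seconds // 3600)
    m = int(seconds % 3600 // 60)
    s = seconds % 60
    return f"{h:02d}:{m:02d}:{s:06.3f}"

def make_cue(words):
    """Join a run of timed words into a single caption cue string."""
    start, end = words[0][1], words[-1][2]
    text = " ".join(w[0] for w in words)
    return f"{vtt_timestamp(start)} --> {vtt_timestamp(end)}\n{text}"

cue = make_cue([("real", 0.0, 0.3), ("time", 0.3, 0.6), ("captions", 0.6, 1.25)])
# cue is "00:00:00.000 --> 00:00:01.250\nreal time captions"
```

For live use, cues like this would be emitted continuously as the model produces words, rather than batched after the fact.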
And I think regular broadcast is going to be switched out with more, I wouldn't say necessarily user-generated content, but taking user-generated content and marrying it with probably AI-generated content too. This little studio we have right now, where we've got a real plant and everything, all of this stuff is not going to be necessary pretty soon, right? But then, how do I know that this is actually real? I think there's going to be something to that, and I don't know exactly what it is. But being able to authenticate it, to say this actually is a real live stream from this real person, is going to become critical.

Oh yeah, that's going to be so tough in the future. I don't know how people are going to differentiate. AI is really doing a lot these days. How would you advise your customers that want to leverage some of those technologies? Would it be something you sell, or would it be additive to what they're currently using?

Yeah. So, our plans with Red5 Cloud are to make several models available for people. We have this frame extractor, which does super-low-latency ripping of frames to hand off to a model like that. We're going to split it into two aspects. If you want just the frames and you want to handle everything from that point, great: we'll hand them off to you and let you do that, probably into an S3-compatible cloud storage you can pull from. That's obviously going to add a little bit of latency for that part of it. But we're also going to provide models people can just plug in and start using on Red5 Cloud pretty soon.

And early on, we touched on these companies that are hesitant to move to these types of workflows. If you could give them some advice on how to get started, what would you say?
I think the biggest thing is, you don't have to replace everything you're doing now. There are ways to take bits and pieces of the new technology and incorporate them into your current workflow, because the hardest thing to do is say, okay, we're going to rip all of this out, get rid of it, start again, and build it all with a cloud-based solution. I don't think that's very practical. It's better to take it piece by piece: solve real problems first, and use the new technology to solve those problems as you go.

I absolutely agree, especially with that last statement. One trend I'm seeing, and I think it's a good thing, is the support of the ecosystem. We've seen a lot of unfortunate layoffs and things like that, operationally speaking. Unfortunate or not, I think it was necessary. We're seeing a lot of old technology from the bigger broadcast-type companies phasing out very quickly. But I think there's a new acceptance now of "we didn't build everything here." If you choose partners wisely, if you work with companies that can provide really strong technical ability, software, hardware, whatever it is, you add that to the ecosystem. And as Chris said, you don't have to replace everything; you just have to retool. That's absolutely the direction, and we're seeing it happen right now across the industry, for sure.

So, we have another partner that works with Zixi as well as Chauffeur. Speaking of AI, they've got a really interesting ad platform that actually looks at what's happening in the broadcast and tries to tailor ads to it, so that when you trigger an ad, it's actually going to be relevant to that content. The cool thing about real-time streams delivered down to fans and viewers is that we also have a dedicated connection per person.
So we actually know who they are, as opposed to an HLS feed, where you've got to go pull segments off the CDN and there's no way to know who's who. In this case, we can do really targeted ad insertion, per person, and make it super relevant. And yeah, this company Sofar that we're working with, we're going to be rolling out something pretty cool with those guys pretty soon.

Yeah, their technology is really cool; I saw it at the last show. So you're saying you can now monetize down to the user level, not just the household, but the specific user that's watching the stream?

That's right. And it's because there's a UDP connection going to each user, as opposed to a go-fetch-this-segment-file model over HTTP. The problem with that approach is you're fetching from multiple servers on a CDN, and you have no way of knowing who's receiving the stream, so you can't really do super-targeted stuff that way.

And when you're doing that super-targeted insertion, I bet it's going to be really valuable to advertisers. Are you able to insert the ads into the real-time stream, or is it an HLS version?

No, we insert it into the WebRTC stream directly. And Media over QUIC (MoQ) is next; we're working on some really cool stuff with that too. So with MoQ, it doesn't matter to us what the protocol is, but yeah, this is one aspect of WebRTC that we can really take advantage of.

Okay, cool. So before we wrap, we talked a little bit about AI and how you guys are using it. Would you want to share what your last AI query was? Like a ChatGPT query?

We certainly use AI quite a bit. The most interesting one: we just did a video with CAST, one of our other partners.
And I needed a clip of some guys cheering at a sports thing. So my last ChatGPT request was: hey, can you take this still image I've got and switch out the people? Because I didn't feel comfortable just using it verbatim. And it did a great job at that. Then I grabbed that image and handed it over to Veo, however Google is branding that product now, and it animated them like a real game video. It's perfect, man. It's crazy. But yeah, that was it.

Yeah. Chris and Brett, thanks for being on the episode today and sharing your ideas. It sounds like Red5 has a lot of great products coming out in the future. We can't wait to see them.

Thank you. Thank you for having us.

Thanks, guys.