Not sure how many here would actually be interested, but for those who might, I finally published my RTMP crate: https://crates.io/crates/rml_rtmp
RTMP is the main protocol used for the creation and distribution of live video streams. For example, if you want to send live video to YouTube or Twitch, your encoder sends the video to their servers using the RTMP protocol. The purpose of this crate is to make it easy to implement RTMP clients and servers in your own applications.
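To give a feel for the API, here's a rough sketch of how a server-side connection flows through the handshake and session types (simplified, with error handling stripped out, so double check the docs for the exact signatures; the commented-out writes stand in for your own socket I/O):

```rust
use rml_rtmp::handshake::{Handshake, HandshakeProcessResult, PeerType};
use rml_rtmp::sessions::{ServerSession, ServerSessionConfig, ServerSessionResult};

// Each new TCP connection gets its own handshake state machine.
fn new_connection() -> Handshake {
    Handshake::new(PeerType::Server)
}

// Called whenever bytes arrive on a client's socket. The library never
// touches the network itself; it just tells you what bytes to send back.
fn on_bytes_received(handshake: &mut Handshake, input: &[u8]) {
    match handshake.process_bytes(input).unwrap() {
        HandshakeProcessResult::InProgress { response_bytes } => {
            // write response_bytes to the socket and wait for more input
            let _ = response_bytes;
        }
        HandshakeProcessResult::Completed { response_bytes, remaining_bytes } => {
            // write response_bytes, then feed any leftover bytes to a session
            let _ = response_bytes;
            let (mut session, _initial_results) =
                ServerSession::new(ServerSessionConfig::new()).unwrap();
            for result in session.handle_input(&remaining_bytes).unwrap() {
                match result {
                    ServerSessionResult::OutboundResponse(packet) => {
                        // write packet.bytes to the socket
                        let _ = packet;
                    }
                    ServerSessionResult::RaisedEvent(event) => {
                        // react to connect/publish/play requests here
                        let _ = event;
                    }
                    _ => {}
                }
            }
        }
    }
}
```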
To demonstrate this, I created an example RTMP server using mio. OBS or FFmpeg can send video into it, and VLC and MPlayer clients can connect to watch the feed live. It can also pull live video from external RTMP servers, as well as relay video out to other external RTMP servers.
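If you want to try that, the example server speaks standard RTMP, so the usual tools work against it; for instance (the app name and stream key here are just placeholders, use whatever the example expects):

```sh
# push a local file into the server as a live stream
ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f flv rtmp://localhost:1935/live/mystream

# watch it live
vlc rtmp://localhost:1935/live/mystream
```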
I'm happy with the performance I've been able to get with it. According to the benchmarks I've created, the average time to deserialize a 10KB video packet and distribute it to 2 subscribers is 24 microseconds (not counting network time). At 60fps each frame window is about 16.7ms, so 24 microseconds of processing is a tiny fraction of that, leaving lots of budget for custom logic while keeping video smooth.
I tried to do a pretty big documentation pass, so if anyone is interested and finds anything confusing, let me know.
This is a stepping stone toward my end goal, which is to create a live video streaming distribution server (like Wowza) that's free and open source (and will remain so), and performant enough to run efficiently on a Raspberry Pi or other low-end devices.
How would you suggest going about streaming an image?
I know I could use FFmpeg for that, but I'd like to use this library to push an image to an RTMP server when there are no connections.
Don't worry about providing a code sample or anything, just point me in the right direction. Thanks!
Hey there, sorry for the silence, I haven't checked this forum in a while.
This library unfortunately does not do anything with H264 or any transcoding; that's a bit out of my realm of knowledge and expertise (though maybe one day). However, reading between the lines, it sounds like what you want is to stream a slate to all viewers while you aren't streaming, and then have it automatically transition to your video once you start publishing. That isn't easy to do, because H264 expects the initial video packets to contain all the parameters it needs to decode the video; without that data, a player has no idea how to read the stream.
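To make that concrete: on an RTMP stream those decoder parameters arrive as an AVC "sequence header" in the first video message, and servers typically cache it so they can replay it to anyone who joins mid-stream. Here's a minimal sketch of that pattern (my own illustration of the FLV video tag layout, not something this library does for you):

```rust
// In the FLV video tags that RTMP carries, byte 0 packs the frame type and
// codec id (0x17 = keyframe + AVC) and byte 1 is the AVC packet type
// (0 = sequence header, i.e. the SPS/PPS decoder parameters).
fn is_avc_sequence_header(data: &[u8]) -> bool {
    data.len() >= 2 && data[0] == 0x17 && data[1] == 0x00
}

struct StreamState {
    // Cached so late-joining viewers get the decoder parameters first.
    sequence_header: Option<Vec<u8>>,
}

impl StreamState {
    fn on_video_data(&mut self, data: &[u8]) {
        if is_avc_sequence_header(data) {
            self.sequence_header = Some(data.to_vec());
        }
        // ... then forward `data` to all current subscribers ...
    }
}
```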
Unfortunately, this means your slate video must use the same parameters that your live stream will be using, because most video players don't support switching H264 parameters on the fly. To accomplish that, you'd really need fine-tuned control of FFmpeg or GStreamer: always transcode with two sources (an image for the slate on one input, the live RTMP stream on the other), with custom code swapping between them.
This is probably more of a job for the Rust GStreamer bindings than anything my library provides.
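For example, something like this might work as a starting point (a very rough, untested sketch with the gstreamer crate; the element names come from GStreamer's standard plugin set, the URLs and file names are placeholders, and in practice you'd also need videoscale/videorate so both inputs agree on caps):

```rust
use gstreamer as gst;
use gst::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // input-selector switches between the slate and the live feed without
    // tearing down the encoder, so the H264 parameters never change.
    let pipeline = gst::parse_launch(
        "input-selector name=sel ! videoconvert ! x264enc tune=zerolatency \
         ! flvmux streamable=true ! rtmpsink location=rtmp://localhost/live/out \
         filesrc location=slate.png ! pngdec ! imagefreeze ! sel.sink_0 \
         rtmpsrc location=rtmp://localhost/live/in ! flvdemux ! h264parse \
         ! avdec_h264 ! sel.sink_1",
    )?;

    pipeline.set_state(gst::State::Playing)?;
    // From here, custom code would watch the bus and flip sel's
    // `active-pad` property when the live source starts or stops.
    Ok(())
}
```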
Thanks for the reply! Continuing to look into it, I also came to the conclusion that GStreamer is better suited for this particular project. I appreciate your comprehensive answer!
Unfortunately, I left my live streaming gig a year and a half ago and haven't made much progress since. I still have a theoretical RTMP library on my GitHub, but people have identified some failures with it, and I haven't been in the right mindset to pick this large project back up :(.