Align two instances of OBS

kevinmclane

New Member
Hey,

I'm not exactly sure of the right way to define this. In FMLE it is called StreamSynchronization; in Wirecast it is called Keyframe Aligned. I run two redundant encoder computers taking a live feed from a splitter into their respective USB 3.0 video capture cards. I need a way for the two encoders to align somehow, using some kind of NTP server for a UTC reference. In my case the video is streamed over RTMP to dual Wowza servers, transcoded to HLS, and then joined at Akamai in a master playlist.

If this type of setting is not turned on in FMLE or Wirecast, then when the primary encoder's connection is broken the stream will stall out, and you will only be able to grab the backup after a player refresh. Without the setting on, if you start the two encoders at the same time you will get a seamless failover exactly once. With this type of setting turned on (StreamSynchronization in FMLE, Keyframe Aligned in Wirecast) you get seamless failover back and forth no matter when you start or stop the encoders, as long as there is enough buffer in the encode of the playlist.

Does anyone know what exactly is going on under the hood, and the right questions to ask to begin trying to build such a plugin for OBS?
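For anyone unfamiliar with the setup being described: the "joined at Akamai in a master playlist" part typically looks like the HLS redundant-stream pattern, where the master playlist lists the same rendition twice and the player fails over to the second entry when the first stops updating. The hostnames below are made up for illustration:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=2000000,CODECS="avc1.4d401f,mp4a.40.2"
https://wowza-a.example.com/live/stream/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,CODECS="avc1.4d401f,mp4a.40.2"
https://wowza-b.example.com/live/stream/playlist.m3u8
```

The failover is only seamless if both Wowza origins are producing segments with matching numbering and timing, which is exactly what the alignment setting in the encoders is for.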
 

pkv

Developer
I checked the RTMP code closely; it relies on librtmp with tweaks from Jim (the lead dev). It sets the initial timestamp as the UTC epoch (edit: though it's presumably the NTP epoch and not the Unix epoch, which shouldn't matter).

So it's already set as it should be. It will use the time provided by the OS. Just set the clocks on the two machines to sync with the same NTP server and that should be it. On Windows 10 you would go to Settings > Region > Additional date, time & regional settings > Set the time and date > Internet Time > Change settings.

Test and report back, but it should work provided you've synced your machines to the same NTP server.
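To make the idea concrete, here is a minimal sketch of what "the initial timestamp comes from the OS clock" means in practice. The function names are mine, not OBS's; this just illustrates why two NTP-disciplined machines end up on a shared timeline:

```python
import time

def initial_rtmp_timestamp_ms():
    """Milliseconds since the Unix epoch, as reported by the OS clock.

    If both encoder machines discipline their clocks against the same
    NTP server, this value agrees across them to within NTP accuracy,
    so streams started at different times still share a common timeline.
    """
    return int(time.time() * 1000)

def to_rtmp_32bit(ts_ms):
    """RTMP timestamps are 32-bit millisecond values, so an epoch-based
    timestamp wraps modulo 2**32 (roughly every 49.7 days)."""
    return ts_ms & 0xFFFFFFFF
```

The wrap is why "absolute" RTMP timestamps are really epoch timestamps modulo 2^32; both encoders wrap at the same instant, so alignment survives the wrap.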
 

kevinmclane

New Member
Coming back to this with some more info. I reached out to Telestream support asking what their Keyframe Aligned option actually does. Does this additional info help to clarify my question? Maybe there is a list of settings I can type into the x264 options field that would accomplish this?
Support ticket highlights:

"Hi,
I am interested in what Keyframe Aligned technically does to the stream. I can see the difference in the .ts segments in the browser traffic (Wirecast RTMP > Wowza HLS packetizing). With it off, the numbering starts at media_1.ts, but with it on, the numbering starts at a very high number. What is happening to the stream?"

"When Keyframe Aligned is checked, it facilitates adaptive bitrate streaming by ensuring that keyframes from multiple streams are in sync, along with the keyframes' timestamp, DTS, and PTS values.
But this is true only if those other streams also have the option turned on and have the same keyframe interval.
To accomplish this, Wirecast disables scene detection and manually inserts a keyframe at the exact keyframe interval specified. When Keyframe Aligned is enabled, absolute timestamps are also enabled."
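The keyframe half of that description does map onto real x264 parameters, so as a partial answer to the "x264 options field" question, something like the following should reproduce the fixed-interval, no-scene-cut behaviour (whether OBS's custom x264 options field accepts this exact space-separated form is my assumption; the values assume 30 fps with a 2-second keyframe interval):

```
keyint=60 min-keyint=60 scenecut=0
```

`scenecut=0` disables scene-cut keyframe insertion, and `keyint=60 min-keyint=60` pins keyframes to exactly every 60 frames. Note this only covers keyframe placement; the absolute-timestamp half lives in the RTMP output layer, not in x264, so x264 options alone can't fully replicate Keyframe Aligned.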
 

kevinmclane

New Member
Hi,

I am coming back to this thread with some feedback from Wowza support. The Wowza feature I am trying to feed data to is called cupertinoCalculateChunkIDBasedOnTimecode.


"The key to making it work is absolute timecodes that are aligned to some epoch. Wirecast and FMLE enable this with their alignment and synchronization settings."

Implementing this feature, or a plugin that provides it, would be greatly appreciated.
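The name cupertinoCalculateChunkIDBasedOnTimecode suggests the principle even if the exact Wowza formula isn't public here: derive the segment number from the absolute timecode instead of counting from 1. A sketch of that idea (my illustration, not Wowza's actual code):

```python
def chunk_id_from_timecode(timecode_ms, chunk_duration_ms):
    """Derive an HLS segment number from an absolute epoch timestamp.

    Two encoders whose timestamps share the same epoch produce the same
    ID for the same wall-clock instant, so their .ts segment numbering
    lines up and a player can fail over between the two origins.
    """
    return timecode_ms // chunk_duration_ms

# With 10-second chunks, an epoch-based timestamp yields segment
# numbers in the hundreds of millions:
print(chunk_id_from_timecode(1_700_000_005_000, 10_000))  # -> 170000000
```

This would also explain the earlier observation about the browser traffic: with alignment off, numbering starts at media_1.ts (a counter from stream start); with it on, it starts at a very high number (a counter from the epoch).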
 

wowza_roger

New Member
Hi, Kevin has reached out to Wowza Support, and the feature that is required is absolute timecodes, so that stream timecodes between separate encoders and streams are aligned.
 

dodgepong

Administrator
Community Helper
wowza_roger said:
"Hi, Kevin has reached out to Wowza Support, and the feature that is required is absolute timecodes, so that stream timecodes between separate encoders and streams are aligned."
Hi Roger, thanks for your comment.

By "Absolute Timecodes" do you mean NTP-synced timestamps? For example, if MISP timestamps were inserted into h264 SEI, would that accomplish this?
 

wowza_roger

New Member
Hi,

It's the actual RTMP timestamp (or extended timestamp) that we are referring to. It is synchronised to NTP or some other epoch. The idea is that it's synchronised between encoders and across encoder restarts, so the timecode is always monotonic.

In Wirecast, it's enabled with the Keyframe Aligned option, and in FMLE it was enabled with the StreamSynchronization option. FMLE also had an extra option to set the epoch that the timecodes are synchronised on.
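For anyone looking at implementing this, the "timestamp (or extended timestamp)" distinction Roger mentions comes from the RTMP chunk format: the chunk message header carries only a 24-bit timestamp field, and values of 0xFFFFFF or above are signalled by setting that field to 0xFFFFFF and sending the real value as a separate 32-bit extended timestamp. Epoch-based absolute timestamps are always in that range, so they always travel in the extended field. A sketch of the split (function name is mine):

```python
def encode_chunk_timestamp(ts_ms):
    """Split a timestamp into the 24-bit RTMP chunk-header field and the
    optional 32-bit extended-timestamp field.

    Returns (header_field, extended) where extended is None when the
    value fits in 24 bits. Epoch-based timestamps never fit, so aligned
    streams always use the extended form.
    """
    if ts_ms >= 0xFFFFFF:
        return 0xFFFFFF, ts_ms & 0xFFFFFFFF
    return ts_ms, None
```

So an implementation in OBS would seed its stream timestamps from an NTP-disciplined wall clock (rather than zero at stream start) and let the existing chunking code emit the extended-timestamp form.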
 

dodgepong

Administrator
Community Helper
So Wowza reads that data from the transport protocol rather than the video stream. Is there a spec we can follow to ensure we implement it the same way as FMLE/Wirecast, if we were to implement it? Or is it just "run Wireshark on FMLE and copy what they do"?
 