{"id":2064,"date":"2019-06-10T15:50:21","date_gmt":"2019-06-10T07:50:21","guid":{"rendered":"http:\/\/myblog.emhct.net.cn\/?p=2064"},"modified":"2019-06-10T15:50:21","modified_gmt":"2019-06-10T07:50:21","slug":"why-we-created-srt-and-the-difference-between-srt-and-udt","status":"publish","type":"post","link":"http:\/\/myblog.emhct.net.cn\/index.php\/archives\/2064","title":{"rendered":"Why We Created SRT and the Difference Between SRT and UDT"},"content":{"rendered":"\n<h5 class=\"wp-block-heading\">Article Source: <br \/><a href=\"https:\/\/www.haivision.com\/blog\/broadcast-video\/created-srt-difference-srt-udt\/\">https:\/\/www.haivision.com\/blog\/broadcast-video\/created-srt-difference-srt-udt\/<\/a><\/h5>\n\n\n\n<h5 class=\"wp-block-heading\"><em>Editor\u2019s Note: This post originally appeared on the&nbsp;<a rel=\"noreferrer noopener\" href=\"https:\/\/github.com\/Haivision\/srt\/wiki\" target=\"_blank\">GitHub Wiki for SRT<\/a>. It has been slightly modified for formatting.<\/em><\/h5>\n\n\n\n<p>Some people have asked us why we\u2019re using the&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/udp-based-protocol-udt\/\" target=\"_blank\" rel=\"noreferrer noopener\">UDT<\/a>&nbsp;library within our&nbsp;<a href=\"https:\/\/www.haivision.com\/about\/partners\/srt-alliance\/\" target=\"_blank\" rel=\"noreferrer noopener\">SRT protocol<\/a>. In fact, some people have claimed that&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/srt\/\" target=\"_blank\" rel=\"noreferrer noopener\">SRT<\/a>&nbsp;is just a slightly modified version of UDT and that UDT is known to be useless for live video transmission. Guess what: the latter is true. UDT was designed for high-throughput file transmission over public networks. However, SRT is far from being a slightly modified version of UDT.
I\u2019ll get into the details, but will start with a little bit of history.<\/p>\n\n\n\n<p>Haivision has always been known for the lowest&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/video-latency\/\" target=\"_blank\" rel=\"noreferrer noopener\">latency<\/a>&nbsp;video transmission across IP-based networks \u2014 typically MPEG-TS unicast or multicast streams over the&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/udp-based-protocol-udt\/\" target=\"_blank\" rel=\"noreferrer noopener\">UDP protocol<\/a>. This solution is perfect for protected networks, and if&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/packet-loss\/\" target=\"_blank\" rel=\"noreferrer noopener\">packet loss<\/a>&nbsp;became a problem, enabling&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/fec-encryption\/\" target=\"_blank\" rel=\"noreferrer noopener\">forward error correction<\/a>&nbsp;(FEC) fixed it. At some point we were asked whether it would be possible to achieve the same latency between customer sites in different locations, whether in different cities, countries, or even continents.<\/p>\n\n\n\n<p>Of course it\u2019s possible with satellite links or dedicated&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/multi-protocol-label-switching-mpls\/\" target=\"_blank\" rel=\"noreferrer noopener\">MPLS<\/a>&nbsp;networks, but those are quite expensive solutions, so people wanted to use their public internet connectivity instead.
While it\u2019s possible to go with&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/fec-encryption\/\" target=\"_blank\" rel=\"noreferrer noopener\">FEC<\/a>&nbsp;in some cases, that\u2019s not a reliable solution, as the amount of recoverable&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/packet-loss\/\" target=\"_blank\" rel=\"noreferrer noopener\">packet loss<\/a>&nbsp;is limited, unless you accept a significant amount of&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/bandwidth\/\" target=\"_blank\" rel=\"noreferrer noopener\">bandwidth<\/a>&nbsp;overhead.<\/p>\n\n\n\n<p>After evaluating the pros and cons of different third-party solutions, we found that none satisfied all our requirements. The lack of insight into the underlying technology drove us to the decision to develop our own solution, which we could then integrate deeply into our products. That way, it would become the \u201cglue\u201d that enables us to transmit streams between all our different products, locally or across far distances, while maintaining our low&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/video-latency\/\" target=\"_blank\" rel=\"noreferrer noopener\">latency<\/a>&nbsp;proposition.<\/p>\n\n\n\n<p>There were a few possible choices to consider:<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li>The TCP-based approach. Problems for live streaming: network congestion and too-slow&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/packet-loss\/\" target=\"_blank\" rel=\"noreferrer noopener\">packet loss<\/a>&nbsp;recovery.<\/li><li>The UDP-based approach.
General problems:&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/packet-loss\/\" target=\"_blank\" rel=\"noreferrer noopener\">packet loss<\/a>,&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/jitter\/\" target=\"_blank\" rel=\"noreferrer noopener\">jitter<\/a>, packet reordering, and delay.<\/li><li>Reliable UDP. Adds framing and selective retransmit.<\/li><\/ul>\n\n\n\n<p>Having had a history with UDT for data transmission, I remembered its&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/packet-loss\/\" target=\"_blank\" rel=\"noreferrer noopener\">packet loss<\/a>&nbsp;recovery abilities and just started playing with it. Though not designed for live streaming at all, it kind of worked when using really big buffers. I handed it over to one of our extremely talented networking guys in the embedded software team (thanks, Jean!) and asked him whether he\u2019d be able to make this a low&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/video-latency\/\" target=\"_blank\" rel=\"noreferrer noopener\">latency<\/a>&nbsp;live streaming solution. I didn\u2019t hear anything back for quite a while and had almost lost hope when he contacted me to tell me he had rewritten the whole&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/packet-retransmission\/\" target=\"_blank\" rel=\"noreferrer noopener\">packet retransmission<\/a>&nbsp;functionality so that it could react to packet loss immediately when it happens, and that he had added an encryption protocol, which he had specified and implemented for other use cases before.
Nice \ud83d\ude42<\/p>\n\n\n\n<p>We started testing, sending low&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/video-latency\/\" target=\"_blank\" rel=\"noreferrer noopener\">latency<\/a>&nbsp;live streams back and forth between Germany and Montreal, and it worked! However, we didn\u2019t get the latency down to the level we had hoped to achieve. The problem we faced turned out to be timing-related (as it often is in media).<\/p>\n\n\n\n<p>What happened was this:<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/www.haivision.com\/wp-content\/uploads\/SRT_Transmission_Bad_Signal.png\" alt=\"Bad Signal\"\/><\/figure>\n\n\n\n<p>The characteristics of the original stream on the source network got completely changed by the transmission over the public internet. The reasons are delay,&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/jitter\/\" target=\"_blank\" rel=\"noreferrer noopener\">jitter<\/a>,&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/packet-loss\/\" target=\"_blank\" rel=\"noreferrer noopener\">packet loss<\/a>&nbsp;and its recovery on the dirty network. The signal on the receiver side had completely different characteristics, which led to problems with decoding, as the audio and video decoders didn\u2019t get the packets at the expected times. This can be handled by buffering, but that\u2019s not what you want in low&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/video-latency\/\" target=\"_blank\" rel=\"noreferrer noopener\">latency<\/a>&nbsp;setups.<\/p>\n\n\n\n<p>The solution was to come up with a mechanism that recreates the signal characteristics on the receiver side.
That way we were able to dramatically reduce the buffering. This functionality is part of the&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/srt\/\" target=\"_blank\" rel=\"noreferrer noopener\">SRT<\/a>&nbsp;protocol itself, so once the data comes out of the&nbsp;<a href=\"https:\/\/www.haivision.com\/products\/srt-secure-reliable-transport\/\" target=\"_blank\" rel=\"noreferrer noopener\">SRT protocol<\/a>&nbsp;on the receiver side, the stream characteristics have been properly recovered.<\/p>\n\n\n\n<p>The result is a happy decoder:<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/www.haivision.com\/wp-content\/uploads\/SRT_History_Good_Signal.png\" alt=\"Good Signal\"\/><\/figure>\n\n\n\n<p>We publicly showed&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/srt\/\" target=\"_blank\" rel=\"noreferrer noopener\">SRT (Secure Reliable Transport)<\/a>&nbsp;for the first time at IBC 2013, where&nbsp;<strong>we were the only ones to show an&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/hevc-h-265\/\" target=\"_blank\" rel=\"noreferrer noopener\">HEVC<\/a>-encoded live stream, camera to glass, from a hotel suite outside the exhibition directly onto the show floor, using the network provided by the RAI<\/strong>. Everybody who has been at a show like this knows how bad these networks can get. And the network was bad. So bad that we expected the whole demo to fall apart, having pulled the first trial version of SRT directly from the labs.
The excitement was huge when we realized that the transmission still worked fine!<\/p>\n\n\n\n<p>Since then,&nbsp;<a href=\"https:\/\/www.haivision.com\/products\/\" target=\"_blank\" rel=\"noreferrer noopener\">we have added SRT to all our products<\/a>, enabling us to send high-quality, low&nbsp;<a href=\"https:\/\/www.haivision.com\/resources\/streaming-video-definitions\/video-latency\/\" target=\"_blank\" rel=\"noreferrer noopener\">latency<\/a>&nbsp;video from and to any endpoint, including our mobile applications. Of course, there were improvements to be made, and the protocol matured along the way, until NAB 2017, when we announced that SRT is now open source.<\/p>\n\n\n\n<p>You can learn more about SRT at the&nbsp;<a href=\"http:\/\/www.srtalliance.org\/\" target=\"_blank\" rel=\"noreferrer noopener\">SRT Alliance website<\/a>.<\/p>\n\n\n\n<p>To view SRT on GitHub and start contributing to this open-source movement,&nbsp;<a href=\"https:\/\/github.com\/Haivision\/srt\/\" target=\"_blank\" rel=\"noreferrer noopener\">click here<\/a>!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Article Source: https:\/\/www.haivision.com\/blog\/broadcast-video\/created-srt-difference-srt-udt\/ Editor\u2019s Note: This post originally appeared on the&nbsp;GitHub Wi &hellip;<\/p>\n<p class=\"read-more\"><a href=\"http:\/\/myblog.emhct.net.cn\/index.php\/archives\/2064\">read
more<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[88,87,70],"tags":[],"class_list":["post-2064","post","type-post","status-publish","format-standard","hentry","category-udt","category-network_protocol","category-audio_video_image"],"_links":{"self":[{"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/posts\/2064","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/comments?post=2064"}],"version-history":[{"count":1,"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/posts\/2064\/revisions"}],"predecessor-version":[{"id":2065,"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/posts\/2064\/revisions\/2065"}],"wp:attachment":[{"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/media?parent=2064"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/categories?post=2064"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/myblog.emhct.net.cn\/index.php\/wp-json\/wp\/v2\/tags?post=2064"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}