Discussion:
[Live-devel] Support for raw video format
GENESTIER Denis
2018-07-12 07:32:31 UTC
Permalink
Hi Ross,

We have to deal with a special camera that delivers only raw video, i.e. an uncompressed format (RFC 4175).
We are aware that it is heresy to stream video that way, but you know how clients can be with their legacy devices!!!

As a first draft for supporting this, I intend to create a subclass of MultiFramedRTPSource (e.g. a RawVideoRTPSource.cpp) that would be instantiated in MediaSession.cpp.
That should allow an RTSP client to receive that kind of stream.

If I send you a patch, would you integrate it in your code?

Best regards,
Denis.
Ross Finlayson
2018-08-02 09:12:21 UTC
Permalink
First, my apologies for the long delay in responding to this question. (I have been traveling recently, including a couple of weeks “en vacances” in your wonderful (though unusually hot :-) country.)
Post by GENESTIER Denis
Hi Ross,
We have to deal with a special camera that delivers only raw video, i.e. an uncompressed format (RFC 4175).
We are aware that it is heresy to stream video that way
Not at all. This RTP payload format would not have been standardized if it weren’t thought to be appropriate. It’s perfectly reasonable to stream raw video over networks (e.g., corporate intranets) that can handle streams with this high a bitrate (and with low packet loss).
Post by GENESTIER Denis
As a first draft for supporting this, I intend to create a subclass of MultiFramedRTPSource (e.g. a RawVideoRTPSource.cpp) that would be instantiated in MediaSession.cpp.
That should allow an RTSP client to receive that kind of stream.
If I send you a patch, would you integrate it in your code?
Yes, very likely. Do you also plan to implement this for transmitters - i.e., a subclass of “MultiFramedRTPSink”?

After re-reading RFC 4175, I think that it should be possible to implement most of this payload format fairly easily, by reimplementing the “processSpecialHeader()” virtual function in your “MultiFramedRTPSource” subclass. The big problem, however, is going to be the ‘extended sequence number’ field. I see no way to implement the handling of ‘extended sequence numbers’ without making major changes to the code. The best solution for now would probably be to ignore this field - which will mean that receivers will unnecessarily drop some packets if they get excessively delayed.
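
For concreteness, here is a rough sketch of what such a reimplementation might look like. (This is just an illustration, not actual code; in particular, the recording of the parsed values is left as a comment.) It parses the RFC 4175 payload header: a 2-byte ‘extended sequence number’, followed by one or more 6-byte line headers, the last of which is marked by a cleared ‘C’ (continuation) bit:

Boolean RawVideoRTPSource
::processSpecialHeader(BufferedPacket* packet, unsigned& resultSpecialHeaderSize) {
  unsigned char* headerStart = packet->data();
  unsigned packetSize = packet->dataSize();
  if (packetSize < 2) return False;

  // Skip over the 2-byte 'extended sequence number' (ignored, as noted above):
  unsigned pos = 2;

  // Parse the 6-byte line headers; the 'C' (continuation) bit of each one
  // tells us whether another line header follows it:
  Boolean continuation;
  do {
    if (packetSize < pos + 6) return False;
    u_int16_t length  = (headerStart[pos]<<8) | headerStart[pos+1];
    Boolean fieldId   = (headerStart[pos+2]&0x80) != 0;
    u_int16_t lineNum = ((headerStart[pos+2]&0x7F)<<8) | headerStart[pos+3];
    continuation      = (headerStart[pos+4]&0x80) != 0;
    u_int16_t offset  = ((headerStart[pos+4]&0x7F)<<8) | headerStart[pos+5];
    pos += 6;
    // ... record (length, fieldId, lineNum, offset) for use when the payload
    // data gets delivered ...
  } while (continuation);

  resultSpecialHeaderSize = pos;
  return True;
}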


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
GENESTIER Denis
2018-08-03 07:58:41 UTC
Permalink
Hi Ross,

As promised, here is a patch for our first step in supporting raw video streams: the receiver side.
It also contains a fix for the compilation of the mingw config, which was broken (my mistake, from my last submission on socket keep-alive :-D).
I have tested it with only one kind of source, and only in a Linux environment.
Post by Ross Finlayson
Yes, very likely. Do you also plan to implement this for transmitters - i.e., a subclass of “MultiFramedRTPSink”?
I will work on it, together with support for a "V_UNCOMPRESSED" track in a Matroska file. That should allow us to test the whole live555 chain.
But for now it is my turn to go on holiday, so I won't be able to do anything for a while.
Post by Ross Finlayson
After re-reading RFC 4175, I think that it should be possible to implement most of this payload format fairly easily, by reimplementing the “processSpecialHeader()” virtual function in your “MultiFramedRTPSource” subclass. The big problem, however, is going to be the ‘extended sequence number’ field. I see no way to implement the handling of ‘extended sequence numbers’ without making major changes to the code. The best solution for now would probably be to ignore this field - which will mean that receivers will unnecessarily drop some packets if they get excessively delayed.
I totally agree, and as far as I can see, the extended sequence number is not used by our camera.
Also, I did not use any "BufferedPacket" subclass - but maybe I am missing something?

Best regards,
Denis.
Ross Finlayson
2018-08-05 03:39:40 UTC
Permalink
Post by GENESTIER Denis
Also I did not used any "BufferedPacket" subclass, but maybe am I missing something?
Yes, because it is possible (in RFC 4175) for data from more than one line to be contained within a single RTP payload, you need to subclass “BufferedPacket” to allow for this. (Perhaps your ‘raw video’ RTP transmitter only delivers one line of raw video at a time; that’s probably going to be the common case.)
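
In outline, such a subclass might look like the following sketch. (A sketch only; “currentLineLength()” is a hypothetical helper here, standing for however the source remembers the line lengths that were parsed from the payload header.)

class RawVideoBufferedPacket: public BufferedPacket {
public:
  RawVideoBufferedPacket(RawVideoRTPSource* ourSource) : fOurSource(ourSource) {}

protected: // redefined virtual function:
  // Deliver the payload one line at a time, using the line lengths that
  // were recorded when the payload header was parsed:
  virtual unsigned nextEnclosedFrameSize(unsigned char*& framePtr,
                                         unsigned dataSize) {
    unsigned lineLength = fOurSource->currentLineLength(); // hypothetical helper
    return lineLength <= dataSize ? lineLength : dataSize;
  }

private:
  RawVideoRTPSource* fOurSource;
};

// ... along with a trivial factory for creating such packets:
class RawVideoBufferedPacketFactory: public BufferedPacketFactory {
private: // redefined virtual function:
  virtual BufferedPacket* createNewPacket(MultiFramedRTPSource* ourSource) {
    return new RawVideoBufferedPacket((RawVideoRTPSource*)ourSource);
  }
};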

Anyway, I have updated the implementation, and included it in a new release (2018.08.05) of the “LIVE555 Streaming Media” code. Please test it to make sure it works OK with your ‘raw video’ RTP transmitter.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
GENESTIER Denis
2018-08-28 15:44:44 UTC
Permalink
OK, many thanks Ross.
Currently, I am still testing your version, and it seems to give the same results as mine.
Post by Ross Finlayson
Yes, because it is possible (in RFC 4175) for data from more than one line to be contained within a single RTP payload, you need to subclass “BufferedPacket” to allow for this. (Perhaps your ‘raw video’ RTP transmitter only delivers one line of raw video at a time; that’s probably going to be the common case.)
OK, but a quick look at your code shows that your subclass RawVideoBufferedPacketFactory is never used. Moreover, its virtual function createNewPacket() is never implemented, and some functions are just dead code (currentLineNumber(), currentLineFieldId(), currentOffsetWithinLine()).
My "raw video" source is a GStreamer pipeline, and it sends RTP packets that contain more than one line - basically, packets that contain the end of one line and the start of the next. Yet even without the RawVideoBufferedPacket subclass in use, the client gets raw frames of the correct size.

Concerning the development of the RawVideoRTPSink subclass: work in progress ;-D

Regards,
Denis.
Ross Finlayson
2018-08-28 19:41:57 UTC
Permalink
Post by GENESTIER Denis
OK, many thanks Ross.
Currently, I am still testing your version, and it seems to give the same results as mine.
Post by Ross Finlayson
Yes, because it is possible (in RFC 4175) for data from more than one line to be contained within a single RTP payload, you need to subclass “BufferedPacket” to allow for this. (Perhaps your ‘raw video’ RTP transmitter only delivers one line of raw video at a time; that’s probably going to be the common case.)
OK, but a quick look at your code shows that your subclass RawVideoBufferedPacketFactory is never used.
Oops - my mistake. It should have been instantiated in the “RawVideoRTPSource” constructor.
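
For reference, the wiring simply amounts to passing the factory to the “MultiFramedRTPSource” base-class constructor - sketched here with the constructor body and member initialization elided:

RawVideoRTPSource::RawVideoRTPSource(UsageEnvironment& env, Groupsock* RTPgs,
                                     unsigned char rtpPayloadFormat,
                                     unsigned rtpTimestampFrequency)
  : MultiFramedRTPSource(env, RTPgs, rtpPayloadFormat, rtpTimestampFrequency,
                         new RawVideoBufferedPacketFactory) { // <-- the missing piece
}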

I’ve just installed a new version (2018.08.28) of the code that fixes this.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
Ross Finlayson
2018-08-28 20:32:16 UTC
Permalink
Post by Ross Finlayson
Oops - my mistake. It should have been instantiated in the “RawVideoRTPSource” constructor.
I’ve just installed a new version (2018.08.28) of the code that fixes this.
And sure enough - I made a mistake there, so I’ve just installed another new version (2018.08.28a) that should fix it for real this time. Sorry.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
GENESTIER Denis
2018-09-04 14:07:41 UTC
Permalink
Hi Ross, how are you doing?

Here is a patch for managing raw video streams on the server side, based on the latest live555 version.
I created a new sink, "RawVideoRTPSink", and used it in the context of Matroska file streaming with the codec ID V_UNCOMPRESSED (by the way, I also added the codec V_MJPEG, which we also need, and which FFMPEG uses too).
For that, I use the EBML fields pixelWidth, pixelHeight and bitDepth, and I also added parsing of the colourSpace and Primaries fields (cf. https://www.matroska.org/technical/specs/index.html).

My sink class:
- computes an overflow, so that packets have sizes that are a multiple of the pgroup (see the sketch below);
- sets the special header size, and implements doSpecialFrameHandling with a function getNbLineInPacket().
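
To illustrate the first point (a sketch only, not the exact patch code), the overflow is simply whatever part of the available payload room does not fill a whole number of pgroups:

// Bytes of the available payload room that cannot form a whole pgroup;
// they 'overflow' into the next RTP packet.  (For example, with 8-bit
// YCbCr-4:2:2 the pgroup is 4 bytes, covering 2 pixels.)
unsigned computeOverflow(unsigned payloadRoom, unsigned pgroupSize) {
  return payloadRoom % pgroupSize;
}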

I have mainly tested it with an mkv file with the I420 colour space (i.e. "YCbCr-4:2:0" sampling), streamed by live555MediaServer and received by openRTSP. The raw file we get (video-RAW-1) is readable by VLC (after specifying the right demuxer parameters).
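
As a quick sanity check on the received file - my own sketch, assuming 8-bit I420, i.e. a full-resolution Y plane plus two half-width, half-height Cb and Cr planes:

// Expected size in bytes of one raw 8-bit I420 frame:
unsigned i420FrameSize(unsigned width, unsigned height) {
  return width*height                  // Y plane
       + 2 * ((width/2)*(height/2));   // Cb + Cr planes => width*height*3/2
}

The size of the received raw file should then be a whole multiple of this per-frame size.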

NB: I did not implement any handling of the optional parameters (Interlace, Top-field-first, chroma-position or gamma).

Thanks again for your support.

Regards,
Denis.
Ross Finlayson
2018-09-05 06:29:03 UTC
Permalink
Denis,

Many thanks for this. I have now installed a new version (2018.09.05) of the code that includes your patch (with some small, mostly cosmetic, changes).


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
GENESTIER Denis
2018-09-05 08:20:47 UTC
Permalink
Thanks, Ross, for your responsiveness and your code review.
I just noticed that you forgot to remove the "*.orig" files and a "liveMedia/stGDDpcY" file from your tarball delivery.

Regards,
Denis
Ross Finlayson
2018-09-05 11:11:56 UTC
Permalink
Post by GENESTIER Denis
I just noticed that you forgot to remove the "*.orig" files and a "liveMedia/stGDDpcY" file from your tarball delivery.
Oops - my mistake. I’ll remove these from the next release of the software.



Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
GENESTIER Denis
2018-09-10 10:41:05 UTC
Permalink
Hi Ross,

While checking some degraded cases, I found a double deletion of memory in RawVideoRTPSource.cpp, line 122.
Here is a patch to avoid this.


Denis.
Ross Finlayson
2018-09-10 11:11:42 UTC
Permalink
Thanks. I’ve made this fix now.

FYI, the actual patch is simpler than the one that you proposed. It’s just:

diff -c RawVideoRTPSource.cpp~ RawVideoRTPSource.cpp
*** RawVideoRTPSource.cpp~ 2018-08-28 13:28:28.345447000 -0700
--- RawVideoRTPSource.cpp 2018-09-10 04:04:11.162413000 -0700
***************
*** 132,138 ****
// Make sure that we have enough bytes for all of the line lengths promised:
if (totalLength > packetSize) {
fNumLines = 0;
! delete[] fLineHeaders;
return False;
}

--- 132,138 ----
// Make sure that we have enough bytes for all of the line lengths promised:
if (totalLength > packetSize) {
fNumLines = 0;
! delete[] fLineHeaders; fLineHeaders = NULL;
return False;
}


because you don’t need to also set
fLineHeaders = NULL;
in the destructor: “fLineHeaders” is never used again after that point.


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
GENESTIER Denis
2018-09-18 14:36:59 UTC
Permalink
Hi Ross,

I have found a problem in my algorithm for the RawVideoRTPSink (seen when using video at a small resolution, like 32x24).
Here is a patch.

Thanks again for your support.

Denis.
Ross Finlayson
2018-09-18 20:40:47 UTC
Permalink
Post by GENESTIER Denis
Hi Ross,
I have found a problem in my algorithm for the RawVideoRTPSink (seen when using video at a small resolution, like 32x24).
Here is a patch.
Thanks. I have now installed a new version (2018.09.18) of the code that includes this patch.

Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
