Discussion:
[Live-devel] unicast onDemand from live source NAL Units NVidia
Pablo Gomez
2013-01-22 14:57:34 UTC
Permalink
Hi,

I'm trying to implement unicast on-demand streaming from a live source.
The live source comes from the Nvidia encoder NVEnc: http://docs.nvidia.com/cuda/samples/3_Imaging/cudaEncode/doc/nvcuvenc.pdf
This encoder produces NAL units ready to be sent over the network.

I have a lot of problems with artifacts. I have tried two ways of sending the packets:

- A stream ring buffer: with this approach I send as many bytes as possible, i.e. min(availableBytesInBuffer, fMaxSize) bytes.

- A queue where I hold the NAL packets and their sizes: with this approach I try to send the entire NAL, i.e. min(nal.size, fMaxSize) bytes.

With the first approach, the log shows many errors about incorrect SPS/PPS, and when I display the stream in a video player I see artifacts and flickering.

With the second approach I also see artifacts and flickering, and the main problem is that many NAL units are bigger than fMaxSize, so those frames get truncated. I do not know how to ensure that fMaxSize is always bigger than fFrameSize.
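To make the truncation problem concrete, here is a small self-contained model (not live555 itself; the names mirror live555's FramedSource members fTo, fMaxSize, fFrameSize, and fNumTruncatedBytes) of what the "send min(nal.size, fMaxSize) bytes" step does. Whenever the NAL is larger than the buffer capacity, the remainder is unrecoverably dropped, which is exactly what produces broken frames:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstring>

// What live555 reports back after a delivery:
// bytesCopied plays the role of fFrameSize,
// truncatedBytes the role of fNumTruncatedBytes.
struct DeliveryResult {
    std::size_t bytesCopied;
    std::size_t truncatedBytes;
};

// Copy one NAL unit into the destination buffer (fTo) of capacity
// fMaxSize. If the NAL does not fit, the tail is simply lost --
// the receiver then sees a corrupt frame.
DeliveryResult deliverNal(const unsigned char* nal, std::size_t nalSize,
                          unsigned char* fTo, std::size_t fMaxSize) {
    DeliveryResult r;
    r.bytesCopied    = std::min(nalSize, fMaxSize);
    r.truncatedBytes = nalSize - r.bytesCopied;
    std::memcpy(fTo, nal, r.bytesCopied);
    return r;
}
```

The point of Ross's answer below is that OutPacketBuffer::maxSize controls the capacity that ultimately bounds fMaxSize, so raising it (before the sink is created) makes truncatedBytes zero for every NAL you actually produce.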

I tried to set aside enough space in the OutPacketBuffer in the streamer code, but this does not seem to work:
{
  OutPacketBuffer::maxSize = 10000000;
  char const* streamName = "testStream";
  ServerMediaSession* sms
    = ServerMediaSession::createNew(*env, streamName, streamName,
                                    descriptionString);
  sms->addSubsession(H264LiveServerMediaSubsession
                     ::createNew(*env, h264LiveBuffer, reuseFirstSource));
  rtspServer->addServerMediaSession(sms);

  announceStream(rtspServer, sms, streamName, inputFileName);
}

At this point I wonder:

Is it possible to stream the NAL units coming from the NVidia encoder with live555?
If so, any suggestions on how I can solve this? I'm literally stuck at this point.


Thank you
Best
Pablo
Ross Finlayson
2013-01-22 18:46:08 UTC
Permalink
First, I assume that you are feeding your input source object (i.e., the object that delivers H.264 NAL units) into a "H264VideoStreamDiscreteFramer" object (and from there into a "H264VideoRTPSink").
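For reference, that wiring is typically done by overriding two virtuals in an "OnDemandServerMediaSubsession" subclass. A minimal sketch (the live555 class names and virtual-function signatures are real; "H264LiveServerMediaSubsession", "LiveNalSource", "fLiveBuffer", and the bitrate estimate are hypothetical placeholders for your own code):

```cpp
FramedSource* H264LiveServerMediaSubsession::createNewStreamSource(
        unsigned /*clientSessionId*/, unsigned& estBitrate) {
    estBitrate = 500; // kbps; an assumed estimate for your stream

    // "LiveNalSource" stands in for whatever FramedSource subclass
    // delivers the NVEnc NAL units (one NAL per delivery).
    FramedSource* liveSource = LiveNalSource::createNew(envir(), fLiveBuffer);

    // The discrete framer expects NAL units WITHOUT a leading
    // 0x00000001 start code.
    return H264VideoStreamDiscreteFramer::createNew(envir(), liveSource);
}

RTPSink* H264LiveServerMediaSubsession::createNewRTPSink(
        Groupsock* rtpGroupsock, unsigned char rtpPayloadTypeIfDynamic,
        FramedSource* /*inputSource*/) {
    return H264VideoRTPSink::createNew(envir(), rtpGroupsock,
                                       rtpPayloadTypeIfDynamic);
}
```

Note that feeding raw NVEnc output that still contains Annex-B start codes into the discrete framer is a common cause of the SPS/PPS errors described above.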
Post by Pablo Gomez
I tried to set aside enough space in the OutPacketBuffer in the streamer code, but this does not seem to work:
{
OutPacketBuffer::maxSize=10000000;
Setting "OutPacketBuffer::maxSize" to some value larger than the largest expected NAL unit is correct - and should work. However, setting this value to 10 million is insane. You can't possibly expect to be generating NAL units this large, can you??

If possible, you should configure your encoder to generate a sequence of NAL unit 'slices', rather than single large key-frame NAL units. Streaming very large NAL units is a bad idea, because - although our code will fragment them correctly when they get packed into RTP packets - the loss of just one of these fragments will cause the whole NAL unit to get discarded by receivers.

Nonetheless, if you set "OutPacketBuffer::maxSize" to a value larger than the largest expected NAL unit, then this should work (i.e., you should find that "fMaxSize" will always be large enough for you to copy a whole NAL unit).


Ross Finlayson
Live Networks, Inc.
http://www.live555.com/
