
[Live-devel] testRTSPClient / H.264 Network Camera Stream

From: Ross Finlayson 
Sent: Friday, April 05, 2013 7:10 PM
To: LIVE555 Streaming Media - development & use 
Subject: Re: [Live-devel] testRTSPClient / H.264 Network Camera Stream

      I am experimenting with the testRTSPClient app to develop a DirectShow source filter to connect to a network camera outputting an H.264
  video stream. The app successfully connects to the camera and shows the following continuous output:
  ... 
  Stream "rtsp://192.168.1.7:65534/videoSub/"; audio/PCMU:        Received 960 bytes.     Presentation time: 1365102232.117119
  Stream "rtsp://192.168.1.7:65534/videoSub/"; video/H264:        Received 411 bytes.     Presentation time: 1365102232.186298
  Stream "rtsp://192.168.1.7:65534/videoSub/"; audio/PCMU:        Received 960 bytes.     Presentation time: 1365102232.226244
  Stream "rtsp://192.168.1.7:65534/videoSub/"; video/H264:        Received 458 bytes.     Presentation time: 1365102232.375975
  Stream "rtsp://192.168.1.7:65534/videoSub/"; audio/PCMU:        Received 960 bytes.     Presentation time: 1365102232.359869
  Stream "rtsp://192.168.1.7:65534/videoSub/"; audio/PCMU:        Received 960 bytes.     Presentation time: 1365102232.465869
  Stream "rtsp://192.168.1.7:65534/videoSub/"; audio/PCMU:        Received 960 bytes.     Presentation time: 1365102232.585744
  Stream "rtsp://192.168.1.7:65534/videoSub/"; video/H264:        Received 950 bytes.     Presentation time: 1365102232.666086
  ...

Initially, I am only interested in the video portion of the stream.
Using the library, what is the most efficient way to get access to the H.264 portion of the stream?

>>Because this is a Frequently Asked Question, I have now added an entry for it to the FAQ.  See: 
>>http://www.live555.com/liveMedia/faq.html#testRTSPClient-how-to-decode-data

Re-reading your explanation above, in my ContinueAfterSETUP method,  should I do the following:

unsigned n_records = 0;
const char* sps = scs.subsession->fmtp_spropparametersets();  // the "sprop-parameter-sets" string from the SDP
SPropRecord* pSPropRecord = parseSPropParameterSets(sps, n_records);

Now what do I send to my downstream decoder ??

Thanks,

Tom Fisher
--------------------------------------------------------------------------------
Do something like:

for (unsigned i = 0; i < n_records; ++i) {
	// pass the NAL unit pointed to by "pSPropRecord[i].sPropBytes"
	// (of length "pSPropRecord[i].sPropLength") to your decoder
}
delete[] pSPropRecord;
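
Spelled out a little more concretely (an illustrative sketch only, not from the original reply; "deliverNalToDecoder()" is a hypothetical stand-in for whatever input call your decoder actually exposes):

#include "liveMedia.hh" // MediaSubsession, SPropRecord, parseSPropParameterSets()

// Hypothetical stand-in for your decoder's input call (illustration only):
static void deliverNalToDecoder(unsigned char const* nal, unsigned nalSize) {
  // ... hand "nal"/"nalSize" to your H.264 decoder here ...
}

// Feed the out-of-band SPS/PPS NAL units (advertised in the SDP) to the
// decoder once, before any frames received from the RTP stream.
static void feedSPropNALUnits(MediaSubsession& subsession) {
  unsigned numRecords = 0;
  SPropRecord* records =
      parseSPropParameterSets(subsession.fmtp_spropparametersets(), numRecords);
  for (unsigned i = 0; i < numRecords; ++i) {
    // Each record is one binary NAL unit (typically the SPS, then the PPS).
    // If your decoder wants Annex B input, prepend 0x00 0x00 0x00 0x01 first.
    deliverNalToDecoder(records[i].sPropBytes, records[i].sPropLength);
  }
  delete[] records; // parseSPropParameterSets() allocates the array with new[]
}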
--------------------------------------------------------------------------------
After you parse the NAL records from parseSPropParameterSets(), store them.

For each sample that comes through, construct the buffer you send to your downstream decoder like this, where start_code is the 4-byte sequence 0x00 0x00 0x00 0x01:

[start_code][NAL_record_1][start_code][NAL_record_2][start_code][sample_data]

There may be a more correct way, but this has worked for me. Also, H.264 decoders differ in how they expect to receive NAL units, so you may want to check your decoder's documentation.
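
A minimal sketch of that buffer assembly (illustration only, not Jer's code; it assumes the SPropRecord array returned by parseSPropParameterSets() has been cached, and buildAnnexBSample() is just a name chosen here):

#include <vector>
#include "H264VideoRTPSource.hh" // SPropRecord

// Build one Annex B buffer laid out exactly as described above:
// [start_code][NAL_record_1]...[start_code][sample_data].
// "records"/"numRecords" are the cached output of parseSPropParameterSets();
// "sample"/"sampleSize" is one frame as delivered by the RTP source.
static std::vector<unsigned char> buildAnnexBSample(SPropRecord const* records, unsigned numRecords,
                                                    unsigned char const* sample, unsigned sampleSize) {
  static unsigned char const startCode[4] = { 0x00, 0x00, 0x00, 0x01 };
  std::vector<unsigned char> out;
  for (unsigned i = 0; i < numRecords; ++i) {
    out.insert(out.end(), startCode, startCode + 4);
    out.insert(out.end(), records[i].sPropBytes, records[i].sPropBytes + records[i].sPropLength);
  }
  out.insert(out.end(), startCode, startCode + 4);
  out.insert(out.end(), sample, sample + sampleSize);
  return out; // hand out.data()/out.size() to the downstream decoder
}

Prepending the parameter sets to every sample is redundant once the decoder is initialized; as the next reply notes, once before the first I-frame is usually enough.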

-Jer
--------------------------------------------------------------------------------
I decode H.264 from a network camera.

My decoder wants SPS/PPS delivered in-band once, before the first I-frame.
So I wait until I get the first I-frame and then insert 0x00000001/SPS/0x00000001/PPS in front of it.
Interestingly, one of the cameras I test with sends SPS/PPS inside the RTP stream itself; I drop those SPS/PPS NAL units.

But you are right - it's decoder dependent.
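
For illustration, one way to implement that policy (a sketch only, not Markus's actual code; the NAL unit type is the low 5 bits of the first byte of each NAL unit the subsession delivers, with 5 = IDR slice, 7 = SPS, 8 = PPS, and the two emit* helpers are hypothetical):

// Hypothetical helpers (illustration only):
void emitSpsPpsOnce();                                 // writes start_code/SPS/start_code/PPS, first call only
void emitNal(unsigned char const* nal, unsigned size); // passes one NAL unit on to the decoder

// Decide what to do with one NAL unit received from the video subsession:
void handleIncomingNal(unsigned char const* nal, unsigned nalSize) {
  if (nalSize == 0) return;
  unsigned nalUnitType = nal[0] & 0x1F; // low 5 bits of the NAL header byte
  if (nalUnitType == 7 || nalUnitType == 8) {
    return;           // drop in-band SPS/PPS; the out-of-band copies are used instead
  }
  if (nalUnitType == 5) {
    emitSpsPpsOnce(); // ensure SPS/PPS precede the first I-frame
  }
  emitNal(nal, nalSize);
}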

Markus.
--------------------------------------------------------------------------------
I have solved this in my code by caching the parameter sets and inserting them into the stream (using a filter).

The decoder always needs them and, as stated, some encoders do not send them in-band (Axis by default, Pelco), some do (GVI, Samsung), and for some it is a setting (Axis).
--------------------------------------------------------------------------------
Reading further on your site, I see that you may need the SDP description
from my camera stream. BTW, I am getting some very jerky but clear video
frames just by passing the NAL units directly to the Microsoft DTV-DVD Video Decoder filter in DirectShow.
I am soooooo close!

v=0
o=- 1365522147122142 1 IN IP4 192.168.1.2
s=IP Camera Video
i=videoMain
t=0 0
a=tool:LIVE555 Streaming Media v2013.01.25
a=type:broadcast
a=control:*
a=range:npt=0-
a=x-qt-text-nam:IP Camera Video
a=x-qt-text-inf:videoMain
m=video 0 RTP/AVP 96
c=IN IP4 0.0.0.0
b=AS:96
a=rtpmap:96 H264/90000
a=control:track1
m=audio 0 RTP/AVP 0
c=IN IP4 0.0.0.0
b=AS:64
a=control:track2

Thanks again for any help.

Tom Fisher


From: tboonefisher at clear.net 
Sent: Tuesday, April 09, 2013 2:41 PM
To: LIVE555 Streaming Media - development & use 
Subject: Re: [Live-devel] testRTSPClient / H.264 Network Camera Stream

THANKS for the help. Reading the FAQ I see:

“ If you are receiving H.264 video data, there is one more thing that you have to do before you start feeding frames to your decoder. H.264 streams have out-of-band configuration information ("SPS" and "PPS" NAL units) that you may need to feed to the decoder to initialize it. To get this information, call "MediaSubsession::fmtp_spropparametersets()" (on the video 'subsession' object). This will give you a (ASCII) character string. You can then pass this to "parseSPropParameterSets()" (defined in the file "include/H264VideoRTPSource.hh"), to generate binary NAL units for your decoder.”

Could you be more specific as to when/where this needs to be done?
Using the DummySink example, when I attempt to call these methods as shown below I get nada...


DummySink::DummySink(UsageEnvironment& env, MediaSubsession& subsession, char const* streamId)
  : MediaSink(env),
    fSubsession(subsession)
{
    fStreamId = strDup(streamId);
    fReceiveBuffer = new u_int8_t[DUMMY_SINK_RECEIVE_BUFFER_SIZE];

    const char* sps = fSubsession.fmtp_spropparametersets();    // returns NULL
    unsigned n_records = 0;
    SPropRecord* pSPropRecord = parseSPropParameterSets(sps, n_records);
}

Thanks again for ANY help...

Tom Fisher
---------------------------------------------------------------------------------
OK, the bug here is in your server - i.e., the camera; not the client.  The camera (server) isn't including a "sprop-parameter-sets" parameter (in an "a=fmtp:96..." line) in the SDP description.

Fortunately, however, I see that the camera (server) is also implemented using our software, so it should be easy to fix.  Please tell whoever developed your camera to:
	(1) Upgrade to the latest version of the "LIVE555 Streaming Media" software, and
	(2) Either insert "SPS" and "PPS" NAL units at the start of the H.264 video stream (that it feeds into the "H264VideoRTPSink" object), or else change the way that the "H264VideoRTPSink" object is created - to use one of the forms of the "H264VideoRTPSink::createNew()" function that take "SPS"/"PPS" NAL unit information as a parameter (either as raw binary NAL units, or a "sPropParameterSetsStr" string).
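
For reference, a rough sketch of option (2) (illustration only, not from the original message; the exact "createNew()" overloads and parameter order should be checked against "liveMedia/include/H264VideoRTPSink.hh" in the version being used — the form below, taking raw SPS/PPS NAL units, is assumed):

#include "liveMedia.hh"

// Create the video RTPSink with the SPS/PPS parameter sets, so that the server
// puts "sprop-parameter-sets" into the SDP it returns to clients.
// "env", "rtpGroupsock", "rtpPayloadFormat" and the sps/pps buffers are
// whatever the server code already has available at this point.
static RTPSink* createVideoSinkWithSProps(UsageEnvironment& env, Groupsock* rtpGroupsock,
                                          unsigned char rtpPayloadFormat,
                                          u_int8_t const* sps, unsigned spsSize,
                                          u_int8_t const* pps, unsigned ppsSize) {
  return H264VideoRTPSink::createNew(env, rtpGroupsock, rtpPayloadFormat,
                                     sps, spsSize, pps, ppsSize);
  // Alternatively, another createNew() form takes the Base64
  // "sprop-parameter-sets" string (as it would appear in the "a=fmtp:" line).
}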
Ross Finlayson, Live Networks, Inc. http://www.live555.com/
---------------------------------------------------------------------------------
It must also be VERY frustrating to see these companies making $$$$$$$ from your very nice work !!!

Tom Fisher

-----Original Message----- 
From: tboonefisher at clear.net
Sent: Tuesday, April 09, 2013 6:05 PM
To: LIVE555 Streaming Media - development & use
Subject: Re: [Live-devel] testRTSPClient / H.264 Network Camera Stream

    This is BAD news as the camera is:

http://foscam.us/products/foscam-fi9821w-megapixel-wireless-ip-camera.html

    I am just an end-user and thus probably have little influence on them. I
will send them the info that you suggest, but I will be very surprised if they
respond. They have their own ‘plug-in’ that works pretty well in IE and other apps.

    Are there any other possible work-arounds? Is it possible that they are using
MEDIASUBTYPE_AVC1 ('AVC1', i.e. H.264 without start codes)? I did try this
(in GetMediaType()), but I have no decoders that will connect to that... ughhhhhh.

Thanks very much for your help...

Tom Fisher
Dallas
---------------------------------------------------------------------------------
>   It must also be VERY frustrating to see these companies making $$$$$$$
> from your very nice work !!!

On the contrary - I am *happy* to see many companies making successful use of this software.  That's why I made it available.

>   This is BAD news as the camera is:
> http://foscam.us/products/foscam-fi9821w-megapixel-wireless-ip-camera.html

What does frustrate me, however, is that so many of the companies that have successfully used this software in their products have avoided making use of the support offered by this mailing list.  In particular, nobody from "Foscam" appears to be on this mailing list.

>   I am just an end-user and thus probably have little influence on them. I
> will send them
> the info that you suggest but I will be very surprised IF they respond.

Please also check whether you have the latest firmware for your camera.  Firmware upgrades for Foscam cameras can apparently be downloaded from: http://foscam.us/firmware

>   Are there any other possible work-arounds?

No.  The problem is with the camera.

Ross Finlayson
---------------------------------------------------------------------------------
I've also worked with cameras that don't send sprop-parameter-sets in
the SDP file (but not using the Live555 client); this shouldn't be a
problem as long as the camera does send SPS and PPS in band. The fact
that you are able to decode and display the raw NALUs using DirectShow
would indicate that it is (decoding would fail otherwise), so perhaps
your problem is elsewhere?

Colin Caughie
---------------------------------------------------------------------------------
Thanks Colin. The fact that I do get some very jerky but clear video rendered does make me suspicious that I’m not implementing my DS source filter correctly. In particular, I’m looking closely at my GetMediaType() method, among others. As I said,
my skills are weak/intermediate in this stuff, so any tips from the “Pros” are greatly appreciated.

Question... is it OK for me to post some of my DirectShow code here so that you might be able to see if/where I’m going wrong?

Tom Fisher
---------------------------------------------------------------------------------
