Posted: Tue May 16, 2006 11:55 am Post subject: [asterisk-video] How do you send large frames?
(retransmitting this for the mail archives)
Friends,
I am still a student in the video area... One question I have is: how
do you send large video frames over UDP/RTP?
UDP fragmentation? Or do you use the multiple RTP frame function,
where you use the same sequence number?
Or any other trick?
I got an error report from a friend that we somehow cancel or cut off
large frame updates.
Also, the format drivers for video seem to make an invalid
assumption about the size of two frames.
Posted: Tue May 16, 2006 12:24 pm Post subject: [asterisk-video] How do you send large frames?
Hi,
Video frames are typically split into smaller chunks by fragmenting
the video data itself. For example, H.263 has two packetization modes (RFC
2190 and RFC 2429); these RFCs describe how to split one frame
into multiple UDP packets. H.264 is a little nicer, since it was designed
with networks in mind -- each H.264 "chunk" is a NAL (network
abstraction layer) unit, so encoders can output small NALs directly and keep
each packet under the MTU.
Duane
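As a rough sketch of the fragmentation described above (the function name and the 1400-byte payload limit are illustrative assumptions, not the actual RFC 2190/2429 packetization rules), splitting one encoded frame into MTU-sized payloads looks like this:

```c
#include <stddef.h>

/* Conservative payload size: stays under a typical 1500-byte Ethernet
 * MTU once the IP, UDP and RTP headers are added. */
#define MAX_PAYLOAD 1400

/* Compute the payload sizes for one encoded frame split into
 * MTU-sized fragments.  Fills sizes[] (which needs room for
 * frame_len / MAX_PAYLOAD + 1 entries) and returns the fragment
 * count.  A real sender would set the RTP marker bit on the last
 * fragment of each frame. */
size_t fragment_sizes(size_t frame_len, size_t *sizes)
{
    size_t off = 0, n = 0;

    while (off < frame_len) {
        size_t chunk = frame_len - off;
        if (chunk > MAX_PAYLOAD)
            chunk = MAX_PAYLOAD;
        sizes[n++] = chunk;
        off += chunk;
    }
    return n;
}
```

A 3000-byte frame would come out as three packets of 1400, 1400 and 200 bytes, each small enough to avoid IP-level fragmentation.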
On 5/16/06, Olle E Johansson <oej@edvina.net> wrote:
Quote:
I am still a student in the video area... One question I have is how
you send large video frames over UDP/RTP?
[snip]
_______________________________________________
--Bandwidth and Colocation provided by Easynews.com --
Posted: Wed May 17, 2006 11:02 am Post subject: [asterisk-video] How do you send large frames?
(retransmit of mail sent to "simple" mailing list with no reply. Wish
me luck this time :-)
-------
Friends,
[snip]
Posted: Wed May 17, 2006 12:48 pm Post subject: [asterisk-video] How do you send large frames?
Skipped content of type multipart/alternative
Posted: Wed May 17, 2006 9:04 pm Post subject: [asterisk-video] How do you send large frames?
On 17 May 2006, at 23.48, John Martin wrote:
Quote:
Hi Olle,
Maybe you missed Duane's post as the mailing lists switched around?
Yes, I did. Sorry.
Just to check that I understand this: The codec divides the frame
into several RTP frames itself, so we have individual RTP frames?
Then we really need to fix the formats.
From format_h264.c:
#define BUF_SIZE 4096 /* Two Real h264 Frames */
------
static struct ast_frame *h264_read(struct ast_filestream *s, int *whennext)
{
	int res;
	int mark = 0;
	unsigned short len;
	unsigned int ts;
	struct h264_desc *fs = (struct h264_desc *)s->private;

	/* Send a frame from the file to the appropriate channel */
	if ((res = fread(&len, 1, sizeof(len), s->f)) < 1)
		return NULL;
	len = ntohs(len);
	mark = (len & 0x8000) ? 1 : 0;
	len &= 0x7fff;
	if (len > BUF_SIZE) {
		ast_log(LOG_WARNING, "Length %d is too long\n", len);
		len = BUF_SIZE; /* XXX truncate */
	}
--------
Seems like we truncate frames bigger than 4096 bytes when reading from
file. I don't know where that assumption came from, but the same
BUF_SIZE seems to exist in format_h263 as well.
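For reference, the write side of this format has to pack the RTP mark bit into the top bit of the 16-bit big-endian length word, which is why len is masked with 0x7fff on read. A minimal sketch of both directions (pack_len and unpack_len are hypothetical helpers, not Asterisk functions):

```c
#include <arpa/inet.h>

/* Hypothetical helper: pack a payload length and the RTP mark flag
 * into the 16-bit big-endian word that h264_read() parses.  The top
 * bit carries the mark, so payloads are limited to 0x7fff bytes. */
unsigned short pack_len(unsigned short len, int mark)
{
    return htons((unsigned short)((len & 0x7fff) | (mark ? 0x8000 : 0)));
}

/* The matching unpack, mirroring the h264_read() excerpt above. */
unsigned short unpack_len(unsigned short wire, int *mark)
{
    unsigned short len = ntohs(wire);

    *mark = (len & 0x8000) ? 1 : 0;
    return len & 0x7fff;
}
```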
Posted: Wed May 17, 2006 11:15 pm Post subject: [asterisk-video] How do you send large frames?
Olle,
Yes, the codec chops the video frames up into UDP frames.
So the code above has len &= 0x7fff in it. That's 32 kBytes... that's
a big video packet. I'd have to look at some more of the code to see if
the file read/write is trying to store whole video frames at a time, or
just the UDP frames. What I do remember is that Asterisk uses the
timestamps of the frame to determine when it should be read from disk.
Because the UDP frames within a video frame all have the same timestamp,
they get read from disk almost immediately and put out on the wire at
the same time. At high bitrates this can lead to overflowing network
equipment and packet reversal (sorry to bring that up again).
Also, it's technically possible for video frames (not UDP frames) to be
over that 32k size, but encoders would normally be chopping these up
into 1400-byte packets to fit over the internet. I've never got
involved in networks that would allow a big MTU to be used. I suppose
a codec could be aware of that and try to use really big UDP frames, but
I'm not sure Asterisk should be supporting this.
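One way to smooth the burst described above (a sketch of the idea only, not anything Asterisk does today) is to spread a frame's same-timestamp packets across the frame interval instead of sending them back to back:

```c
/* Spread the N same-timestamp packets of one video frame across the
 * frame interval instead of sending them in one burst.  Returns the
 * delay in microseconds to insert between consecutive packets. */
unsigned int pacing_delay_us(unsigned int n_packets, unsigned int frame_interval_us)
{
    if (n_packets <= 1)
        return 0;
    return frame_interval_us / n_packets;
}
```

At 30 fps (a 33333 microsecond frame interval), a 10-packet frame would go out one packet every 3333 microseconds rather than all at once.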
Posted: Wed May 17, 2006 11:22 pm Post subject: [asterisk-video] How do you send large frames?
Remember that format drivers are used to read both the things we
record with record() and voicemail() and external files - so we can't
have an Asterisk-only format for video files :-)
Posted: Wed May 17, 2006 11:54 pm Post subject: [asterisk-video] How do you send large frames?
I am learning new things about the video implementation all the time.
Seems like we are actually saving RTP headers in the format files,
which is bad. As far as I know,
IAX2 does not use RTP so I wonder if playing one of the saved files
to an IAX videophone would work?
I am not aware of any IAX2 video phones, but not using a clean format
disturbs me.
Posted: Thu May 18, 2006 12:02 am Post subject: [asterisk-video] How do you send large frames?
Hi, sorry - I think I confused the discussion on #asterisk-dev.
We actually strip the RTP header - it's the H.263 RFC 2190 header that is
stored, together with the length (with the RTP mark bit) and the timestamp.
It's not a standard format that I know of, which is why we need the
GStreamer modules to convert other formats into this 'asterisk format'.
Neil
Olle E Johansson wrote:
Quote:
I am learning new things about the video implementation all the time.
[snip]
Posted: Thu May 18, 2006 1:06 am Post subject: [asterisk-video] How do you send large frames?
Quote:
We actually strip the rtp header - it's the h263 RFC2190 header that
is stored, together with the length (with rtp mark bit) and timestamp.
[snip]
Hi everyone, I've been following the thread, but I still don't really understand the problem.
Can anyone clear up the questions? Where is the problem? Is it in sending video stored in a file?
Saving it to a file? Or just retransmitting it?
By the way, I think it would be a good idea to use a standard container for storing the video.
I have been working with MP4 for a while and I think it would be perfect for what we need. In fact, I developed an application that converted a dump of RTP H.263 RFC 2190 data into a perfectly valid MP4 file that could be played with VLC or QuickTime and streamed correctly (with the same RFC 2190 payload) by a Darwin Streaming Server.
Greetings
Sergio
--------------------------------------------------------------------------------------
This message and any files transmitted with it are confidential and intended solely
for the use of the individual or entity to whom they are addressed. No confidentiality
or privilege is waived or lost by any wrong transmission.
If you have received this message in error, please immediately destroy it and kindly
notify the sender by reply email.
You must not, directly or indirectly, use, disclose, distribute, print, or copy any
part of this message if you are not the intended recipient. Opinions, conclusions and
other information in this message that do not relate to the official business of
Ydilo Advanced Voice Solutions, S.A. shall be understood as neither given nor endorsed by it.
--------------------------------------------------------------------------------------
Posted: Thu May 18, 2006 4:06 am Post subject: [asterisk-video] How do you send large frames?
On May 18, 2006, at 6:06 AM, Sergio García Murillo wrote:
Quote:
By the way, I think that it would be a good idea to use some standard
container for storing the video. I have been working with mp4 for a
while and I think it would be perfect for what we need.
[snip]
A couple of comments:
1) As far as I know, the only IAX2 video phone implementation is
the fork that tipic made of iaxclient; the official iaxclient
distribution should also support video at some point later this year
(based in part on tipic's work). So I think the IAX2 video
formats are somewhat flexible at this point, and no specification
documents really discuss this.
Things like marker bits for beginning-of-frame, keyframes, etc. are
certainly not specified.
2) File formats: MP4 (and MOV, on which it's based) are fine
containers, but one thing I've been concerned about with saving
streams has always been how to represent lost packets in these
containers. It's important to be able to represent that for two
reasons: (a) you want to keep audio and video in sync,
and (b) you may want to accurately represent a recording
with some packet loss by playing it back in the proper amount of time.
For (b), consider recording a one-hour session with random 10%
packet loss. User agents in the live session will play this fine,
and users may not even notice the 10% loss (in the audio
portion; the video might be a mess). But if you don't somehow
represent these lost packets in the recorded file, the
whole thing will play back in 54 minutes instead of 60.
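The arithmetic behind that example, sketched out (the function names are just for illustration): dropping lost packets with no placeholder shortens playback in proportion to the loss, while a duration recovered from RTP timestamps is unaffected, since timestamps keep advancing across gaps.

```c
/* Playback length if lost packets are simply dropped from the file:
 * duration shrinks in proportion to the loss. */
double naive_playback_minutes(double session_minutes, double loss_fraction)
{
    return session_minutes * (1.0 - loss_fraction);
}

/* Duration recovered from RTP timestamps instead: packet loss does
 * not change it, because timestamps keep advancing across gaps. */
double timestamp_playback_sec(unsigned int first_ts, unsigned int last_ts,
                              unsigned int clock_rate)
{
    return (double)(last_ts - first_ts) / clock_rate;
}
```

A one-hour session with 10% loss comes out as 54 minutes the naive way, but still 3600 seconds when measured by a 90 kHz timestamp span.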
Posted: Thu May 18, 2006 4:19 am Post subject: [asterisk-video] How do you send large frames?
Quote:
2) File formats: MP4 (or MOV, on which it's based) are fine
containers, but one thing I've been concerned about with saving
streams has always been how to represent lost packets in these
containers.
[snip]
I've been working with MP4 files through the mpeg4ip libraries, which let me specify the timing of each frame in the video and audio tracks; you also have the hint tracks with the RTP timestamp and sequence number for more info. So I think there would be no problem with lost packets, for either playing or streaming the file.
Greetings
Sergio