
Re[2]: [Gnash-dev] video in gnash

From: Tomas Groth
Subject: Re[2]: [Gnash-dev] video in gnash
Date: Sun, 26 Nov 2006 00:16:04 +0100 (CET)

--- Udo Giacomozzi <address@hidden> wrote:

> Hello Bastiaan,
> Thursday, November 23, 2006, 4:03:15 PM, you wrote:
> BJ> On Tue, 2006-11-21 at 23:24 +0100, Udo Giacomozzi wrote:
> >> Just pass a RGB(A) buffer to the renderer sounds like a simple
> >> solution to me.
> BJ> So how does one currently pass an RGB(A) buffer to the renderer?
> I have to correct myself. "pass a RGB buffer to the renderer" is not
> quite correct. I suggest this:
>   render_handler::draw_YUV_frame(YUV_video *v, const rect* bounds);
> which, by default (render_handler.c), could convert the data to RGB
> and call: 
>   render_handler::draw_RGB_frame(image::rgb* im, const rect* bounds);
> A render handler capable of hardware-accelerated YUV playback (OpenGL)
> could override the draw_YUV_frame() method with a direct
> implementation. Other render handlers (AGG, Cairo) then just have to
> deal with a simple RGB buffer.

This sounds like a good and clean way to do it. I've looked at the AGG
renderer and have to admit I can't get my head around it, so could you have a
look at how to do that?
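For what it's worth, the dispatch Udo describes could be sketched roughly
like this. The frame/rect types below are placeholders, not Gnash's actual
YUV_video / image::rgb / rect classes; only the two method names come from
the proposal:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Placeholder types standing in for Gnash's rect, YUV_video and image::rgb.
struct rect { int x0, y0, x1, y1; };
struct yuv_frame { std::vector<uint8_t> planes; int width = 0, height = 0; };
struct rgb_frame { std::vector<uint8_t> data;   int width = 0, height = 0; };

class render_handler {
public:
    virtual ~render_handler() = default;

    // Default path: convert the YUV data to RGB (the actual conversion is
    // elided here) and hand the result to the renderer's RGB entry point.
    virtual void draw_YUV_frame(const yuv_frame& v, const rect& bounds) {
        rgb_frame im;
        im.width  = v.width;
        im.height = v.height;
        im.data.resize(v.width * v.height * 3);  // real code converts here
        draw_RGB_frame(im, bounds);
    }

    // Simple renderers (AGG, Cairo) only need to implement this.
    virtual void draw_RGB_frame(const rgb_frame& im, const rect& bounds) = 0;
};

// AGG/Cairo-style renderer: handles RGB only, inherits the default YUV path.
class soft_renderer : public render_handler {
public:
    int rgb_calls = 0;
    void draw_RGB_frame(const rgb_frame&, const rect&) override { ++rgb_calls; }
};

// OpenGL-style renderer: overrides the YUV path for direct accelerated use.
class gl_renderer : public render_handler {
public:
    int yuv_calls = 0, rgb_calls = 0;
    void draw_YUV_frame(const yuv_frame&, const rect&) override { ++yuv_calls; }
    void draw_RGB_frame(const rgb_frame&, const rect&) override { ++rgb_calls; }
};
```

A software renderer then gets the conversion path for free, while an OpenGL
renderer overrides draw_YUV_frame() and never touches the RGB fallback.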

> Instead of image::rgb we could also use a plain byte buffer
> (unsigned char*). Don't know which would be better.
> The YUV_video class is not very clear to me, I suppose it can give the
> data for the current frame.

Maybe we should extend the image class to support YUV and drop the
YUV_video class completely.
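If we went that way, a planar YUV type inside the image hierarchy might look
something like this. The names and the BT.601-style conversion coefficients
are just a sketch, not actual Gnash code:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

namespace image {

struct rgb {
    int width, height;
    std::vector<uint8_t> data;  // packed R,G,B per pixel
    rgb(int w, int h) : width(w), height(h), data(w * h * 3) {}
};

// Hypothetical 4:2:0 planar YUV image with a BT.601 conversion to RGB,
// replacing the separate YUV_video class.
struct yuv {
    int width, height;
    std::vector<uint8_t> y_plane, u_plane, v_plane;

    yuv(int w, int h)
        : width(w), height(h),
          y_plane(w * h),
          u_plane((w / 2) * (h / 2)),
          v_plane((w / 2) * (h / 2)) {}

    rgb to_rgb() const {
        rgb out(width, height);
        auto clamp = [](int x) {
            return static_cast<uint8_t>(x < 0 ? 0 : x > 255 ? 255 : x);
        };
        for (int j = 0; j < height; ++j) {
            for (int i = 0; i < width; ++i) {
                // One chroma sample covers a 2x2 block of luma samples.
                int Y = y_plane[j * width + i];
                int U = u_plane[(j / 2) * (width / 2) + i / 2] - 128;
                int V = v_plane[(j / 2) * (width / 2) + i / 2] - 128;
                uint8_t* p = &out.data[(j * width + i) * 3];
                p[0] = clamp(static_cast<int>(Y + 1.402 * V));             // R
                p[1] = clamp(static_cast<int>(Y - 0.344 * U - 0.714 * V)); // G
                p[2] = clamp(static_cast<int>(Y + 1.772 * U));             // B
            }
        }
        return out;
    }
};

}  // namespace image
```

The default render_handler path above could then just call to_rgb() on the
frame before handing it to the RGB drawing method.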


