Infognition forum
Author Topic: Trying to save an MPEG4 stream to an AVI file  (Read 20505 times)
markr
Newbie
« on: August 06, 2009, 11:32:52 PM »

I am new to DirectShow and amazed at how difficult it seems to be to learn.  Why doesn't someone create a "how-to" guide for things like "this is how you save an AVI file", etc.?  Anyway, if you have any suggestions for tutorials beyond the Microsoft SDK docs, I would appreciate them.

I recently downloaded the GraphEditPlus evaluation and love it.  I use the DirectShow.NET library in C# and have successfully created a graph that currently only renders a live preview from an IP camera.  I want to keep displaying the live preview, but also record the video stream to an MPEG-4-compressed AVI file.  The graph consists of the following:

IP Source->Smart Tee(Preview Pin)->Color Space Converter->Video Renderer (this was the original graph and works fine in C# and GraphEditPlus)
                Smart Tee(Capture Pin)->Color Space Converter->MS MPEG-4 Video Codec V3->AVI Mux->File Writer

I am not aware of any way to test the file-saving branch within GraphEditPlus, since there is no property page to set the output file name and media type required by the File Writer's IFileSinkFilter interface (specifically its SetFileName method).  So GraphEditPlus cannot generate the complete C# code for a workable graph.  Here is what I've done, which causes an error:

1.  Create the ICaptureGraphBuilder2 and IFilterGraph objects
2.  Create all the IBaseFilter objects and add them to the graph
3.  Before connecting any filters together, as soon as the new FileWriter() object is created, I obtain an IFileSinkFilter from it and set the output file:
       IFileSinkFilter pFileSink = (IFileSinkFilter)pFileWriter;
       pFileSink.SetFileName("myvideo.avi", null);
4.  Explicitly connect each filter to the next, starting with the Source Filter.  Here is the actual code for all the connections:

            // connect the Source Filter to the Smart Tee
            hr = graph.Connect(GetPin(pACIPFilter2, "Output"), GetPin(pSmartTee, "Input"));
            checkHR(hr, "Can't connect ACIP Filter output to Smart Tee input");

            // connect the Smart Tee to the Color Space Converters
            hr = graph.Connect(GetPin(pSmartTee, "Capture"), GetPin(pColorSpaceConverter, "Input"));
            checkHR(hr, "Can't connect Smart Tee output to Color Space Converter input");           
            hr = graph.Connect(GetPin(pSmartTee, "Preview"), GetPin(pColorSpaceConverter2, "Input"));
            checkHR(hr, "Can't connect Smart Tee output to Color Space Converter #2 input");
                   
            // connect the Color Converter to the Video Renderer
            hr = graph.Connect(GetPin(pColorSpaceConverter2, "XForm Out"), GetPin(pVideoRenderer, "VMR Input0"));
            checkHR(hr, "Can't connect Color Space Converter to Video Renderer input");

            // connect the Color Space Converter to the MPEG-4 Codec
            hr = graph.Connect(GetPin(pColorSpaceConverter, "XForm Out"), GetPin(pMPEG4VideoCodec, "Input"));
            checkHR(hr, "Can't connect Color Space Converter to MPEG-4 Codec input");

            // connect the MPEG-4 Codec to the AVI Mux
            hr = graph.Connect(GetPin(pMPEG4VideoCodec, "Output"), GetPin(pAVIMux, "Input 01"));
            checkHR(hr, "Can't connect MPEG-4 Codec output to AVI Mux input");

            //// connect the AVI Mux output to the File Writer
            hr = graph.Connect(GetPin(pAVIMux, "AVI Out"), GetPin(pFileWriter, "in"));
            checkHR(hr, "Can't connect AVI Mux output to the File Writer input");
           
            // create the media control for the entire graph
            mediaControl = (IMediaControl)graph;

            // initialize the event
            pEvent = (IMediaEvent)graph;
         
            // setup and configure the video window
            IVideoWindow videoWindow = graph as IVideoWindow;
            ConfigureVideoWindow(videoWindow, this.pnlVideo);


The ConfigureVideoWindow method just renders the preview to a panel control on the form.
mediaControl (IMediaControl) and pEvent (IMediaEvent) are fields declared at the top of the class.
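For reference, here is a trimmed-down sketch of how I do steps 1-3 (only a few of the filters are shown; the stock CLSIDs are the ones from uuids.h, so double-check them if you copy this, and the ACTi source filter's CLSID is omitted):

        // Rough sketch of steps 1-3 using DirectShowLib. Only a few filters are shown;
        // the CLSIDs are the stock ones from uuids.h - double-check them before reuse.
        private IGraphBuilder graph;

        private IBaseFilter AddFilterByClsid(Guid clsid, string name)
        {
            IBaseFilter filter = (IBaseFilter)Activator.CreateInstance(Type.GetTypeFromCLSID(clsid));
            checkHR(graph.AddFilter(filter, name), "Can't add " + name);
            return filter;
        }

        private void BuildFilters()
        {
            graph = (IGraphBuilder)new FilterGraph();
            ICaptureGraphBuilder2 builder = (ICaptureGraphBuilder2)new CaptureGraphBuilder2();
            checkHR(builder.SetFiltergraph(graph), "Can't set the filter graph on the builder");

            IBaseFilter pSmartTee   = AddFilterByClsid(new Guid("CC58E280-8AA1-11D1-B3F1-00AA003761C5"), "Smart Tee");
            IBaseFilter pAVIMux     = AddFilterByClsid(new Guid("E2510970-F137-11CE-8B67-00AA00A3F1A6"), "AVI Mux");
            IBaseFilter pFileWriter = AddFilterByClsid(new Guid("8596E5F0-0DA5-11D0-BD21-00A0C911CE86"), "File Writer");
            // ...same pattern for the ACTi source, the Color Space Converters, the MPEG-4 codec and the Video Renderer

            // step 3: get IFileSinkFilter from the File Writer and set the output file
            IFileSinkFilter pFileSink = (IFileSinkFilter)pFileWriter;
            checkHR(pFileSink.SetFileName("myvideo.avi", null), "Can't set output file name");
        }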

The graph builds without error and begins to run, at least for a few frames: the panel control shows some video from the camera, and in the local folder (which I am watching in Windows Explorer) I can see a "myvideo.avi" file being created.

The graph is started in this method:

        public void Start()
        {
            // Create a new thread to process events
            Thread t;
            t = new Thread(new ThreadStart(EventWait));
            t.Name = "Media Event Thread";
            t.Start();

            int hr = mediaControl.Run();
            checkHR(hr, "Can't Run Graph");
        }

...and the EventWait method, the callback that catches events from the graph, is coded like this:

        private void EventWait()
        {
            int hr;
            IntPtr p1, p2;
            EventCode ec;
            EventCode exitCode = 0;
           
            do
            {
                // Read the event
                for (
                    hr = pEvent.GetEvent(out ec, out p1, out p2, 100);
                    hr >= 0;
                    hr = pEvent.GetEvent(out ec, out p1, out p2, 100)
                    )
                {
                    //Debug.WriteLine(ec);
                    switch (ec)
                    {
                        // If the clip is finished playing
                        case EventCode.Complete:
                        case EventCode.ErrorAbort:
                        case EventCode.UserAbort:
                            exitCode = ec;

                            // Release any resources the message allocated
                            hr = pEvent.FreeEventParams(ec, p1, p2);
                            checkHR(hr, "Can't Free Event Params for UserAbort");
                            break;

                        default:
                            // Release any resources the message allocated
                            hr = pEvent.FreeEventParams(ec, p1, p2);
                            checkHR(hr, "Can't Free Event Params for Default Events");
                            break;
                    }
                }

             } while (exitCode == 0);

            MessageBox.Show("Exited Thread. Graph stopped with Exit Code: " + ec.ToString() + "\r\n" + "HRESULT = " + hr.ToString());


        } // Exit the thread


Within just a second or two, the thread exits with the Event Notification Code of ErrorAbort and an HRESULT of -2147467260.
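(For what it's worth, converting that HRESULT to hex gives 0x80004004, which the SDK headers list as E_ABORT; a quick check in C#:)

            // the HRESULT -2147467260 printed as unsigned hex: "80004004", i.e. E_ABORT
            Debug.WriteLine(unchecked((uint)(-2147467260)).ToString("X8"));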

Thanks for any help in letting me know what I am doing wrong!!

Mark

Dee Mon
Administrator
« Reply #1 on: August 07, 2009, 09:50:31 AM »

Quote
I am not aware of any way to test the file-saving branch within GraphEditPlus, since there is no property page to set the output file name and media type required by the File Writer's IFileSinkFilter interface (specifically its SetFileName method).

When you insert the File Writer into the graph, GEP should show a file dialog, and when you select a filename it feeds it to the File Writer via the IFileSinkFilter interface. The generated code then gets that interface and sets the filename. Doesn't that happen on your machine? Or am I missing something?

On topic: have you tried other codecs? Try XviD, it's a nice free MPEG-4 codec.
Your graph seems quite fine. Also check the time stamps: if they are not set correctly, the AVI Mux will fail. Insert a sample grabber between the codec and the AVI Mux and watch the grabbed samples (there is a command for this in the sample grabber's context menu) to check whether they have proper timestamps.
markr
Newbie
« Reply #2 on: August 07, 2009, 06:50:04 PM »

Thank you for your time.  I took your advice and downloaded/installed the latest XviD codec.  Within GEP I built a new graph, and this time, when I added the File Writer, I made sure to enter a file name; the auto-generated code then added the appropriate IFileSinkFilter and called SetFileName accordingly.  As a newbie, I had previously cancelled that dialog when it appeared, thinking it wanted to "open" an existing file rather than create a new one.  Thanks for that clarification, since it does allow testing file saving within GEP.  Anyway, my simple (preview-only) graph now looks like this:

Source->Infinite Pin Tee (Output1)->SampleGrabber->Color Space Converter->Video Renderer

I used the context menu, as you suggested, to watch samples and ran the graph.  The samples displayed as expected.  Here are a few records:

11:17:14 AM.750: SampleTime=6.975E+11, Time(start=0, end=0), MediaTime(start=0, end=0), data length=1228800, Data=0087000000870000...
11:17:14 AM.828: SampleTime=6.975E+11, Time(start=0, end=0), MediaTime(start=0, end=0), data length=1228800, Data=0002000000020000...
11:17:14 AM.859: SampleTime=6.975E+11, Time(start=0, end=0), MediaTime(start=0, end=0), data length=1228800, Data=0002000000020000...

I let it run for a while with no issues.  Then I stopped the graph and added a second branch for writing the file.  It looks like this:

Infinite Pin Tee (Output2)->XviD Codec->Sample Grabber->AVI Mux->File Writer

The sample grabber on this branch produced two records before the graph crashed.  Here they are:

11:37:16 AM.906: SampleTime=6.975E+11, Time(start=0, end=0), MediaTime(start=0, end=0), data length=59376, Data=000001B0F5000001...
11:37:17 AM.781: SampleTime=6.975E+11, Time(start=0, end=0), MediaTime(start=0, end=0), data length=57010, Data=000001B0F5000001...

The preview branch produced three records with fixed data lengths that look like the records above, except for the time stamps, of course.

The AVI file does get written to disk, but it is only 64 KB when the graph crashes, and it is invalid if you try to play it.

The media types going into and coming out of the XviD codec look as expected, and I have made no changes to the AVI Mux's default properties.

I am writing to a folder on the local C: drive and am logged in as Administrator.  The machine runs Windows XP Pro SP3, fully updated, on an Intel 6600 @ 2.4 GHz with 3 GB of RAM.

Again, thank you very much for your time.

Mark
markr
Newbie
« Reply #3 on: August 07, 2009, 07:41:58 PM »

Just another note: I started from scratch and built the graph again.  Now I receive a dialog that (I determined through trial and error) is produced by the AVI Mux filter.  The message states:

An error occurred in the graph: 80040249: No time stamp has been set for this sample. 

However, the Sample Grabber seems to show a time stamp for every frame, especially when the graph is just a simple preview and I let it run for a while: out of a hundred records, I saw a time stamp on each one.  Is this possibly a timing issue?  The source filter is supplied by the camera manufacturer, ACTi.  The stream parameters for the camera are:

Resolution: 640 x 480
Frame Rate: Constant @ 8 FPS
Encoder Type: MPEG4 (this is the only one supported by their DS Filter although MJPEG is available otherwise, such as through simple HTTP)
Video Bitrate Mode: Constant
Bitrate: 2M (even though I've tried anywhere from 256K to the max 3M)
B2 Frame Enabled: False

Thanks again...

Mark
Dee Mon
Administrator
« Reply #4 on: August 08, 2009, 08:57:06 AM »

Here's the problem I expected:
 Time(start=0, end=0), MediaTime(start=0, end=0)

SampleTime alone is not enough for the AVI Mux; it needs those parameters set properly and is very sensitive to their values. Because they are zero for your samples, the AVI Mux can't handle such a stream.

Now you have several options:
a) Try another file format instead of AVI, for instance WMV or MKV.
b) Try fixing the time stamps of the captured samples. One easy way is to insert a sample grabber and, in your program, set each sample's times as it arrives. A more involved way is to write a transform-in-place filter that fixes time stamps; then it will work in any graph-editing app, not only in your own program.
c) Try another AVI writer. If you install Video Enhancer you'll find "Dee Mon's Avi Writer" in the list of filters. Try using it instead of the AVI Mux / File Writer pair.
markr
Newbie
« Reply #5 on: August 08, 2009, 11:00:31 PM »

Thank you so much; I have learned a great deal from you.

Of the three options you listed, I am not sure about saving to a WMV or MKV file, so I haven't tried that.  I think fixing the time stamps is the best option right now, because I already tried the third option: I installed Video Enhancer and attempted to use your AVI Writer instead.  I tried it with a few different video compressors (XviD, DivX, and the MS MPEG4 V3 codec), but the graph locks up with no error dialogs and eventually GEP becomes unresponsive.  So, unless you have an idea why the Dee Mon AVI Writer is also failing, I will focus on setting the timestamps of each sample using a SampleGrabber.

I think I know the steps to do this, but I would appreciate your confirmation:

1.  Add the ISampleGrabberCB interface to the Form's class declaration so the form can handle the callback.
2.  Create a SampleGrabber and add it to the graph.
3.  Create an AMMediaType object, set its properties to the video type coming into the SampleGrabber, call the SampleGrabber's SetMediaType method, then free the media type object.
4.  Call the SampleGrabber's SetCallback method, passing the form instance (this in C#) as the first parameter and the integer value 0 (zero) as the second, indicating that I want the SampleCB method called (see the sketch after this list).
5.  Implement the callback in C# like so:

unsafe int ISampleGrabberCB.SampleCB( double SampleTime, IMediaSample pSample )
{
   ... do work here
}

6.  Within the callback, use the pSample parameter to call the SetTime and SetMediaTime methods.
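In code, I picture steps 2-4 looking roughly like this (a sketch only; it reuses the graph object and checkHR helper from my earlier posts, and RGB24 is just an example of "whatever is coming into the SampleGrabber"):

            // Sketch of steps 2-4 (DirectShowLib); reuses graph and checkHR from earlier posts.
            IBaseFilter pSampleGrabber = (IBaseFilter)new SampleGrabber();
            checkHR(graph.AddFilter(pSampleGrabber, "Sample Grabber"), "Can't add Sample Grabber");

            ISampleGrabber grabber = (ISampleGrabber)pSampleGrabber;

            // step 3: restrict the grabber to the incoming video type (RGB24 is only an example)
            AMMediaType mt = new AMMediaType();
            mt.majorType = MediaType.Video;
            mt.subType = MediaSubType.RGB24;
            mt.formatType = FormatType.VideoInfo;
            checkHR(grabber.SetMediaType(mt), "Can't set Sample Grabber media type");
            DsUtils.FreeAMMediaType(mt);

            // step 4: 0 = call SampleCB with the whole IMediaSample (1 would mean BufferCB)
            checkHR(grabber.SetCallback(this, 0), "Can't set Sample Grabber callback");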

If this process flow is correct, then I think I can get that far.  At this point, though, I am not sure what values to pass to these methods, and, after reading the DirectShow documentation on time stamps, whether both methods even need to be called.

So, once again, thank you for the clarification.  You have been very helpful and I really appreciate it.

Mark
Dee Mon
Administrator
« Reply #6 on: August 10, 2009, 05:45:42 AM »

Yes, your plan is right, though the media type part may not be necessary. It's better to set both kinds of time stamps.
What the AVI Mux likes is for the end time of each sample to equal (or be just a bit less than? I'd need to check) the start time of the next sample.
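For example, for a 15 FPS stream the stream times should come out roughly like this (in 100-ns units), each sample starting exactly where the previous one ended:

sample 0: Time(start=0, end=666660)
sample 1: Time(start=666660, end=1333320)
sample 2: Time(start=1333320, end=1999980)
...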
markr
Newbie
« Reply #7 on: August 10, 2009, 05:08:58 PM »

I have read the Windows SDK 6.1 documentation about time stamps, but it is so vague.  I know that the values expected are of type long, but I still don't quite understand how I'm supposed to arrive at them.  Where can I read a good explanation of this?  The Windows SDK doc is laughable.  Here's the official definition of IMediaSample::SetMediaTime:

"The SetMediaTime method sets the media times for this sample". 

...oh, really???

So, the questions that seem to be relevant are:

What is the relationship between the start time and the end time in the SetTime method?
What is the relationship between the start time and the end time in the SetMediaTime method?
How do I establish the start time for either method?

With each call to the callback function, should I immediately take the current time from the computer clock and convert it to a long, for example:

            DateTime dt = DateTime.Now;
            long stLong = dt.Ticks;   // DateTime ticks are 100-ns units

...and then use DsLong.FromInt64(stLong) as the parameter required by DirectShowLib's SetTime and SetMediaTime methods?

...but then I read in the Time Stamps section of the Windows SDK, regarding media times, that "In a video stream, media time represents the frame number".  Does this mean that I should keep a counter of type long, initialize it to zero before any callbacks occur, and increment it by 1 (counter++) for each sample that arrives at the callback, using it for the start and end times of the SetMediaTime method, so that the first frame would have SetMediaTime parameters of 1 and 1, the second frame 2 and 2, and so on?

... and then there's the IReferenceClock::GetTime method...

So, thanks again for helping out this newbie!!!

Mark
Dee Mon
Administrator
« Reply #8 on: August 11, 2009, 09:00:19 AM »

When the documentation is unclear, one can use the power of one's instruments. Run GEP, render some video files, and use a sample grabber to learn how they are timestamped. For example, here's what I got with one AVI file containing MJPEG video:

Quote
12:33:09.423: SampleTime=0, Time(start=0, end=666660), MediaTime(start=0, end=1), data length=23172, keyframe=True, Discontinuity, Data=FFD8FFE000104156...
12:33:09.583: SampleTime=0.06667, Time(start=666660, end=1333320), MediaTime(start=1, end=2), data length=22876, keyframe=True, Data=FFD8FFE000104156...
12:33:09.584: SampleTime=0.1333, Time(start=1333320, end=1999980), MediaTime(start=2, end=3), data length=19558, keyframe=True, Data=FFD8FFE000104156...
12:33:09.598: SampleTime=0.2, Time(start=1999980, end=2666640), MediaTime(start=3, end=4), data length=19516, keyframe=True, Data=FFD8FFE000104156...
12:33:09.664: SampleTime=0.2667, Time(start=2666640, end=3333300), MediaTime(start=4, end=5), data length=19378, keyframe=True, Data=FFD8FFE000104156...
12:33:09.731: SampleTime=0.3333, Time(start=3333300, end=3999960), MediaTime(start=5, end=6), data length=19456, keyframe=True, Data=FFD8FFE000104156...
12:33:09.798: SampleTime=0.4, Time(start=3999960, end=4666620), MediaTime(start=6, end=7), data length=19350, keyframe=True, Data=FFD8FFE000104156...
12:33:09.864: SampleTime=0.4667, Time(start=4666620, end=5333280), MediaTime(start=7, end=8), data length=19360, keyframe=True, Data=FFD8FFE000104156...
12:33:09.931: SampleTime=0.5333, Time(start=5333280, end=5999940), MediaTime(start=8, end=9), data length=19376, keyframe=True, Data=FFD8FFE000104156...
12:33:09.998: SampleTime=0.6, Time(start=5999940, end=6666600), MediaTime(start=9, end=10), data length=19246, keyframe=True, Data=FFD8FFE000104156...
12:33:10.064: SampleTime=0.6667, Time(start=6666600, end=7333260), MediaTime(start=10, end=11), data length=19242, keyframe=True, Data=FFD8FFE000104156...
12:33:10.131: SampleTime=0.7333, Time(start=7333260, end=7999920), MediaTime(start=11, end=12), data length=19288, keyframe=True, Data=FFD8FFE000104156...
12:33:10.198: SampleTime=0.8, Time(start=7999920, end=8666580), MediaTime(start=12, end=13), data length=19230, keyframe=True, Data=FFD8FFE000104156...
12:33:10.264: SampleTime=0.8667, Time(start=8666580, end=9333240), MediaTime(start=13, end=14), data length=19250, keyframe=True, Data=FFD8FFE000104156...
12:33:10.331: SampleTime=0.9333, Time(start=9333240, end=9999900), MediaTime(start=14, end=15), data length=19352, keyframe=True, Data=FFD8FFE000104156...
12:33:10.398: SampleTime=1, Time(start=9999900, end=10666560), MediaTime(start=15, end=16), data length=19468, keyframe=True, Data=FFD8FFE000104156...
12:33:10.464: SampleTime=1.067, Time(start=10666560, end=11333220), MediaTime(start=16, end=17), data length=19468, keyframe=True, Data=FFD8FFE000104156..

For each sample there are two kinds of times: stream time (the GetTime and SetTime functions on the media sample) and media time. Media time is usually the frame number for video and the running count of PCM samples for audio. Stream time is the time when the sample should be presented (drawn for video, played for audio). The Video Renderer uses stream time to determine the proper moment to show a frame; the AVI Mux uses it to write frames in the proper order into the AVI file.

In time, each sample is an interval. When everything is smooth these intervals cover the whole time axis from 0 to the end of your video, i.e. the end stream time of one sample equals the start stream time of the next sample. If there is a hole between the times of two samples, the AVI Mux will insert an empty frame, which may not be what you want: AVI has a constant frame rate and each frame's time depends completely on its number, so extra frames shift the video in time.

Stream time is the time difference between the current sample and the start of the video, so using the system clock is not a wise thing. In your case you already receive a good SampleTime in your callback. Just convert it from double to the proper integer type, and remember that it must be in 100-nanosecond units, i.e. 1 second is 10 000 000. The end time of a sample is its start time plus the sample duration (10 000 000 / FPS). That gives you the stream time for the SetTime method. For media times, just keep a frame counter and set the media time to (count, count+1) for each frame.
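In C#, inside your SampleCB, that comes down to something like this (a sketch only; fps and frameCount are your own variables):

            // Sketch: convert SampleTime (seconds) to 100-ns units and stamp the sample.
            const long UNITS = 10000000;                 // 100-ns units per second
            long frameDuration = UNITS / fps;            // e.g. 666666 for 15 FPS
            long start = (long)(SampleTime * UNITS);     // stream time of this sample
            long end = start + frameDuration;
            pSample.SetTime(DsLong.FromInt64(start), DsLong.FromInt64(end));
            pSample.SetMediaTime(DsLong.FromInt64(frameCount), DsLong.FromInt64(frameCount + 1));
            frameCount++;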
« Last Edit: August 11, 2009, 09:07:08 AM by Dee Mon »
markr
Newbie
« Reply #9 on: August 11, 2009, 05:58:21 PM »

You should write a book!

Your explanation was perfect.  I have been reading as much as I can find about the reference clock, etc., but it just didn't "click" until I read your comments.  Thank you so very much.

The question always remains, though: with the BILLIONS of dollars available to Microsoft, why couldn't they have been as precise in their documentation?  I'm sure that if you added up all the time developers across the world have spent trying to understand it and to explain it to each other, the waste would be enormous.

BTW, your support has been THE very best.  For that, I will be purchasing GEP today.  Keep up the good work... and the patience with people like me!

Mark
markr
Newbie
« Reply #10 on: August 11, 2009, 07:47:52 PM »

...to follow up, after pondering....

The samples I am receiving show a constant SampleTime of 6.975E+11.  The samples you provided from the AVI file as it played had SampleTimes that let you see the FPS was 15, but I did not see a correlation between SetTime's start and end values and the SampleTime value.  What I saw was that the first frame's SetTime start was 0 and its end was 10 000 000 / 15 (rounded), which gave a constant duration of 666660 per frame.

I am able to set the FPS using the Source Filter's IFileSourceFilter::Load method, so, for a 15 FPS rate, shouldn't I simply use the same 666660 frame duration starting at 0 and disregard the 6.975E+11 value altogether?

Thanks,

Mark
markr
Newbie
« Reply #11 on: August 12, 2009, 01:04:17 AM »

OK, so based on that reasoning, here is what I did: I inserted the SampleGrabber and set up the callback for the SampleCB method.  I declared one constant and two variables:

        private const long FRAME_DURATION_15 = 666660;
        private long frameCount = -1;
        private long initialSampleTime = 0;
       
        // sample grabber callback for sample - used to set timestamps on each sample
        unsafe int ISampleGrabberCB.SampleCB(double SampleTime, IMediaSample pSample)
        {
            if (frameCount == -1)
            {
                long tempSampleTime = Convert.ToInt64(SampleTime);
                long remainder = 0;
                Math.DivRem(tempSampleTime, 100, out remainder);
                initialSampleTime = tempSampleTime - remainder;
            }

            frameCount++;
            DsLong timeStart = DsLong.FromInt64(frameCount * FRAME_DURATION_15 + initialSampleTime);
            DsLong timeEnd = DsLong.FromInt64((frameCount * FRAME_DURATION_15) + FRAME_DURATION_15 + initialSampleTime);
            DsLong mediaStart = DsLong.FromInt64(frameCount);
            DsLong mediaEnd = DsLong.FromInt64(frameCount + 1);
            pSample.SetTime(timeStart, timeEnd);
            pSample.SetMediaTime(mediaStart, mediaEnd);           
            Debug.WriteLine(frameCount.ToString() + ": " + timeStart.ToString() + ";" + timeEnd.ToString() + ";" +
                mediaStart.ToString() + ";" + mediaEnd.ToString());
            return 0;
        }

The SampleGrabber is set up like this:  Source->Infinite Pin Tee->SampleGrabber->MS MPEG4 V3 Codec->AVI Mux->FileWriter

I am requesting samples from the camera at 15 FPS, which, according to the sample times you provided, gives a duration of 666660.  However, if 100 nanoseconds is the smallest interval, I would have thought this duration would be 666600 instead?

Anyway, I have tried both 666660 and 666600 for the FRAME_DURATION_15 constant, and have tried both including and excluding the code inside the "if" statement, but the graph still crashes before the second frame reaches the callback method.

Mark

Dee Mon
Administrator
« Reply #12 on: August 12, 2009, 05:59:26 AM »

Oh, I hadn't noticed that the SampleTime values you get are so weird; I just saw that they were not zero. I don't think you have been capturing for 697 billion seconds :) so you had better ignore those values.
Well, in this case you should calculate the stream time on your own, based either on the frame number or on the system clock (the difference between now and the capture start time). In the first case, just remove initialSampleTime from the code quoted in your last message and it will look fine. Remember that stream time is the difference between the current sample and the start of the video, so it should be 0 for the first sample, not 697 gazillion.

You don't need to truncate the time values to hundreds. When we talk about 100-nanosecond units, it just means that an integer value of 1 in stream time means 100 nanoseconds, or 0.0000001 second. That's why a value of 10 000 000 means 1 second. It is just a scale, a coefficient for converting between seconds and units of stream time.
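So, roughly, your callback from the last message minus initialSampleTime and the DivRem truncation would look like this (a sketch only):

        // Sketch: the callback from your last message without initialSampleTime / DivRem.
        private const long FRAME_DURATION_15 = 666660;   // ~ 10 000 000 / 15; exact rounding is not critical
        private long frameCount = 0;

        unsafe int ISampleGrabberCB.SampleCB(double SampleTime, IMediaSample pSample)
        {
            long timeStart = frameCount * FRAME_DURATION_15;       // 0 for the very first sample
            long timeEnd = timeStart + FRAME_DURATION_15;
            pSample.SetTime(DsLong.FromInt64(timeStart), DsLong.FromInt64(timeEnd));
            pSample.SetMediaTime(DsLong.FromInt64(frameCount), DsLong.FromInt64(frameCount + 1));
            frameCount++;
            return 0;
        }

        int ISampleGrabberCB.BufferCB(double SampleTime, IntPtr pBuffer, int BufferLen)
        {
            return 0;   // not used; SetCallback(..., 0) asks for SampleCB
        }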
Dee Mon
Administrator
« Reply #13 on: August 12, 2009, 06:02:05 AM »

Quote
I am able to set the FPS using the Source Filter's IFileSourceFilter::Load method, so, for a 15 FPS rate, shouldn't I simply use the same 666660 frame duration starting at 0 and disregard the 6.975E+11 value altogether?
Yes, exactly!
markr
Newbie
« Reply #14 on: August 12, 2009, 10:57:39 PM »

Even though SampleTime is a parameter passed into the callback method, are you saying that I can "set" the SampleTime from within the callback?  Unlike the explicit methods provided for setting the Time and MediaTime, I don't see any specific method for it, so do you just assign a new value to SampleTime?  If the answer is yes, well, I tried that, and it still did not work.

I tried it by declaring an integer variable called "startingTicks" outside the scope of the callback method and setting it to Environment.TickCount when the first frame reached the callback.  Inside the callback I declared an integer variable called "currentTicks" holding the current Environment.TickCount; for the first frame I ensured that currentTicks equalled startingTicks.  Then I calculated the delta as (currentTicks - startingTicks), divided it by 1000 (since Environment.TickCount is in milliseconds), and converted the quotient to a double, matching the type of the callback's SampleTime parameter.  This gives the number of seconds at which each sample reaches the callback, so, if the frame rate is at all accurate, there should be close to 15 samples between the initial SampleTime of 0 and the frame where SampleTime reaches 1.  However, this, too, did not work.
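Roughly, what I tried looks like this (a sketch; firstFrame is just my flag for the first callback):

        // Sketch of the Environment.TickCount experiment described above.
        private int startingTicks;
        private bool firstFrame = true;

        unsafe int ISampleGrabberCB.SampleCB(double SampleTime, IMediaSample pSample)
        {
            int currentTicks = Environment.TickCount;       // milliseconds since boot
            if (firstFrame)
            {
                startingTicks = currentTicks;               // so the first sample comes out at 0 seconds
                firstFrame = false;
            }
            SampleTime = (currentTicks - startingTicks) / 1000.0;   // seconds since the first sample
            // ...then compute and set Time / MediaTime from this value as before
            return 0;
        }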

Of course, with other video capture sources, such as a few different webcams, I can preview and generate AVI files easily.  I know that the source filter provided by ACTi for their IP cameras is junk, but their cameras are inexpensive compared to the other manufacturers', so I've tried hard, with a lot of your help, to make them work.  At this point, though, unless you can think of where I'm going wrong, I'm just about ready to give up on the ACTi camera line until they produce a quality source filter (I won't hold my breath on that one, however).

thanks again,

Mark
