
I'm building an FFmpeg-based player for iOS. It works fine in the simulator, but on a real device (iPhone 4) the frame rate is low and my audio and video drift out of sync. The player works fine on an iPhone 4S, so I guess the problem is just the device's computing power.

So, is there any way to build FFmpeg optimized for iOS devices (armv7/armv7s arch)? Or is there any way to use the iOS device's hardware to decode the video stream?

My video stream is encoded in H.264/AAC.


1 Answer


Those streams should play just fine. I assume that since you're using FFmpeg, you are not using a video protocol that iOS supports directly.

We use FFmpeg to handle RTSP/RTMP, and we get good performance with H.264/AAC.

There are a number of factors that contribute to A/V sync issues. Usually some type of pre-buffering of the video is required, and the network plays a big part in it as well.
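
As a rough illustration of the pre-buffering idea, here is a minimal sketch; the packet queue and playback helpers are hypothetical, just to show the gating logic:

// Hypothetical pre-buffer gate: hold playback until ~2s of video is queued.
static const double kPreBufferSeconds = 2.0;

- (void)packetArrived:(AVPacket *)packet {
    [self enqueuePacket:packet];   // hypothetical queue of demuxed packets
    if (!self.playing && [self bufferedDuration] >= kPreBufferSeconds) {
        self.playing = YES;
        [self startPlayback];      // hypothetical: kicks off the decode/render loop
    }
}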

As to your second question: hardware encoding is only available via AVFoundation. You can use AVAssetWriter to encode your video, but again, it depends on whether or not you need real-time.

See this link: https://github.com/mooncatventures-group/FFPlayer-beta1/blob/master/FFAVFrames-test/ViewController.m

-(void)startRecording {
    // Write to a uniquely named movie file in the temp directory.
    movieURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/%llu.mov",
                                       NSTemporaryDirectory(), mach_absolute_time()]];

    NSError *movieError = nil;
    assetWriter = [[AVAssetWriter alloc] initWithURL:movieURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&movieError];

    // Hardware H.264 encoding at the source frame size.
    NSDictionary *assetWriterInputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInt:FRAME_WIDTH], AVVideoWidthKey,
                                              [NSNumber numberWithInt:FRAME_HEIGHT], AVVideoHeightKey,
                                              nil];
    assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:assetWriterInputSettings];
    assetWriterInput.expectsMediaDataInRealTime = YES; // don't stall a live pipeline
    [assetWriter addInput:assetWriterInput];

    // The adaptor lets us append raw CVPixelBuffers with explicit timestamps.
    assetWriterPixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                                     initWithAssetWriterInput:assetWriterInput
                                     sourcePixelBufferAttributes:nil];
    [assetWriter startWriting];

    firstFrameWallClockTime = CFAbsoluteTimeGetCurrent();
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    startSampleing = YES;
}
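
To give an idea of the other half, here is a minimal sketch of feeding frames to the adaptor; the CVPixelBufferRef source (e.g. a decoded FFmpeg frame converted to a pixel buffer) is an assumption, and timestamps are simply wall-clock time relative to the first frame:

// Sketch only: push one frame into the writer. The pixel buffer source is
// assumed; the timing policy here is wall-clock time since the first frame.
- (void)appendFrame:(CVPixelBufferRef)pixelBuffer {
    if (!startSampleing || !assetWriterInput.readyForMoreMediaData)
        return; // drop the frame rather than block a real-time pipeline

    CFAbsoluteTime elapsed = CFAbsoluteTimeGetCurrent() - firstFrameWallClockTime;
    CMTime presentationTime = CMTimeMakeWithSeconds(elapsed, 600); // 600 is a common video timescale

    if (![assetWriterPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                     withPresentationTime:presentationTime]) {
        NSLog(@"appendPixelBuffer failed: %@", assetWriter.error);
    }
}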

The one drawback right now is that a way still needs to be found to read the encoded data as it is being written; believe me, there are a few of us developers trying to figure out how to do that as I write this.

answered 2012-12-10 at 16:38