
How do I create an AVAsset with a UIImage captured from a camera?

Asked on https://www.devze.com, 2023-04-08 15:48 (source: web)

I am a newbie trying to capture camera video images using AVFoundation, and I want to render the captured frames without using AVCaptureVideoPreviewLayer. I want a slider control that lets me slow down or speed up the rate at which the camera images are displayed.

Using other people's code as examples, I can capture images, and with an NSTimer driven by my slider control I can decide on the fly how often to display them, but I can't convert the captured image into something I can display. I want to move these images into a UIView or UIImageView and render them in the timer's fire function.
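For reference, here is a minimal sketch of the timer/slider arrangement I mean (the property names displayTimer, latestImage and imageView are just placeholders of mine, not from any sample):

    // Restart the timer whenever the slider moves, so the interval
    // between displayed frames can be changed on the fly.
    - (IBAction)sliderChanged:(UISlider *)sender {
        [self.displayTimer invalidate];
        self.displayTimer = [NSTimer scheduledTimerWithTimeInterval:sender.value
                                                             target:self
                                                           selector:@selector(displayTimerFired:)
                                                           userInfo:nil
                                                            repeats:YES];
    }

    // Fires on the main run loop; shows whatever frame was captured last.
    - (void)displayTimerFired:(NSTimer *)timer {
        self.imageView.image = self.latestImage;
    }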

I have looked at Apple's AVCam app (which uses an AVCaptureVideoPreviewLayer), but because it has its own built-in AVCaptureSession, I can't adjust how often the images are displayed. (Well, you can adjust the preview layer's frame rate, but that can't be done on the fly.)

I have looked at the AVFoundation Programming Guide, which talks about AVAssets, AVPlayer, etc., but I can't see how a camera image can be turned into an AVAsset. When I look at the guide and other demos that show how to define an AVAsset, the only choices offered are creating the asset from HTTP stream data or from a URL pointing to an existing file. I can't figure out how to turn my captured UIImage into an AVAsset, in which case I guess I could use an AVPlayer, AVPlayerItems and AVAssetTracks to show the image, with an observeValueForKeyPath function checking status and then doing [myPlayer play]. (I also studied WWDC session 405, "Exploring AV Foundation", to see how that is done.)
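To illustrate what I mean, every example I have found builds the asset from a URL, roughly like this (the file name is just a placeholder; nothing here accepts a UIImage):

    // The only creation paths the guide shows: an existing file URL
    // (or an HTTP Live Streaming URL) fed to AVURLAsset.
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mov"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    // ...then observe the item's status and call [player play]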

I have tried code similar to WWDC session 409, "Using the Camera on iPhone." Like the myCone demo in that session, I can set up the device, the input, the capture session, the output, and a callback that hands me a CMSampleBuffer, and I can collect UIImages, size them, etc. At this point I want to send that image to a UIView or UIImageView. Session 409 just talks about doing it with CFShow(sampleBuffer). This wasn't explained, and I guess it just assumes knowledge of Core Foundation I don't yet have. I think I am turning the captured output in the sample buffer into a UIImage, but I can't figure out how to render it. I created an IBOutlet UIImageView in my nib file, but when I try to stuff the image into that view, nothing gets displayed. Do I need an AVPlayerLayer?
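Here is roughly the conversion I am attempting, assuming the video data output's videoSettings ask for kCVPixelFormatType_32BGRA (imageView and latestImage are my own properties, named here only for illustration). One thing I now suspect is that the delegate callback runs on the queue passed to setSampleBufferDelegate:queue:, not the main thread, so the UIImageView update may need to be dispatched back to the main queue:

    // Requires #import <AVFoundation/AVFoundation.h> and <CoreVideo/CoreVideo.h>.
    // Delegate callback from AVCaptureVideoDataOutput; sampleBuffer holds one frame.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);

        // Wrap the BGRA pixel data in a bitmap context and pull out a CGImage.
        void  *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width       = CVPixelBufferGetWidth(pixelBuffer);
        size_t height      = CVPixelBufferGetHeight(pixelBuffer);

        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little |
                                                     kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        UIImage *image = [UIImage imageWithCGImage:cgImage];

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        // UIKit must be touched on the main thread; the delegate queue is not it.
        dispatch_async(dispatch_get_main_queue(), ^{
            self.latestImage = image;        // stored for the timer to display, or
            // self.imageView.image = image; // shown immediately
        });
    }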

I have looked at UIImagePickerController as an alternate way of controlling how often captured camera images are displayed, but I don't see that I can change the display timing on the fly with that controller either.

So, as you can see, I am learning this stuff from the Apple developer forum and documentation, the WWDC videos, and various websites such as stackoverflow.com, but I have yet to see any example of going from camera to screen without using AVCaptureVideoPreviewLayer or UIImagePickerController, or without an AVAsset that is already a file or HTTP stream.

Can anybody make a suggestion? Thanks in advance.


Comments

No comments yet...