The problem is that it doesn't grab the frame at the requested time. It seems to ignore the decimals: if I call the function with, for example, 1.22 and 1.70, it returns the same frame. I'm new to Swift, so I'm guessing I'm not constructing the CMTime object correctly. Can anyone see what's wrong here?
func generateThumbnail(url: NSURL, fromTime: Float64) -> UIImage {
    let asset = AVAsset.assetWithURL(url) as! AVAsset
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    var error: NSError? = nil
    let time = CMTimeMakeWithSeconds(fromTime, 600)
    let img = assetImgGenerate.copyCGImageAtTime(time, actualTime: nil, error: &error)
    let frameImg = UIImage(CGImage: img)!
    return frameImg
}

var grabTime = 1.22
img = generateThumbnail(urlVideo, fromTime: Float64(grabTime))

Thanks to @eric-d for finding this post:
iOS Take Multiple Screen Shots
I managed to figure out that adding:
assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero
assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero
…to my function would do the trick.
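A quick sanity check in plain Swift (no AVFoundation needed) shows that CMTime itself does not lose the decimals: CMTimeMakeWithSeconds stores the time as an integer value over an integer timescale, so 1.22 s and 1.70 s at timescale 600 map to clearly distinct values. The real culprit is the generator's default time tolerance, which is free to snap the request to a nearby (often keyframe-aligned) time; the arithmetic below is only a sketch of what CMTime stores internally.

```swift
// CMTimeMakeWithSeconds(seconds, timescale) represents a time as
// value / timescale seconds, with `value` an integer.
// At timescale 600, the two request times from the question differ:
let timescale = 600.0
let valueA = Int64((1.22 * timescale).rounded())  // 732  -> 732/600 s
let valueB = Int64((1.70 * timescale).rounded())  // 1020 -> 1020/600 s
print(valueA, valueB)
```

Since the two CMTime values are distinct, only a nonzero requestedTimeToleranceBefore/After can explain both calls returning the same frame.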
My updated function looks like this:
func generateThumbnail(url: NSURL, fromTime: Float64) -> UIImage {
    let asset = AVAsset.assetWithURL(url) as! AVAsset
    let assetImgGenerate = AVAssetImageGenerator(asset: asset)
    assetImgGenerate.appliesPreferredTrackTransform = true
    assetImgGenerate.requestedTimeToleranceAfter = kCMTimeZero
    assetImgGenerate.requestedTimeToleranceBefore = kCMTimeZero
    var error: NSError? = nil
    let time = CMTimeMakeWithSeconds(fromTime, 600)
    let img = assetImgGenerate.copyCGImageAtTime(time, actualTime: nil, error: &error)
    let frameImg = UIImage(CGImage: img)!
    return frameImg
}

var grabTime = 1.22
img = generateThumbnail(urlVideo, fromTime: Float64(grabTime))
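The code above uses pre-Swift-3 AVFoundation APIs. For readers on current Swift, a roughly equivalent version might look like the following sketch (untested here; copyCGImage(at:actualTime:) now throws instead of taking an NSError pointer, and returning an optional is an assumption I've made to handle failure):

```swift
import AVFoundation
import UIKit

// Sketch of the same frame-grab on current Swift / AVFoundation.
func generateThumbnail(url: URL, fromTime: Float64) -> UIImage? {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    // Zero tolerance forces decoding of the exact requested frame
    // instead of snapping to a nearby keyframe.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    let time = CMTime(seconds: fromTime, preferredTimescale: 600)
    guard let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

Note that zero tolerance makes each grab more expensive, since the generator must decode from the preceding keyframe up to the exact requested time.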