Making a video from a UIImage array with different transition animations

陶树
2023-03-14
Question

I am creating a video from a UIImage array, following the code below. Right now there is no animation when it moves from one image to the next. I want to add some photo transition effects like these:

  1. TransitionFlipFromTop
  2. TransitionFlipFromBottom
  3. TransitionFlipFromLeft
  4. TransitionFlipFromRight
  5. TransitionCurlUp
  6. TransitionCurlDown
  7. TransitionCrossDissolve
  8. Fade in
  9. Fade out

These animations can be done with UIView.transition() and UIView.animate().
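
For reference, a view-level flip done with UIView.transition looks roughly like this (imageView and nextImage are placeholder names):

    UIView.transition(with: imageView,            //a UIImageView already on screen
                      duration: 0.5,
                      options: .transitionFlipFromLeft,
                      animations: { imageView.image = nextImage },
                      completion: nil)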

But how can I apply these transition animations while making a video from a UIImage array? I have searched a lot but have not found anything.

I have also tried HJImagesToVideo, but it only offers a crossfade transition.


Answer:

I was stuck on the same problem for quite a while. Since this is a long answer, it might bore you, but I hope it helps. Here is my answer, in Swift 3.

Theory:

First, you have to make a video from your array of images. There are lots of solutions to this problem; just search a little and hopefully you will find one. If not, see the code below.

Second, you may know that a CALayer can be added on top of a video. If you don't, search for how to add a CALayer over a video.

Third, you have to know how to animate a CALayer. Core Animation will do this job for you, and plenty of sample code about it is available; search for it. Now you have a CALayer and you know how to animate it.

Finally, you have to add this animated CALayer to the video.

Code:

1. Create a video from the image array

    let outputSize = CGSize(width: 1920, height: 1280)
    let imagesPerSecond: TimeInterval = 3 //each image will stay for 3 secs
    var selectedPhotosArray = [UIImage]()
    var imageArrayToVideoURL = NSURL()
    let audioIsEnabled: Bool = false //if your video has no sound
    var asset: AVAsset!

    func buildVideoFromImageArray() {
        for image in 0..<5 {
            selectedPhotosArray.append(UIImage(named: "\(image + 1).JPG")!) //name of the images: 1.JPG, 2.JPG, 3.JPG, 4.JPG, 5.JPG
        }

        imageArrayToVideoURL = NSURL(fileURLWithPath: NSHomeDirectory() + "/Documents/video1.MP4")
        removeFileAtURLIfExists(url: imageArrayToVideoURL)
        guard let videoWriter = try? AVAssetWriter(outputURL: imageArrayToVideoURL as URL, fileType: AVFileTypeMPEG4) else {
            fatalError("AVAssetWriter error")
        }
        let outputSettings = [AVVideoCodecKey : AVVideoCodecH264, AVVideoWidthKey : NSNumber(value: Float(outputSize.width)), AVVideoHeightKey : NSNumber(value: Float(outputSize.height))] as [String : Any]
        guard videoWriter.canApply(outputSettings: outputSettings, forMediaType: AVMediaTypeVideo) else {
            fatalError("Negative : Can't apply the Output settings...")
        }
        let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
        let sourcePixelBufferAttributesDictionary = [kCVPixelBufferPixelFormatTypeKey as String : NSNumber(value: kCVPixelFormatType_32ARGB), kCVPixelBufferWidthKey as String: NSNumber(value: Float(outputSize.width)), kCVPixelBufferHeightKey as String: NSNumber(value: Float(outputSize.height))]
        let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoWriterInput, sourcePixelBufferAttributes: sourcePixelBufferAttributesDictionary)
        if videoWriter.canAdd(videoWriterInput) {
            videoWriter.add(videoWriterInput)
        }
        if videoWriter.startWriting() {
            let zeroTime = CMTimeMake(Int64(imagesPerSecond),Int32(1))
            videoWriter.startSession(atSourceTime: zeroTime)

            assert(pixelBufferAdaptor.pixelBufferPool != nil)
            let media_queue = DispatchQueue(label: "mediaInputQueue")
            videoWriterInput.requestMediaDataWhenReady(on: media_queue, using: { () -> Void in
                let fps: Int32 = 1
                let framePerSecond: Int64 = Int64(self.imagesPerSecond)
                let frameDuration = CMTimeMake(Int64(self.imagesPerSecond), fps)
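                //with fps = 1 and imagesPerSecond = 3 this is CMTimeMake(3, 1), i.e. each appended frame is shown for 3 seconds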
                var frameCount: Int64 = 0
                var appendSucceeded = true
                while (!self.selectedPhotosArray.isEmpty) {
                    if (videoWriterInput.isReadyForMoreMediaData) {
                        let nextPhoto = self.selectedPhotosArray.remove(at: 0)
                        let lastFrameTime = CMTimeMake(frameCount * framePerSecond, fps)
                        let presentationTime = frameCount == 0 ? lastFrameTime : CMTimeAdd(lastFrameTime, frameDuration)
                        var pixelBuffer: CVPixelBuffer? = nil
                        let status: CVReturn = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixelBufferAdaptor.pixelBufferPool!, &pixelBuffer)
                        if let pixelBuffer = pixelBuffer, status == 0 {
                            let managedPixelBuffer = pixelBuffer
                            CVPixelBufferLockBaseAddress(managedPixelBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))
                            let data = CVPixelBufferGetBaseAddress(managedPixelBuffer)
                            let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
                            let context = CGContext(data: data, width: Int(self.outputSize.width), height: Int(self.outputSize.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(managedPixelBuffer), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)
                            context!.clear(CGRect(x: 0, y: 0, width: CGFloat(self.outputSize.width), height: CGFloat(self.outputSize.height)))
                            let horizontalRatio = CGFloat(self.outputSize.width) / nextPhoto.size.width
                            let verticalRatio = CGFloat(self.outputSize.height) / nextPhoto.size.height
                            //let aspectRatio = max(horizontalRatio, verticalRatio) // ScaleAspectFill
                            let aspectRatio = min(horizontalRatio, verticalRatio) // ScaleAspectFit
                            let newSize: CGSize = CGSize(width: nextPhoto.size.width * aspectRatio, height: nextPhoto.size.height * aspectRatio)
                            let x = newSize.width < self.outputSize.width ? (self.outputSize.width - newSize.width) / 2 : 0
                            let y = newSize.height < self.outputSize.height ? (self.outputSize.height - newSize.height) / 2 : 0
                            context?.draw(nextPhoto.cgImage!, in: CGRect(x: x, y: y, width: newSize.width, height: newSize.height))
                            CVPixelBufferUnlockBaseAddress(managedPixelBuffer, CVPixelBufferLockFlags(rawValue: CVOptionFlags(0)))
                            appendSucceeded = pixelBufferAdaptor.append(pixelBuffer, withPresentationTime: presentationTime)
                        } else {
                            print("Failed to allocate pixel buffer")
                            appendSucceeded = false
                        }
                    }
                    if !appendSucceeded {
                        break
                    }
                    frameCount += 1
                }
                videoWriterInput.markAsFinished()
                videoWriter.finishWriting { () -> Void in
                    print("-----video1 url = \(self.imageArrayToVideoURL)")

                    self.asset = AVAsset(url: self.imageArrayToVideoURL as URL)
                    self.exportVideoWithAnimation()
                }
            })
        }
    }

    func removeFileAtURLIfExists(url: NSURL) {
        if let filePath = url.path {
            let fileManager = FileManager.default
            if fileManager.fileExists(atPath: filePath) {
                do{
                    try fileManager.removeItem(atPath: filePath)
                } catch let error as NSError {
                    print("Couldn't remove existing destination file: \(error)")
                }
            }
        }
    }
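
To try part 1 on its own, a minimal call site could look like this (my own sketch, assuming the code above lives inside a UIViewController and that 1.JPG … 5.JPG exist in the app bundle):

    override func viewDidLoad() {
        super.viewDidLoad()
        buildVideoFromImageArray() //writes Documents/video1.MP4, then calls exportVideoWithAnimation()
    }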

2. Add the animations to the created video (please read all of the commented sections carefully. I think some things will become clear once you read them.)

    func exportVideoWithAnimation() {
        let composition = AVMutableComposition()

        let track =  asset?.tracks(withMediaType: AVMediaTypeVideo)
        let videoTrack:AVAssetTrack = track![0] as AVAssetTrack
        let timerange = CMTimeRangeMake(kCMTimeZero, (asset?.duration)!)

        let compositionVideoTrack:AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())

        do {
            try compositionVideoTrack.insertTimeRange(timerange, of: videoTrack, at: kCMTimeZero)
            compositionVideoTrack.preferredTransform = videoTrack.preferredTransform
        } catch {
            print(error)
        }

        //if your video has sound, you don’t need to check this
        if audioIsEnabled {
            let compositionAudioTrack:AVMutableCompositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())

            for audioTrack in (asset?.tracks(withMediaType: AVMediaTypeAudio))! {
                do {
                    try compositionAudioTrack.insertTimeRange(audioTrack.timeRange, of: audioTrack, at: kCMTimeZero)
                } catch {
                    print(error)
                }
            }
        }

        let size = videoTrack.naturalSize

        let videolayer = CALayer()
        videolayer.frame = CGRect(x: 0, y: 0, width: size.width, height: size.height)

        let parentlayer = CALayer()
        parentlayer.frame = CGRect(x: 0, y: 0, width: size.width, height: size.height)
        parentlayer.addSublayer(videolayer)

        ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
        //this is the animation part
        var time = [0.00001, 3, 6, 9, 12] //I used this time array to determine the start time of each frame animation. Each frame will stay for 3 secs, that's why their difference is 3
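        //note: the first entry is 0.00001 rather than 0 because a CAAnimation whose beginTime
        //is 0 is interpreted as "start now" (CACurrentMediaTime()); when a layer is rendered
        //into a video via AVVideoCompositionCoreAnimationTool, AVCoreAnimationBeginTimeAtZero
        //can also be used for an animation that should start at the very beginning of the video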
        var imgarray = [UIImage]()

        for image in 0..<5 {
            imgarray.append(UIImage(named: "\(image + 1).JPG")!)

            let nextPhoto = imgarray[image]

            let horizontalRatio = CGFloat(self.outputSize.width) / nextPhoto.size.width
            let verticalRatio = CGFloat(self.outputSize.height) / nextPhoto.size.height
            let aspectRatio = min(horizontalRatio, verticalRatio)
            let newSize: CGSize = CGSize(width: nextPhoto.size.width * aspectRatio, height: nextPhoto.size.height * aspectRatio)
            let x = newSize.width < self.outputSize.width ? (self.outputSize.width - newSize.width) / 2 : 0
            let y = newSize.height < self.outputSize.height ? (self.outputSize.height - newSize.height) / 2 : 0

            ///I showed 10 animations here. You can uncomment any of this and export a video to see the result.

            ///#1. left->right///
            let blackLayer = CALayer()
            blackLayer.frame = CGRect(x: -videoTrack.naturalSize.width, y: 0, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
            blackLayer.backgroundColor = UIColor.black.cgColor

            let imageLayer = CALayer()
            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
            imageLayer.contents = imgarray[image].cgImage
            blackLayer.addSublayer(imageLayer)

            let animation = CABasicAnimation()
            animation.keyPath = "position.x"
            animation.fromValue = -videoTrack.naturalSize.width
            animation.toValue = 2 * (videoTrack.naturalSize.width)
            animation.duration = 3
            animation.beginTime = CFTimeInterval(time[image])
            animation.fillMode = kCAFillModeForwards
            animation.isRemovedOnCompletion = false
            blackLayer.add(animation, forKey: "basic")

            ///#2. right->left///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 2 * videoTrack.naturalSize.width, y: 0, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let animation = CABasicAnimation()
//            animation.keyPath = "position.x"
//            animation.fromValue = 2 * (videoTrack.naturalSize.width)
//            animation.toValue = -videoTrack.naturalSize.width
//            animation.duration = 3
//            animation.beginTime = CFTimeInterval(time[image])
//            animation.fillMode = kCAFillModeForwards
//            animation.isRemovedOnCompletion = false
//            blackLayer.add(animation, forKey: "basic")

            ///#3. top->bottom///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 0, y: 2 * videoTrack.naturalSize.height, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let animation = CABasicAnimation()
//            animation.keyPath = "position.y"
//            animation.fromValue = 2 * videoTrack.naturalSize.height
//            animation.toValue = -videoTrack.naturalSize.height
//            animation.duration = 3
//            animation.beginTime = CFTimeInterval(time[image])
//            animation.fillMode = kCAFillModeForwards
//            animation.isRemovedOnCompletion = false
//            blackLayer.add(animation, forKey: "basic")

            ///#4. bottom->top///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 0, y: -videoTrack.naturalSize.height, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let animation = CABasicAnimation()
//            animation.keyPath = "position.y"
//            animation.fromValue = -videoTrack.naturalSize.height
//            animation.toValue = 2 * videoTrack.naturalSize.height
//            animation.duration = 3
//            animation.beginTime = CFTimeInterval(time[image])
//            animation.fillMode = kCAFillModeForwards
//            animation.isRemovedOnCompletion = false
//            blackLayer.add(animation, forKey: "basic")

            ///#5. opacity(1->0)(left->right)///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: -videoTrack.naturalSize.width, y: 0, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let animation = CABasicAnimation()
//            animation.keyPath = "position.x"
//            animation.fromValue = -videoTrack.naturalSize.width
//            animation.toValue = 2 * (videoTrack.naturalSize.width)
//            animation.duration = 3
//            animation.beginTime = CFTimeInterval(time[image])
//            animation.fillMode = kCAFillModeForwards
//            animation.isRemovedOnCompletion = false
//            blackLayer.add(animation, forKey: "basic")
//            
//            let fadeOutAnimation = CABasicAnimation(keyPath: "opacity")
//            fadeOutAnimation.fromValue = 1
//            fadeOutAnimation.toValue = 0
//            fadeOutAnimation.duration = 3
//            fadeOutAnimation.beginTime = CFTimeInterval(time[image])
//            fadeOutAnimation.isRemovedOnCompletion = false
//            blackLayer.add(fadeOutAnimation, forKey: "opacity")

            ///#6. opacity(1->0)(right->left)///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 2 * videoTrack.naturalSize.width, y: 0, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let animation = CABasicAnimation()
//            animation.keyPath = "position.x"
//            animation.fromValue = 2 * videoTrack.naturalSize.width
//            animation.toValue = -videoTrack.naturalSize.width
//            animation.duration = 3
//            animation.beginTime = CFTimeInterval(time[image])
//            animation.fillMode = kCAFillModeForwards
//            animation.isRemovedOnCompletion = false
//            blackLayer.add(animation, forKey: "basic")
//            
//            let fadeOutAnimation = CABasicAnimation(keyPath: "opacity")
//            fadeOutAnimation.fromValue = 1
//            fadeOutAnimation.toValue = 0
//            fadeOutAnimation.duration = 3
//            fadeOutAnimation.beginTime = CFTimeInterval(time[image])
//            fadeOutAnimation.isRemovedOnCompletion = false
//            blackLayer.add(fadeOutAnimation, forKey: "opacity")

            ///#7. opacity(1->0)(top->bottom)///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 0, y: 2 * videoTrack.naturalSize.height, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let animation = CABasicAnimation()
//            animation.keyPath = "position.y"
//            animation.fromValue = 2 * videoTrack.naturalSize.height
//            animation.toValue = -videoTrack.naturalSize.height
//            animation.duration = 3
//            animation.beginTime = CFTimeInterval(time[image])
//            animation.fillMode = kCAFillModeForwards
//            animation.isRemovedOnCompletion = false
//            blackLayer.add(animation, forKey: "basic")
//            
//            let fadeOutAnimation = CABasicAnimation(keyPath: "opacity")
//            fadeOutAnimation.fromValue = 1
//            fadeOutAnimation.toValue = 0
//            fadeOutAnimation.duration = 3
//            fadeOutAnimation.beginTime = CFTimeInterval(time[image])
//            fadeOutAnimation.isRemovedOnCompletion = false
//            blackLayer.add(fadeOutAnimation, forKey: "opacity")

            ///#8. opacity(1->0)(bottom->top)///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 0, y: -videoTrack.naturalSize.height, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let animation = CABasicAnimation()
//            animation.keyPath = "position.y"
//            animation.fromValue = -videoTrack.naturalSize.height
//            animation.toValue = 2 * videoTrack.naturalSize.height
//            animation.duration = 3
//            animation.beginTime = CFTimeInterval(time[image])
//            animation.fillMode = kCAFillModeForwards
//            animation.isRemovedOnCompletion = false
//            blackLayer.add(animation, forKey: "basic")
//            
//            let fadeOutAnimation = CABasicAnimation(keyPath: "opacity")
//            fadeOutAnimation.fromValue = 1
//            fadeOutAnimation.toValue = 0
//            fadeOutAnimation.duration = 3
//            fadeOutAnimation.beginTime = CFTimeInterval(time[image])
//            fadeOutAnimation.isRemovedOnCompletion = false
//            blackLayer.add(fadeOutAnimation, forKey: "opacity")

            ///#9. scale(small->big->small)///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 0, y: 0, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            blackLayer.opacity = 0
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let scaleAnimation = CAKeyframeAnimation(keyPath: "transform.scale")
//            scaleAnimation.values = [0, 1.0, 0]
//            scaleAnimation.beginTime = CFTimeInterval(time[image])
//            scaleAnimation.duration = 3
//            scaleAnimation.isRemovedOnCompletion = false
//            blackLayer.add(scaleAnimation, forKey: "transform.scale")
//            
//            let fadeInOutAnimation = CABasicAnimation(keyPath: "opacity")
//            fadeInOutAnimation.fromValue = 1
//            fadeInOutAnimation.toValue = 1
//            fadeInOutAnimation.duration = 3
//            fadeInOutAnimation.beginTime = CFTimeInterval(time[image])
//            fadeInOutAnimation.isRemovedOnCompletion = false
//            blackLayer.add(fadeInOutAnimation, forKey: "opacity")

            ///#10. scale(big->small->big)///
//            let blackLayer = CALayer()
//            blackLayer.frame = CGRect(x: 0, y: 0, width: videoTrack.naturalSize.width, height: videoTrack.naturalSize.height)
//            blackLayer.backgroundColor = UIColor.black.cgColor
//            blackLayer.opacity = 0
//            
//            let imageLayer = CALayer()
//            imageLayer.frame = CGRect(x: x, y: y, width: newSize.width, height: newSize.height)
//            imageLayer.contents = imgarray[image].cgImage
//            blackLayer.addSublayer(imageLayer)
//            
//            let scaleAnimation = CAKeyframeAnimation(keyPath: "transform.scale")
//            scaleAnimation.values = [1, 0, 1]
//            scaleAnimation.beginTime = CFTimeInterval(time[image])
//            scaleAnimation.duration = 3
//            scaleAnimation.isRemovedOnCompletion = false
//            blackLayer.add(scaleAnimation, forKey: "transform.scale")
//            
//            let fadeOutAnimation = CABasicAnimation(keyPath: "opacity")
//            fadeOutAnimation.fromValue = 1
//            fadeOutAnimation.toValue = 1
//            fadeOutAnimation.duration = 3
//            fadeOutAnimation.beginTime = CFTimeInterval(time[image])
//            fadeOutAnimation.isRemovedOnCompletion = false
//            blackLayer.add(fadeOutAnimation, forKey: "opacity")

            parentlayer.addSublayer(blackLayer)
        }
        ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

        let layercomposition = AVMutableVideoComposition()
        layercomposition.frameDuration = CMTimeMake(1, 30)
        layercomposition.renderSize = size
        layercomposition.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videolayer, in: parentlayer)
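        //the animation tool renders each source video frame into videolayer and then rasterizes
        //parentlayer (videolayer plus the animated image layers stacked above it) to produce
        //the final output frames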
        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration)
        let videotrack = composition.tracks(withMediaType: AVMediaTypeVideo)[0] as AVAssetTrack
        let layerinstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videotrack)
        instruction.layerInstructions = [layerinstruction]
        layercomposition.instructions = [instruction]

        let animatedVideoURL = NSURL(fileURLWithPath: NSHomeDirectory() + "/Documents/video2.mp4")
        removeFileAtURLIfExists(url: animatedVideoURL)

        guard let assetExport = AVAssetExportSession(asset: composition, presetName:AVAssetExportPresetHighestQuality) else {return}
        assetExport.videoComposition = layercomposition
        assetExport.outputFileType = AVFileTypeMPEG4
        assetExport.outputURL = animatedVideoURL as URL
        assetExport.exportAsynchronously(completionHandler: {
            switch assetExport.status{
            case  AVAssetExportSessionStatus.failed:
                print("failed \(String(describing: assetExport.error))")
            case AVAssetExportSessionStatus.cancelled:
                print("cancelled \(String(describing: assetExport.error))")
            default:
                print("Exported")
            }
        })
    }
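
If the export succeeds, Documents/video2.mp4 holds the animated video. Below is a minimal playback sketch of my own (not part of the original answer); playExportedVideo is a hypothetical helper and needs import AVKit at the top of the file:

    func playExportedVideo(at url: URL) {
        let playerController = AVPlayerViewController()
        playerController.player = AVPlayer(url: url)
        present(playerController, animated: true) {
            playerController.player?.play()
        }
    }

For example, in the default branch of the export completion handler you could call DispatchQueue.main.async { self.playExportedVideo(at: animatedVideoURL as URL) }.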

Don't forget to import AVFoundation.
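
At the top of the file that means something like:

    import UIKit
    import AVFoundation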


