How to instantly see new video layer after letting go of button?
I have an app where the user can press and hold a button to record a video. However, when they do, the new layer with the video playback does not appear immediately. Instead, there is a very short delay during which you can still see what the camera sees after the user has released the button. When the delay ends, the video appears and starts playing at once. How can I make the first frame of the video appear before it is ready to play, so that it sits there for a moment before playback begins? See Snapchat's video capture feature for what I mean.
Here is my longTap method:
@objc func longTap(_ sender: UIGestureRecognizer) {
    print("Long tap")
    self.numForVid = numForVid + 1 // should change this number handling
    print("\(numForVid)")
    cameraButton.isHidden = true
    if sender.state == .ended {
        print("UIGestureRecognizerStateEnded")
        //stopSession()
        stopRecording()
    } else if sender.state == .began {
        print("UIGestureRecognizerStateBegan.")
        // Do whatever you want on began of gesture
        startCapture()
    }
}
The stop recording function:
func stopRecording() {
    if movieOutput.isRecording == true {
        movieOutput.stopRecording()
    }
}
And the delegate method that is called once the output URL has all the data:
func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
    if (error != nil) {
        print("Error recording movie11: \(error!.localizedDescription)")
    } else {
        newViewVideoPlayback()

        switchIcon.isHidden = true
        switchWhiteUI.isHidden = true
        switchCamButton.isHidden = true
        camWhiteLine.isHidden = true

        let videoRecorded = outputURL! as URL
        playerQueue = AVQueuePlayer(playerItem: AVPlayerItem(url: videoRecorded))
        self.playerQueue?.play()

        playerLayer = AVPlayerLayer(player: playerQueue)
        playerLayer.frame = (camPreview?.bounds)!
        playerLayer?.layoutIfNeeded()
        playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        camPreview?.layer.insertSublayer(playerLayer, above: previewLayer)

        playerItem1 = AVPlayerItem(url: videoRecorded)
        playerLooper = AVPlayerLooper(player: playerQueue, templateItem: playerItem1)

        if !captureSession.isRunning {
            DispatchQueue.global(qos: .background).async {
                self.startRunningCaptureSession()
            }
        }
    }
}
Update:
I have tried the code below, but it just keeps running forever and eventually crashes Xcode. I don't understand why, since at some point the AVPlayerItem's status should supposedly become .readyToPlay:
while playerItem1.status == .unknown {
    print("1111121232432431243123241432143243214324321")
    if playerItem1.status == .readyToPlay {
        playerQueue = AVQueuePlayer(playerItem: playerItem1)
        self.playerQueue?.play()
        playerLayer = AVPlayerLayer(player: playerQueue)
        playerLayer.frame = (camPreview?.bounds)!
        playerLayer?.layoutIfNeeded()
        playerLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        camPreview?.layer.insertSublayer(playerLayer, above: previewLayer)
    }
}
playerLooper = AVPlayerLooper(player: playerQueue, templateItem: playerItem1)
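(Side note on why this loop hangs: it busy-waits on the main thread, so the run loop never gets a chance to advance the item's status, and the condition can never change. A non-blocking alternative, sketched here as my own addition rather than part of the original question, is to observe the item's `status` with key-value observation and only attach the player layer once it becomes `.readyToPlay`. It assumes the same `playerItem1`, `playerQueue`, `playerLayer`, `camPreview` and `previewLayer` properties used above.)

```swift
import AVFoundation

// Keep a reference so the observation stays alive.
var statusObservation: NSKeyValueObservation?

func attachPlayerWhenReady() {
    statusObservation = playerItem1.observe(\.status, options: [.initial, .new]) { [weak self] item, _ in
        guard let self = self, item.status == .readyToPlay else { return }
        // The item is ready: build the player and layer now, on the main thread.
        self.playerQueue = AVQueuePlayer(playerItem: item)
        self.playerLayer = AVPlayerLayer(player: self.playerQueue)
        self.playerLayer.frame = (self.camPreview?.bounds)!
        self.playerLayer.videoGravity = .resizeAspectFill
        self.camPreview?.layer.insertSublayer(self.playerLayer, above: self.previewLayer)
        self.playerQueue?.play()
        self.statusObservation = nil // stop observing once attached
    }
}
```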
This can be done in the following way. Set up these properties in the view controller: one to store a thumbnail of the first frame when video recording begins, and a boolean flag that records whether the captured photo is the thumbnail or a regular photo taken by a single tap:
private var isSettingThumbnail = false
private var thumbImage: UIImage?
And change the implementation of the tap gesture recognizer's target:
@objc func normalTap(_ sender: UIGestureRecognizer) {
    //self.numForPic = numForPic + 1
    let settings = AVCapturePhotoSettings()
    isSettingThumbnail = false
    photoOutput?.capturePhoto(with: settings, delegate: self)
}
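(For this to work the session needs both a movie output and a photo output attached. The original post does not show that setup, so here is a hedged sketch of what it might look like; the property names are assumptions matching the ones used above.)

```swift
import AVFoundation

// Attach both outputs so a still photo can be captured while a movie is recording.
func configureOutputs(for session: AVCaptureSession) {
    let movieOutput = AVCaptureMovieFileOutput()
    let photoOutput = AVCapturePhotoOutput()
    session.beginConfiguration()
    if session.canAddOutput(movieOutput) {
        session.addOutput(movieOutput)
    }
    if session.canAddOutput(photoOutput) {
        session.addOutput(photoOutput)
    }
    session.commitConfiguration()
}
```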
You also need to remove the preview layer when recording stops; otherwise the captured thumbnail cannot be shown in the image view. The 10-second delay here is only there to verify that the thumbnail shows up. It is for debugging purposes and you can remove it later:
func stopRecording() {
    if thumbImage != nil {
        camPreview.image = thumbImage!
    }
    previewLayer?.removeFromSuperlayer()
    DispatchQueue.main.asyncAfter(deadline: .now() + 10.0, execute: {
        if self.movieOutput.isRecording == true {
            self.movieOutput.stopRecording()
        }
    })
}
Now add the AVCaptureFileOutputRecordingDelegate method that is called back when recording starts. This is where you set the boolean flag:
func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
    isSettingThumbnail = true
    photoOutput?.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
}
Now we check the boolean flag and store either the thumbnail or the regular image:
extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        if let imageData = photo.fileDataRepresentation() {
            if isSettingThumbnail {
                thumbImage = UIImage(data: imageData)
            } else {
                image = UIImage(data: imageData)
            }
        }
    }
}
I have tested the code and it works correctly. Also, the shutter sound does not play by default during recording, and we take the photo once recording has begun, so we are doing it as per the Apple guidelines, according to this Apple doc. This is one possible way of doing it. Also, please change the class of camPreview to UIImageView in the storyboard. You can check the working code on GitHub. Have a nice day; this was a very interesting question. :)
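(One more alternative, my own sketch rather than part of the tested code above: since the recorded file is already complete inside didFinishRecordingTo, the first frame can also be pulled straight out of it with AVAssetImageGenerator, avoiding the extra still capture entirely. The helper name is hypothetical.)

```swift
import AVFoundation
import UIKit

// Extract the first frame of a recorded video file as a UIImage.
func firstFrame(of url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true  // respect video orientation
    let time = CMTime(value: 0, timescale: 600)      // time zero
    guard let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

You would set `camPreview.image = firstFrame(of: videoRecorded)` at the top of didFinishRecordingTo, before creating the player, so the frame is visible while the AVPlayerItem loads.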