AVCaptureSession Image Is Size Of Screen
I'm creating a camera using AVCaptureSession and trying to take a photo with it. Here is the code that loads the camera...
    func reloadCamera() {
        cameraView.backgroundColor = UIColor.clearColor()
        captureSession = AVCaptureSession()
        captureSession!.sessionPreset = AVCaptureSessionPresetHigh

        let backCamera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

        if (camera == false) {
            let videoDevices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
            for device in videoDevices {
                let device = device as! AVCaptureDevice
                if device.position == AVCaptureDevicePosition.Front {
                    captureDevice = device
                    break
                } else {
                    captureDevice = backCamera
                }
            }
        } else {
            captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        }

        do {
            let input = try AVCaptureDeviceInput(device: captureDevice)
            if captureSession!.canAddInput(input) {
                captureSession!.addInput(input)

                stillImageOutput = AVCaptureStillImageOutput()
                stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

                if captureSession!.canAddOutput(stillImageOutput) {
                    captureSession!.addOutput(stillImageOutput)

                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
                    previewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.Portrait
                    previewLayer?.frame = cameraView.bounds
                    cameraView.layer.addSublayer(previewLayer!)

                    captureSession!.startRunning()
                }
            }
        } catch let error as NSError {
            // Handle any errors
            print(error)
        }
    }
This is how I take the photo...
    func didPressTakePhoto() {
        toggleFlash()
        if let videoConnection = stillImageOutput?.connectionWithMediaType(AVMediaTypeVideo) {
            videoConnection.videoOrientation = (previewLayer?.connection.videoOrientation)!
            stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {
                (sampleBuffer, error) in
                if sampleBuffer != nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    self.capturedImage = UIImage(data: imageData)

                    if self.camera == true {
                        self.capturedImage = UIImage(CGImage: self.capturedImage.CGImage!, scale: 1.0, orientation: UIImageOrientation.Right)
                    } else {
                        self.capturedImage = UIImage(CGImage: self.capturedImage.CGImage!, scale: 1.0, orientation: UIImageOrientation.LeftMirrored)
                    }

                    self.tempImageView.image = self.capturedImage
                    UIImageWriteToSavedPhotosAlbum(self.capturedImage, nil, nil, nil)

                    self.tempImageView.hidden = false
                    self.goButton.hidden = false
                    self.cameraView.hidden = true
                    self.removeImageButton.hidden = false
                    self.captureButton.hidden = true
                    self.flashChanger.hidden = true
                    self.switchCameraButton.hidden = true
                }
            })
        }
    }
But what's happening is that the photo that is taken is as big as the whole screen (like Snapchat), when I only want it to be as big as the UIView I'm capturing it from. Let me know if you need more information. Thank you!
First, you are the one setting the capture session's preset to AVCaptureSessionPresetHigh. If you don't need that, don't do it; use a smaller-size preset. For example, use AVCaptureSessionPreset640x480 to get a smaller size.
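That first suggestion is a one-line change in the `reloadCamera()` code above (AVCaptureSessionPreset640x480 is one of the standard session presets in this same AVFoundation API generation):

```swift
// Request 640x480 capture instead of the device's highest-quality format.
captureSession!.sessionPreset = AVCaptureSessionPreset640x480
```

Note that `canAddInput`/`canAddOutput` behavior is unaffected; the preset only changes the resolution of the frames the session delivers.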
Second, no matter what size the resulting photo is, it is entirely up to you to scale it down to the size you need and display it at that size. Ultimately, it's simply a matter of the image view's size and content mode, though it would be good to reduce the image size as well, so as not to waste memory by holding an image much larger than what you actually display to the user.
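A minimal sketch of that second suggestion, in the same Swift 2 style as the question: redraw the captured image into a context the size of the target view. The helper name `scaleImage` and the choice of passing the image view's bounds are illustrative, not part of the original code.

```swift
// Redraws `image` at `targetSize` points and returns the smaller copy.
// Passing 0.0 for scale makes the context use the screen's scale factor.
func scaleImage(image: UIImage, toSize targetSize: CGSize) -> UIImage {
    UIGraphicsBeginImageContextWithOptions(targetSize, false, 0.0)
    image.drawInRect(CGRect(origin: CGPoint.zero, size: targetSize))
    let scaled = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return scaled
}
```

In `didPressTakePhoto()` you would then display the reduced copy, e.g. `self.tempImageView.image = scaleImage(self.capturedImage, toSize: self.tempImageView.bounds.size)`, instead of assigning the full-size photo directly.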