How to show the local stream in a video view in the latest WebRTC framework <Anakros/WebRTC>? - for webrtc framework (iOS)
After updating to the latest WebRTC framework, I don't know how to show the local stream to the user, because the methods have changed and there is no example in the "iOS" folder of the repository.
The old code...
RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];
Here the RTCVideoCapturer object and the RTCVideoSource object are linked to each other.
But in the new code...
RTCVideoSource *source = [_factory videoSource];
RTCCameraVideoCapturer *capturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
[_delegate appClient:self didCreateLocalCapturer:capturer];
localVideoTrack = [_factory videoTrackWithSource:source
                                         trackId:kARDVideoTrackId];
they are not linked to each other. So I don't understand what the delegate method
[_delegate appClient:self didCreateLocalCapturer:capturer];
does. [Help needed!]
Implement this delegate method in your video-call view controller:
- (void)appClient:(ARDAppClient *)client didCreateLocalCapturer:(RTCCameraVideoCapturer *)localCapturer {
  NSLog(@"%s %@", __PRETTY_FUNCTION__, localCapturer);
  _captureController =
      [[ARDCaptureController alloc] initWithCapturer:localCapturer
                                            settings:[[ARDSettingsModel alloc] init]];
  [_captureController startCapture];
}
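To clarify: the capturer and the source are still linked, just indirectly. The RTCVideoSource is passed as the capturer's delegate, so every captured frame is delivered to the source; ARDCaptureController only picks a device and format and starts the session. A minimal sketch of what its -startCapture roughly does (the method name startCaptureWithCapturer: and the simplified device/format selection are my own illustration, not framework API):

```objectivec
// Sketch: start capture on an RTCCameraVideoCapturer.
// Frames flow: camera -> capturer -> its delegate (the RTCVideoSource).
- (void)startCaptureWithCapturer:(RTCCameraVideoCapturer *)capturer {
  // Pick the front camera (simplified; production code should handle nil).
  AVCaptureDevice *device = nil;
  for (AVCaptureDevice *d in [RTCCameraVideoCapturer captureDevices]) {
    if (d.position == AVCaptureDevicePositionFront) {
      device = d;
      break;
    }
  }
  // Pick the first supported format and its maximum frame rate (simplified;
  // ARDCaptureController consults the settings model instead).
  AVCaptureDeviceFormat *format =
      [RTCCameraVideoCapturer supportedFormatsForDevice:device].firstObject;
  NSInteger fps =
      (NSInteger)format.videoSupportedFrameRateRanges.firstObject.maxFrameRate;
  [capturer startCaptureWithDevice:device format:format fps:fps];
}
```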
Then... this delegate is invoked from the method that creates the local video track:
- (RTCVideoTrack *)createLocalVideoTrack {
  RTCVideoTrack *localVideoTrack = nil;
  // The iOS simulator doesn't provide any sort of camera capture,
  // so don't try to open a local stream there.
#if !TARGET_IPHONE_SIMULATOR
  if (![_settings currentAudioOnlySettingFromStore]) {
    RTCVideoSource *source = [_factory videoSource];
    RTCCameraVideoCapturer *capturer =
        [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
    [_delegate appClient:self didCreateLocalCapturer:capturer];
    localVideoTrack = [_factory videoTrackWithSource:source
                                             trackId:kARDVideoTrackId];
    [_delegate appClient:self didReceiveLocalVideoTrack:localVideoTrack];
  }
#endif
  return localVideoTrack;
}
which is called like this...
_localVideoTrack = [self createLocalVideoTrack];
from your initialization method:
- (void)initCall {
  NSLog(@"%s", __PRETTY_FUNCTION__);
  if (!_isTurnComplete) {
    return;
  }
  self.state = kARDAppClientStateConnected;
  _localVideoTrack = [self createLocalVideoTrack];
  // Create peer connection.
  _constraints = [self defaultPeerConnectionConstraints];
}
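Finally, to actually show the local stream (the question in the title), attach the track to a video view when the didReceiveLocalVideoTrack: callback fires. A sketch, assuming your view controller has an _localVideoView ivar holding an RTCEAGLVideoView that is already in the view hierarchy, plus an _localVideoTrack ivar:

```objectivec
// Render the local video track in a video view.
// Assumes _localVideoView (RTCEAGLVideoView, already added to the view
// hierarchy) and _localVideoTrack ivars exist in this view controller.
- (void)appClient:(ARDAppClient *)client
    didReceiveLocalVideoTrack:(RTCVideoTrack *)localVideoTrack {
  if (_localVideoTrack == localVideoTrack) {
    return;
  }
  // Detach the old track, if any, before swapping in the new one.
  [_localVideoTrack removeRenderer:_localVideoView];
  _localVideoTrack = localVideoTrack;
  // RTCEAGLVideoView implements RTCVideoRenderer; once the track is
  // attached as a renderer, frames are displayed automatically.
  [_localVideoTrack addRenderer:_localVideoView];
}
```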
I was able to implement it with this code!