AudioGraph.CreateFileInputNodeAsync resulting in FormatNotSupported with wav and AudioNodeEmitter
I'm trying out the new spatial audio API in AudioGraph 1.1. I can get sound out of a file without an emitter, but as soon as I add the emitter to my node creation, the call suddenly returns FormatNotSupported. I haven't been able to find anything useful by searching, probably because the API is so new. Can anyone see what I'm doing wrong or what I'm missing? Here is my code:
private async void MainPage_Loaded(object sender, RoutedEventArgs args)
{
    AudioGraphSettings settings = new AudioGraphSettings(AudioRenderCategory.Media);
    var devices = await DeviceInformation.FindAllAsync();
    CreateAudioGraphResult result = await AudioGraph.CreateAsync(settings);
    if (result.Status != AudioGraphCreationStatus.Success)
    {
        return;
    }
    graph = result.Graph;

    FileOpenPicker saveFilePicker = new FileOpenPicker();
    saveFilePicker.FileTypeFilter.Add(".wav");
    saveFilePicker.FileTypeFilter.Add(".wma");
    saveFilePicker.FileTypeFilter.Add(".mp3");
    StorageFile file = await saveFilePicker.PickSingleFileAsync();
    if (file == null)
    {
        return;
    }

    AudioNodeEmitter emitter = new AudioNodeEmitter(
        AudioNodeEmitterShape.CreateOmnidirectional(),
        AudioNodeEmitterDecayModel.CreateNatural(.1, 1, 10, 100),
        AudioNodeEmitterSettings.None);
    emitter.Position = new Vector3(10, 0, 5);

    CreateAudioDeviceOutputNodeResult deviceOutputNodeResult = await graph.CreateDeviceOutputNodeAsync();
    var outputNode = deviceOutputNodeResult.DeviceOutputNode;
    CreateAudioFileInputNodeResult fileInputNodeResult = await graph.CreateFileInputNodeAsync(file, emitter);
    inputNode = fileInputNodeResult.FileInputNode;
    inputNode.AddOutgoingConnection(outputNode);
    graph.Start();
}
There is nothing wrong with your code. The problem is:
Audio node emitters can only process audio that is formatted in mono with a sample rate of 48kHz. Attempting to use stereo audio or audio with a different sample rate will result in an exception.
You can refer to the Remarks section of the Spatial audio documentation.
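If you want to confirm that this is what is happening, a minimal sketch (assuming the same file StorageFile picked in your code and the Windows.Media.MediaProperties namespace) is to read the file's encoding properties before creating the input node:

// Sketch only: check whether the picked file already meets the emitter requirements.
MediaEncodingProfile profile = await MediaEncodingProfile.CreateFromFileAsync(file);
uint channels = profile.Audio.ChannelCount;
uint sampleRate = profile.Audio.SampleRate;
System.Diagnostics.Debug.WriteLine($"Channels: {channels}, Sample rate: {sampleRate} Hz");
if (channels != 1 || sampleRate != 48000)
{
    // This file will trigger FormatNotSupported when passed to
    // CreateFileInputNodeAsync together with an AudioNodeEmitter.
}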
To test this API, you can download the audio here.
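If you would rather keep using your own file, one option is to transcode it to mono, 48 kHz first. The sketch below assumes the Windows.Media.Transcoding and Windows.Storage namespaces; the temporary file name mono48k.wav is just an illustrative choice, not part of the original question:

// Sketch only: convert an arbitrary source file to mono 48 kHz WAV so that it can
// be used with an AudioNodeEmitter.
MediaEncodingProfile wavProfile = MediaEncodingProfile.CreateWav(AudioEncodingQuality.High);
wavProfile.Audio.ChannelCount = 1;   // emitters require mono audio
wavProfile.Audio.SampleRate = 48000; // emitters require a 48 kHz sample rate

StorageFile convertedFile = await ApplicationData.Current.TemporaryFolder.CreateFileAsync(
    "mono48k.wav", CreationCollisionOption.ReplaceExisting);

MediaTranscoder transcoder = new MediaTranscoder();
PrepareTranscodeResult prepared = await transcoder.PrepareFileTranscodeAsync(file, convertedFile, wavProfile);
if (prepared.CanTranscode)
{
    await prepared.TranscodeAsync();
    // Now create the input node from the converted file instead of the original one:
    // await graph.CreateFileInputNodeAsync(convertedFile, emitter);
}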