Xamarin OpenEars Native Binding Not working on Device but works on Simulator

I have been working on using the OpenEars v2.03 iOS framework in a Xamarin.iOS binding project. Let me explain what I have done so far. I am new to Xcode, Xamarin and all of this binding stuff, so this is going to be a long question; bear with me...

1) Built the OpenEars framework project in Xcode for the simulator, copied the "OpenEars" binary from Framework/OpenEars.framework/Versions/Current/ and renamed it "libOpenEars-i386.a".

Built the same library for the device (an iPhone 4s) by connecting it to the Mac and selecting my iPhone as the build target, then copied the generated OpenEars binary and renamed it "libOpenEars-armv7.a".

2) Combined the two files (libOpenEars-i386.a, libOpenEars-armv7.a) into a single fat library "libOpenEars.a" with the following lipo command.

lipo -create -output libOpenEars.a libOpenEars-i386.a libOpenEars-armv7.a 

3) Created a binding project in Xamarin Studio and added libOpenEars.a, which automatically generates a libOpenEars.linkwith.cs. Its contents are shown below.

using System;
using ObjCRuntime;

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = true, ForceLoad = true, Frameworks="AudioToolbox AVFoundation", IsCxx=true, LinkerFlags = "-lstdc++")]

I have also tried changing the linker flags to LinkerFlags = "-lstdc++ -lc++ -ObjC" and setting SmartLink = false.
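
For reference, this is roughly what the modified libOpenEars.linkwith.cs looked like for that attempt (just a sketch of the variant I tried, not the final working attribute):

using System;
using ObjCRuntime;

// Variant with SmartLink disabled and the extra linker flags
[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = false, ForceLoad = true, Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++ -lc++ -ObjC")]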

4) My ApiDefinition file contains all of the OpenEars interfaces; I am including only one of them here.

[BaseType(typeof(NSObject))]
[Protocol]
interface OEEventsObserver
{
    [Wrap ("WeakDelegate")]
    OEEventsObserverDelegate Delegate { get; set; }

    [Export ("delegate", ArgumentSemantic.Assign), NullAllowed]
    NSObject WeakDelegate { get; set; }
}
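
The delegate type referenced by Wrap ("WeakDelegate") is bound in the same ApiDefinition as a model. A minimal sketch of that binding is below; only two callbacks are shown, and the selector names are assumed from the OpenEars 2.x OEEventsObserver.h header, so treat it as illustrative rather than a copy of my actual file:

[BaseType (typeof (NSObject))]
[Model]
[Protocol]
interface OEEventsObserverDelegate
{
    // Selector names assumed from OEEventsObserver.h in OpenEars 2.x
    [Export ("pocketsphinxDidStartListening")]
    void PocketsphinxDidStartListening ();

    [Export ("pocketsphinxDidReceiveHypothesis:recognitionScore:utteranceID:")]
    void PocketsphinxDidReceiveHypothesis (NSString hypothesis, NSString recognitionScore, NSString utteranceID);
}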

5) Referenced OpenEars.dll in my iOS sample project.

6) Added the language model and the acoustic model to the binding library itself. (This is not required when using dynamic language model generation, but I reused the old OpenEars sample project from the OpenEars Xamarin git repository; I did not use the new dynamic language model generator, I only modified the sample for the latest API changes.)
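
For completeness, switching to the new dynamic generator would look roughly like the sketch below. The C# names here (OELanguageModelGenerator, GenerateLanguageModelFromArray, OEAcousticModel.PathToModel and the two path helpers) are hypothetical, they depend on how the generator class is bound in the ApiDefinition, and simply mirror the native generateLanguageModelFromArray:withFilesNamed:forAcousticModelAtPath: selector:

// Hypothetical binding names; adjust to match your own ApiDefinition.
// This would live somewhere like the view controller's init ().
var generator = new OELanguageModelGenerator ();
var words = new [] { "HELLO", "START", "STOP" };

NSError error = generator.GenerateLanguageModelFromArray (
    words,
    "OpenEarsDynamicModel",
    OEAcousticModel.PathToModel ("AcousticModelEnglish"));

if (error == null) {
    pathToLanguageModel = generator.PathToSuccessfullyGeneratedLanguageModel ("OpenEarsDynamicModel");
    pathToDictionary = generator.PathToSuccessfullyGeneratedDictionary ("OpenEarsDynamicModel");
}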

View controller:

public partial class OpenEarsNewApiViewController : UIViewController
{
    OEEventsObserver observer;
    OEFliteController fliteController;
    OEPocketsphinxController pocketSphinxController;


    String pathToLanguageModel;
    String pathToDictionary;
    String pathToAcousticModel;

    String firstVoiceToUse;
    String secondVoiceToUse;

    static bool UserInterfaceIdiomIsPhone {
        get { return UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone; }
    }

    public void init()
    {
        try
        {
            observer = new OEEventsObserver();
            observer.Delegate = new OpenEarsEventsObserverDelegate (this);
            pocketSphinxController = new OEPocketsphinxController ();

            fliteController = new OEFliteController();

            firstVoiceToUse = "cmu_us_slt";
            secondVoiceToUse = "cmu_us_rms";

            pathToLanguageModel = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.languagemodel";
            pathToDictionary = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.dic";
            pathToAcousticModel = NSBundle.MainBundle.ResourcePath;
        }
        catch(Exception e) {
            Console.WriteLine ("Exception Message :"+e.Message);
            Console.WriteLine ("Inner Exception Mesage :"+e.InnerException.Message);
        }

    }

    public OpenEarsNewApiViewController (IntPtr handle) : base (handle)
    {
        init ();
    }

    #region Update

    public void UpdateStatus (String text)
    {
        txtStatus.Text = text;
    }

    public void UpdateText (String text)
    {
        txtOutput.Text = text;
    }

    public void UpdateButtonStates (bool hidden1, bool hidden2, bool hidden3, bool hidden4)
    {
        btnStartListening.Hidden = hidden1;
        btnStopListening.Hidden = hidden2;
        btnSuspend.Hidden = hidden3;
        btnResume.Hidden = hidden4;
    }

    public void Say (String text)
    {
        //fliteController.SaywithVoice (text, secondVoiceToUse);
    }

    public void StartListening ()
    {
        //pocketSphinxController.RequestMicPermission ();
        if (!pocketSphinxController.IsListening) {

            //NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"DMP"];


            pocketSphinxController.StartListeningWithLanguageModelAtPath (
                pathToLanguageModel,
                pathToDictionary,
                pathToAcousticModel,
                false
            );
        } else {
            new UIAlertView ("Notify !!","Already Listening",null,"OK","Stop").Show();

        }

    }

    public void StopListening ()
    {
        //pocketSphinxController.StopListening ();
    }

    public void SuspendRecognition ()
    {
        pocketSphinxController.SuspendRecognition ();
    }

    public void ResumeRecognition ()
    {
        pocketSphinxController.ResumeRecognition ();
    }

    #endregion

    #region Event Handlers

    partial void btnStartListening_TouchUpInside (UIButton sender)
    {
        try
        {
            StartListening();
            //fliteController.Init();
            //Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
            //fliteController.Say("Hai", new OEFliteVoice());

            UpdateButtonStates (true, false, false, true);
            Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
        }
        catch(Exception e)
        {
            Console.WriteLine(e.Message);
        }
    }

    partial void btnStopListening_TouchUpInside (UIButton sender)
    {
        StopListening ();
        UpdateButtonStates (false, true, true, true);
    }

    partial void btnSuspend_TouchUpInside (UIButton sender)
    {
        SuspendRecognition ();
        UpdateButtonStates (true, false, true, false);
    }

    partial void btnResume_TouchUpInside (UIButton sender)
    {
        ResumeRecognition ();
        UpdateButtonStates (true, false, false, true);
    }
}

OpenEarsEventsObserverDelegate:

// Nothing much here, just status updates for checking state and debugging

public class OpenEarsEventsObserverDelegate:OEEventsObserverDelegate
{
    OpenEarsNewApiViewController _controller;

    public OpenEarsNewApiViewController controller {
        get {
            return _controller;
        }
        set {
            _controller = value;
        }
    }

    public OpenEarsEventsObserverDelegate (OpenEarsNewApiViewController ctrl)
    {
        controller = ctrl;
    }

    public override void PocketsphinxRecognitionLoopDidStart()
    {
        //base.PocketsphinxRecognitionLoopDidStart();

        Console.WriteLine ("Pocketsphinx is starting up");
        controller.UpdateStatus ("Pocketsphinx is starting up");
    }

    public override void PocketsphinxDidReceiveHypothesis (Foundation.NSString hypothesis, Foundation.NSString recognitionScore, Foundation.NSString utteranceID)
    {
        controller.UpdateText ("Heard: " + hypothesis);
        controller.Say ("You said: " + hypothesis);
    }

    public override void PocketSphinxContinuousSetupDidFail ()
    {

    }

    public override void PocketsphinxDidCompleteCalibration ()
    {
        Console.WriteLine ("Pocket calibration is complete");
        controller.UpdateStatus ("Pocket calibration is complete");
    }

    public override void PocketsphinxDidDetectSpeech ()
    {

    }

    public override void PocketsphinxDidStartListening ()
    {
        Console.WriteLine ("Pocketsphinx is now listening");
        controller.UpdateStatus ("Pocketsphinx is now listening");
        controller.UpdateButtonStates (true, false, false, true);
    }

    public override void PocketsphinxDidStopListening ()
    {

    }

    public override void PocketsphinxDidStartCalibration ()
    {
        Console.WriteLine ("Pocketsphinx calibration has started.");
        controller.UpdateStatus ("Pocketsphinx calibration has started");
    }

    public override void PocketsphinxDidResumeRecognition ()
    {

    }

    public override void PocketsphinxDidSuspendRecognition ()
    {

    }

    public override void PocketsphinxDidDetectFinishedSpeech ()
    {

    }

    public override void FliteDidStartSpeaking ()
    {

    }

    public override void FliteDidFinishSpeaking ()
    {

    }
}

This works perfectly on the iOS simulator but not on a real device.

I get the following error message when running on the device, and I get the same message for every bound interface.

Exception Message :Wrapper type 'OpenEars.OEEventsObserver' is missing its native ObjectiveC class 'OEEventsObserver'.

2015-05-15 12:55:26.996 OpenEarsNewApi[1359:231264] Unhandled managed  exception: Exception has been thrown by the target of an invocation.  (System.Reflection.TargetInvocationException)
at System.Reflection.MonoCMethod.InternalInvoke (System.Object obj,   System.Object[] parameters) [0x00016] in   /Developer/MonoTouch/Source/mono/mcs/class/corlib/System.Reflection/MonoMethod.cs:543 

Am I missing anything related to binding for the device?
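
One thing worth checking on the device build (a small diagnostic sketch, not something from the original sample): ObjCRuntime.Class.GetHandle returns IntPtr.Zero when the Objective-C runtime has no class registered under that name, which is what the "missing its native ObjectiveC class" error suggests:

using System;
using ObjCRuntime;

// Returns IntPtr.Zero if the OEEventsObserver class was stripped or never linked
// into the app binary for the current architecture; run it early, e.g. in FinishedLaunching.
var handle = Class.GetHandle ("OEEventsObserver");
Console.WriteLine ("OEEventsObserver native class handle: {0}", handle);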

I also tried building the same .dll with a Makefile, but I get the same error message.

Building the OpenEars framework:

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch i386 -configuration Release clean build

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release clean build

Makefile for generating OpenEars.dll:

BTOUCH=/Developer/MonoTouch/usr/bin/btouch-native

all: OpenEars.dll


OpenEars.dll: AssemblyInfo.cs OpenEars.cs libOpenEars.a
	$(BTOUCH) -unsafe --new-style -out:$@ OpenEars.cs -x=AssemblyInfo.cs --link-with=libOpenEars.a,libOpenEars.a

clean:
	-rm -f *.dll

See the full mtouch error log here.

$ lipo -info libOpenEars.a

Architectures in the fat file: libOpenEars.a are: i386 armv7 

Checking with $ nm -arch armv7 libOpenEars.a:

nm command output here

Checking whether OEEvent exists for the simulator (i386):

$ nm -arch i386 libOpenEars.a | grep OEEvent

Output:

U _OBJC_CLASS_$_OEEventsObserver
00006aa0 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000076f0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00002174 S _OBJC_CLASS_$_OEEventsObserver
00002170 S _OBJC_IVAR_$_OEEventsObserver._delegate
00002188 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002d90 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000035a0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

Checking whether OEEvent exists for armv7:

$ nm -arch armv7 libOpenEars.a | grep OEEvent

Output:

 U _OBJC_CLASS_$_OEEventsObserver
00005680 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000062d8 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning:    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00001cb4 S _OBJC_CLASS_$_OEEventsObserver
00001cb0 S _OBJC_IVAR_$_OEEventsObserver._delegate
00001cc8 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002638 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
00002e50 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

I am not sure what I am missing. And yes, there are plenty of grammar mistakes; thank you for taking the time to read through all of this.

Thanks to @poupou and @Halle for the valuable inputs. In the end I built the fat binary with all architectures, including arm64 and x86_64 (a must), and combined them into a single package with lipo. Now everything works like a charm! I also set Project Options -> Advanced -> Supported Architectures -> ARMv7 to run on devices such as the iPad 2 and iPhone 4. I still need to test on the iPhone 6 and 6 Plus, which I hope will also work since they belong to the arm64 family. I am not sure how this behaves on ARMv7s devices (iPhone 5, iPhone 5c, iPad 4); I do not see ARMv7s support in OpenEars v2.03.
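
For anyone hitting the same problem, the LinkWith attribute presumably also needs to list the extra targets once the fat library contains them; a sketch of the updated attribute, assuming the same binding project layout as above:

using System;
using ObjCRuntime;

// LinkWith covering all four slices packed into the fat libOpenEars.a
[assembly: LinkWith ("libOpenEars.a",
    LinkTarget.ArmV7 | LinkTarget.Arm64 | LinkTarget.Simulator | LinkTarget.Simulator64,
    SmartLink = true, ForceLoad = true,
    Frameworks = "AudioToolbox AVFoundation",
    IsCxx = true, LinkerFlags = "-lstdc++")]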