
How to retrieve photo from UITextView after adding it using NSAttributedString?

I am adding an image to a UITextView using the following code:

    UITextView *textView = [[UITextView alloc] initWithFrame:CGRectMake(200,200,140,140)];
    textView.font = [UIFont systemFontOfSize:20.0f];
    NSMutableAttributedString *attributedString = [[NSMutableAttributedString alloc] initWithString:@"Test  with emoji"];
    NSTextAttachment *textAttachment = [[NSTextAttachment alloc] init];
    textAttachment.image = [UIImage imageNamed:@"Angel.png"];

    //for the padding inside the textView
    textAttachment.image = [UIImage imageWithCGImage:textAttachment.image.CGImage scale:3.0 orientation:UIImageOrientationUp];
    NSAttributedString *attrStringWithImage = [NSAttributedString attributedStringWithAttachment:textAttachment];
    [attributedString replaceCharactersInRange:NSMakeRange(5, 1) withAttributedString:attrStringWithImage];
    [attributedString addAttribute:NSFontAttributeName value:[UIFont systemFontOfSize:17] range:NSMakeRange(0, attributedString.length)];
    textView.attributedText = attributedString;
    NSLog(@"Text view: %@", textView.attributedText);

    [self.view addSubview:textView];

The result looks like this:

What I am interested in is: how can I tell which image was inserted into the text view, and at what position? I was thinking of using attributedText, as you can see in the code, because it logs:

Text view: Test {
    NSFont = "<UICTFont: 0x7ff0324f2110> font-family: \".HelveticaNeueInterface-Regular\"; font-weight: normal; font-style: normal; font-size: 17.00pt";
}{
    NSAttachment = "<NSTextAttachment: 0x7ff032682bc0>";
    NSFont = "<UICTFont: 0x7ff0324f2110> font-family: \".HelveticaNeueInterface-Regular\"; font-weight: normal; font-style: normal; font-size: 17.00pt";
}with emoji{
    NSFont = "<UICTFont: 0x7ff0324f2110> font-family: \".HelveticaNeueInterface-Regular\"; font-weight: normal; font-style: normal; font-size: 17.00pt";
} 

Update

Code used to retrieve the images:

    NSMutableArray *imagesArray = [[NSMutableArray alloc] init];
    [attributedString enumerateAttribute:NSAttachmentAttributeName
                                 inRange:NSMakeRange(0, [attributedString length])
                                 options:0
                              usingBlock:^(id value, NSRange range, BOOL *stop)
    {
        if ([value isKindOfClass:[NSTextAttachment class]])
        {
            NSTextAttachment *attachment = (NSTextAttachment *)value;
            UIImage *image = nil;
            if ([attachment image])
                image = [attachment image];
            else
                image = [attachment imageForBounds:[attachment bounds]
                                     textContainer:nil
                                    characterIndex:range.location];

            if (image)
                [imagesArray addObject:image];
        }
    }];

But what if the attributedString contains more than one consecutive photo? Example:

Code

    NSMutableAttributedString *attributedString = [[NSMutableAttributedString alloc] initWithString:@"Test   with emoji "];
    [attributedString replaceCharactersInRange:NSMakeRange(4, 1) withAttributedString:attrStringWithImage];
    [attributedString replaceCharactersInRange:NSMakeRange(5, 1) withAttributedString:attrStringWithImage];

Log:

Image array: (
    "<UIImage: 0x7fd4e3e56760>"
)

Code

    NSMutableAttributedString *attributedString = [[NSMutableAttributedString alloc] initWithString:@"Test  with emoji "];
    [attributedString replaceCharactersInRange:NSMakeRange(4, 1) withAttributedString:attrStringWithImage];
    [attributedString replaceCharactersInRange:NSMakeRange(16, 1) withAttributedString:attrStringWithImage];

Log:

Image array: (
    "<UIImage: 0x7f9ce35a4a70>",
    "<UIImage: 0x7f9ce35a4a70>"
)

So, is there a mistake in what I am doing, or is there a bug in the enumerateAttribute method?

Update 2

I managed to solve this by creating a new textAttachment and attrStringWithImage instance for each photo I add.
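For illustration only, a minimal sketch of what that looks like (reusing the string and ranges from the snippets above); the key point is that each image gets its own NSTextAttachment instance, so enumerateAttribute: no longer merges two adjacent, identical attachments into a single range:

    // Sketch: one fresh NSTextAttachment (and attachment string) per inserted image.
    NSMutableAttributedString *attributedString =
        [[NSMutableAttributedString alloc] initWithString:@"Test   with emoji "];

    NSArray *rangesToReplace = @[[NSValue valueWithRange:NSMakeRange(4, 1)],
                                 [NSValue valueWithRange:NSMakeRange(5, 1)]];
    for (NSValue *rangeValue in rangesToReplace)
    {
        NSTextAttachment *textAttachment = [[NSTextAttachment alloc] init];
        textAttachment.image = [UIImage imageNamed:@"Angel.png"];

        // A distinct attachment object per image means distinct attribute runs,
        // even when the images themselves are identical and consecutive.
        NSAttributedString *attrStringWithImage =
            [NSAttributedString attributedStringWithAttachment:textAttachment];
        [attributedString replaceCharactersInRange:[rangeValue rangeValue]
                              withAttributedString:attrStringWithImage];
    }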

Explanation of retrieving the images

Your new problem is the case where two images are consecutive and identical.

So instead of:

    if (image)
        [imagesArray addObject:image];

you need to add another check. The following handles two identical images, but you still do not know whether they are actually consecutive:

    if (image)
    {
        if ([imagesArray lastObject] != image)
            [imagesArray addObject:image];
    }

So you also need to keep a reference to the NSRange:

    if (image)
    {
        if ([imagesArray count] > 0)
        {
            NSDictionary *lastFound = [imagesArray lastObject];
            NSRange lastRange = [lastFound[@"range"] rangeValue];
            UIImage *lastImage = lastFound[@"image"];
            if (lastImage == image && lastRange.location + lastRange.length == range.location)
            {
                // Two images are identical and consecutive
            }
            else
            {
                [imagesArray addObject:@{@"image": image, @"range": [NSValue valueWithRange:range]}];
            }
        }
        else
        {
            [imagesArray addObject:@{@"image": image, @"range": [NSValue valueWithRange:range]}];
        }
    }

To retrieve only the images:

    NSArray *onlyImages = [imagesArray valueForKey:@"image"];

Note: I have not checked whether this code compiles, but you should get the overall idea. My range calculation may be off (some +1/-1 missing, which is easy to verify by testing). Also, what if there is a space between two identical consecutive images? You may want to extract the string in between, e.g. NSString *stringBetween = [[attributedString string] substringWithRange:NSMakeRange(lastRange.location + lastRange.length, range.location - (lastRange.location + lastRange.length))], and check it for spaces, punctuation, and so on (there are many ways to do this).
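For example, a rough sketch of that in-between check, assuming stringBetween has already been extracted as above (which characters count as "ignorable" is just one possible choice):

    // Sketch: treat the gap between two attachments as ignorable if it contains
    // only whitespace and punctuation characters.
    NSMutableCharacterSet *ignorableSet =
        [[NSCharacterSet whitespaceAndNewlineCharacterSet] mutableCopy];
    [ignorableSet formUnionWithCharacterSet:[NSCharacterSet punctuationCharacterSet]];

    BOOL onlyIgnorableBetween =
        ([stringBetween rangeOfCharacterFromSet:[ignorableSet invertedSet]].location == NSNotFound);
    if (onlyIgnorableBetween)
    {
        // The two attachments are effectively consecutive for your purposes.
    }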

Additional note: In your case, simply comparing image != newImage is probably enough, but if you use images downloaded from the network, or even two copies of the same image under different names in your bundle, then deciding whether they are "the same" becomes a separate problem. There are several questions on SO about comparing two images, but that will cost some time/resources.
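As one rough example of such a comparison (a sketch only: comparing encoded data is simple, but it is not cheap and can report NO for images that look identical yet encode differently, e.g. with a different scale or orientation):

    // Sketch: compare two UIImages by their PNG representations.
    static BOOL ImagesAppearEqual(UIImage *first, UIImage *second)
    {
        if (first == second)
            return YES;
        if (first == nil || second == nil)
            return NO;
        NSData *firstData = UIImagePNGRepresentation(first);
        NSData *secondData = UIImagePNGRepresentation(second);
        return firstData != nil && [firstData isEqualToData:secondData];
    }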