Can I find subject from Spacy Dependency tree using NLTK in python?

I want to find the subject of a sentence using Spacy. The code below works fine and gives a dependency tree.

import spacy
from nltk import Tree

en_nlp = spacy.load('en_core_web_sm')  # the 'en' shortcut was removed in spaCy 3

doc = en_nlp("The quick brown fox jumps over the lazy dog.")

def to_nltk_tree(node):
    if node.n_lefts + node.n_rights > 0:
        return Tree(node.orth_, [to_nltk_tree(child) for child in node.children])
    else:
        return node.orth_


for sent in doc.sents:
    to_nltk_tree(sent.root).pretty_print()

From this dependency tree code, can I find the subject of the sentence?

I'm not sure whether you want to write code against the NLTK parse tree. However, spaCy already gives you this information directly: the subject carries the 'nsubj' label on the word.dep_ attribute.

import spacy

en_nlp = spacy.load('en_core_web_sm')

doc = en_nlp("The quick brown fox jumps over the lazy dog.")

sentence = next(doc.sents)
for word in sentence:
    print("%s:%s" % (word, word.dep_))
The:det
quick:amod
brown:amod
fox:nsubj
jumps:ROOT
over:prep
the:det
lazy:amod
dog:pobj

As a reminder, things can get more complicated when there is more than one clause.

>>> doc2 = en_nlp(u'When we study hard, we usually do well.')
>>> sentence2 = next(doc2.sents)
>>> for word in sentence2:
...     print("%s:%s" % (word, word.dep_))
... 
When:advmod
we:nsubj
study:advcl
hard:advmod
,:punct
we:nsubj
usually:advmod
do:ROOT
well:advmod
.:punct

Like leavesof3, I prefer to use spaCy for this purpose; it has better visualization.

The subject will be the word, or the phrase if you use noun chunks, whose dependency attribute is "nsubj" (nominal subject).

You can access the displaCy (spaCy visualization) demo here

Try this:

import spacy

nlp = spacy.load('en_core_web_sm')
sent = "I need to be able to log into the Equitable siteI tried my username and password from the AXA Equitable site which worked fine yesterday but it won't allow me to log in and when I try to change my password it says my answer is incorrect for the secret question I just need to be able to log into the Equitable site"
nlp_doc = nlp(sent)
subject = [tok for tok in nlp_doc if tok.dep_ == "nsubj"]
print(subject)