NLTK Word Tokenize doesn't return anything

I'm trying to tokenize a sentence. I believe the code is correct, but there is no output. What could be the problem? Here is the code.

import nltk
from nltk.tokenize import word_tokenize
text = word_tokenize("And now for something completely different")
nltk.pos_tag(text)

text = word_tokenize("They refuse to permit us to obtain the refuse permit")
nltk.pos_tag(text)

It looks like the following data packages are missing:

  1. punkt
  2. averaged_perceptron_tagger

Note: they only need to be downloaded the first time.

Try this:

import nltk

nltk.download('punkt')
nltk.download('averaged_perceptron_tagger')

from nltk.tokenize import word_tokenize
text = word_tokenize("And now for something completely different")
print(nltk.pos_tag(text))

text = word_tokenize("They refuse to permit us to obtain the refuse permit")
print(nltk.pos_tag(text))

print("----End of execution----")

Try running this in your IDE rather than pasting it line by line; in a script (unlike the interactive REPL) nothing is shown unless you call print().
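Since the data only has to be downloaded once, a small variant is to check whether each resource is already installed before calling nltk.download (nltk.data.find raises LookupError when a resource is missing). This is a sketch, not part of the original answer, and the helper name ensure_nltk_data is made up for illustration:

```python
import nltk

def ensure_nltk_data(resource, path):
    """Download an NLTK data package only if it is not already installed.

    resource: name passed to nltk.download, e.g. "punkt"
    path: lookup path used by nltk.data.find, e.g. "tokenizers/punkt"
    Returns True if a download was triggered, False if already present.
    """
    try:
        nltk.data.find(path)
        return False  # data already on disk, nothing to do
    except LookupError:
        nltk.download(resource, quiet=True)
        return True

# The two packages the answer above downloads unconditionally:
ensure_nltk_data("punkt", "tokenizers/punkt")
ensure_nltk_data("averaged_perceptron_tagger",
                 "taggers/averaged_perceptron_tagger")
```

This avoids the (harmless but noisy) "already up-to-date" messages on every subsequent run.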