What are the state-of-the-art algorithms for resolving word polysemy/homonymy?

I tried to resolve word polysemy (assigning the correct WordNet synset to a polysemous word in a text) with a word2vec-like neural network, but the results were too poor. What other state-of-the-art algorithms are there for resolving word polysemy/homonymy? Could you point me to some articles?
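For context, the task described (picking a WordNet synset for an ambiguous word in context) is classic word sense disambiguation, and a simple dictionary-overlap baseline like Lesk is often used as a point of comparison. A minimal sketch with NLTK; the sentence and target word are just illustrative:

```python
import nltk
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk

# One-time downloads of the WordNet data (no-op if already present).
nltk.download("wordnet", quiet=True)
nltk.download("omw-1.4", quiet=True)

sentence = "I went to the bank to deposit my paycheck"
context = sentence.lower().split()

# lesk() returns the WordNet synset whose gloss overlaps most with the context.
synset = lesk(context, "bank", pos=wn.NOUN)
print(synset, "-", synset.definition())
```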

You can start with spaCy's implementation of sense2vec. It is based on the original sense2vec paper. From the abstract:

This paper presents a novel approach which addresses these concerns by modeling multiple embeddings for each word based on supervised disambiguation, which provides a fast and accurate way for a consuming NLP model to select a sense-disambiguated embedding. We demonstrate that these embeddings can disambiguate both contrastive senses such as nominal and verbal senses as well as nuanced senses such as sarcasm.
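A minimal sketch of querying sense2vec vectors with the standalone sense2vec package, assuming you have downloaded one of the pretrained vector archives from the project page (the path and query key below are placeholders):

```python
from sense2vec import Sense2Vec

# Load pretrained sense2vec vectors from disk (placeholder path;
# the archives are distributed on the explosion/sense2vec repository).
s2v = Sense2Vec().from_disk("/path/to/s2v_reasonably_sized")

# Keys pair a lemma or phrase with a coarse sense tag (POS or entity label),
# so "duck|NOUN" and "duck|VERB" get separate, sense-specific embeddings.
query = "duck|NOUN"
if query in s2v:
    vector = s2v[query]         # the sense-specific embedding
    freq = s2v.get_freq(query)  # corpus frequency of this particular sense
    print(query, freq, vector[:5])
    # Nearest neighbours are also computed per sense, not per surface form.
    for key, score in s2v.most_similar(query, n=5):
        print(key, round(score, 3))
```

The key idea is that each (word, sense-tag) pair gets its own vector, so the consuming model can select a sense-disambiguated embedding instead of a single conflated one.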

On this page you can find NLP state-of-the-art publications and rankings, in particular for word sense disambiguation (WSD SOTA). You might be interested in supWSDemb and UKB, which are respectively the supervised and unsupervised SOTA at the time of writing.