What activation function is used in the nce_loss?
I'm stuck on the nce_loss activation function in the word2vec model. I want to figure out which activation function it uses among all of those listed here:
These include smooth nonlinearities (sigmoid, tanh, elu, softplus,
and softsign), continuous but not everywhere differentiable functions
(relu, relu6, crelu and relu_x), and random regularization (dropout).
I searched for it in this function and elsewhere, but couldn't figure it out.
My guess is that it's something from the relu* family. Any hints?
None of them. It uses cross-entropy.
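For intuition, here is a minimal sketch (not the library's actual implementation) of an NCE-style objective: the raw logits for the true target and the negative (noise) samples are fed directly into a binary (sigmoid) cross-entropy, so no separate activation such as relu or tanh is applied first. The function name and shapes below are illustrative assumptions, not TensorFlow's API.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def nce_style_loss(true_logits, sampled_logits):
    """Toy NCE-style loss: sigmoid cross-entropy on raw logits.

    true_logits:    scores for the actual target words (label = 1)
    sampled_logits: scores for negative/noise samples  (label = 0)
    The logits go straight into the cross-entropy; no extra
    activation function is applied beforehand.
    """
    true_loss = -np.log(sigmoid(true_logits))               # -log P(label = 1)
    sampled_loss = -np.log(1.0 - sigmoid(sampled_logits))   # -log P(label = 0)
    return true_loss.sum() + sampled_loss.sum()

# Example: one positive score and a few negative-sample scores
print(nce_style_loss(np.array([2.0]), np.array([-1.0, 0.5, -2.0])))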