
What is #tfop in Swift Tensorflow and where is it defined?

I was browsing the Swift for TensorFlow code and stumbled upon:

var result = #tfop("Mul", a, b)

#tfop is nicely explained in the documentation here, in the sense of 'what it does', but I am also interested in what it actually is from the language's point of view, or how it is implemented as a function.

What does #tfop stand for, beyond being a handle into the computation graph? Why the '#'? And where can I find the tfop implementation, if there is one? (I browsed the code but had no luck, though I can't guarantee I didn't miss anything.)

Chris Lattner:

#tfop is a “well known” representation used for tensor operations. It is an internal implementation detail of our stack that isn’t meant to be user visible, and is likely to change over time.
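To make that concrete, here is a rough sketch, assuming the tensorflow-branch toolchain and its Tensor API of that era, of how library code could forward a helper to the raw "Mul" op through #tfop (rawMultiply is a hypothetical name; this will not compile with a stock Swift toolchain):

import TensorFlow

// Hedged sketch: there is no Swift function body anywhere that implements #tfop;
// the compiler itself gives the literal its meaning, which is why grepping the
// library for a definition turns up nothing.
@inline(__always)
func rawMultiply(_ lhs: Tensor<Float>, _ rhs: Tensor<Float>) -> Tensor<Float> {
    return #tfop("Mul", lhs, rhs)
}

let a = Tensor<Float>([1, 2, 3])
let b = Tensor<Float>([4, 5, 6])
print(rawMultiply(a, b))    // [4.0, 10.0, 18.0]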

In Swift, "#foo(bar: 42)" is the general syntax used for "macro like" and "compiler magic" operations. For example, C things like __FILE__ are spelled as #file in Swift: https://github.com/apple/swift-evolution/blob/master/proposals/0034-disambiguating-line.md
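A short, standard-Swift illustration of those '#' compiler-magic literals (logContext is just an example name; nothing here needs the TensorFlow branch):

func logContext(_ message: String,
                file: StaticString = #file,
                line: UInt = #line,
                function: String = #function) {
    // #file, #line and #function are filled in by the compiler at the call site,
    // much like C's __FILE__, __LINE__ and __func__.
    print("\(file):\(line) \(function): \(message)")
}

logContext("hello")    // prints something like: main.swift:10 logContext(_:file:line:function:): hello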

And the "#line 42" syntax used by the C preprocessor is represented with arguments like this: #sourceLocation(file: "foo", line: 42)
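The Swift counterpart of that C preprocessor directive looks like this (again standard Swift, independent of the TensorFlow branch):

// #sourceLocation overrides the file and line that diagnostics and the
// #file/#line literals report for the code that follows it.
#sourceLocation(file: "generated.swift", line: 100)
print(#file, #line)    // reports the overridden location
#sourceLocation()      // resets to the actual file and line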

In the case of #tfop specifically, this is represented in the Swift AST as an ObjectLiteralExpr, which is the normal AST node for this sort of thing: https://github.com/google/swift/blob/tensorflow/include/swift/AST/Expr.h#L1097

We use special lowering magic in SILGen to turn it into SIL builtin instructions, which are prefixed with "__tfop_": https://github.com/google/swift/blob/tensorflow/lib/SILGen/SILGenExpr.cpp#L3009

I’d like to move away from using builtin instructions for this, and introduce a first-class sil instruction instead, that’s tracked by: https://github.com/google/swift/issues/16

These instructions are specially recognized by the partitioning pass of GPE (Graph Program Extraction): https://github.com/google/swift/blob/tensorflow/lib/SILOptimizer/Mandatory/TFUtilities.cpp#L715

Source: here