fast.ai equivalent in tensorflow
Is there any equivalent/alternative library to fastai in tensorflow for training and debugging deep learning models more easily, including analysing the results of trained models in Tensorflow?

Fastai is built on top of pytorch; I'm looking for something similar in tensorflow.
The obvious choice is to use tf.keras.

It is bundled with tensorflow and is becoming its official "high-level" API, to the point that in TF 2 you would probably need to go out of your way not to use it at all.
It was also clearly the inspiration for fastai, which aims to make pytorch as easy to use as Keras makes tensorflow, as mentioned by the authors time and again:
Unfortunately, Pytorch was a long way from being a good option for part one of the course, which is designed to be accessible to people with no machine learning background. It did not have anything like the clear simple API of Keras for training models. Every project required dozens of lines of code just to implement the basics of training a neural network. Unlike Keras, where the defaults are thoughtfully chosen to be as useful as possible, Pytorch required everything to be specified in detail. However, we also realised that Keras could be even better. We noticed that we kept on making the same mistakes in Keras, such as failing to shuffle our data when we needed to, or vice versa. Also, many recent best practices were not being incorporated into Keras, particularly in the rapidly developing field of natural language processing. We wondered if we could build something that could be even better than Keras for rapidly training world-class deep learning models.
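To illustrate the "clear simple API" the quote refers to, here is a minimal sketch of training a model with tf.keras (assuming TensorFlow 2.x; the toy data and model architecture are made up for the example). Defining, compiling, training, and evaluating a network each take one call, with sensible defaults filled in for you:

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 2.x is installed

# Toy data: 100 samples with 4 features and a binary label
x = np.random.rand(100, 4).astype("float32")
y = (x.sum(axis=1) > 2).astype("float32")

# A small model; initializers, biases, etc. use Keras defaults
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Optimizer, loss, and metrics are declared in one line
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Training and evaluation are each a single call
history = model.fit(x, y, epochs=5, batch_size=16, verbose=0)
loss, acc = model.evaluate(x, y, verbose=0)
```

Compare this with a hand-written pytorch training loop, where the batching, forward pass, loss computation, backward pass, and optimizer step must all be spelled out explicitly; that boilerplate is exactly what fastai removes on the pytorch side.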