sklearn: SVR fails to generalise adder function
Here is my SVR for learning the addition function (y = x1 + x2):
%reset -f
#Libs
from sklearn import svm
#PROGRAMME ENTRY POINT==========================================================
#Data, addition
#Exp[I] = sum(Inp[I])
Inp = [[1,2],[3,4],[5,6],[7,8],[9,0]]
Exp = [ 3,    7,    11,   15,   9   ]
#Train
Model = svm.SVR(kernel="poly", degree=3)
Model.fit(Inp, Exp)
#Infer
print("Input values are those in the train data:")
print(f"1 + 2 = {Model.predict([[1,2]])[0]:.6f}")
print("\nInput values are those in the train data:")
print(f"5 + 6 = {Model.predict([[5,6]])[0]:.6f}")
print("\nInput values are those NOT in the train data, but in range:")
print(f"5 + 5 = {Model.predict([[5,5]])[0]:.6f}")
print("\nInput values are those NOT in the train data, and OUT of range:")
print(f"9 + 1 = {Model.predict([[9,1]])[0]:.6f}")
#EOF
But the results are not what I expected:
Input values are those in the train data:
1 + 2 = 6.007171
Input values are those in the train data:
5 + 6 = 9.595818
Input values are those NOT in the train data, but in range:
5 + 5 = 8.533934
Input values are those NOT in the train data, and OUT of range:
9 + 1 = 9.170507
Can sklearn's SVR generalise the adder function? What should be changed in the code above so that the SVR learns x1 + x2?
A third-degree polynomial kernel has too much variance to predict such a simple function correctly, especially on such a small dataset. This follows from the bias/variance trade-off: here the model pays heavily in variance while gaining almost nothing in bias, because the hypothesis class is far more complex than the target function. The same applies to lower-degree polynomials and to the radial basis function kernel.
Reducing the model's variance fixes it: just use a linear kernel.
Model = svm.SVR(kernel="linear")
The results of the SVM with a linear kernel are:
Input values are those in the train data:
1 + 2 = 3.100000
Input values are those in the train data:
5 + 6 = 10.966667
Input values are those NOT in the train data, but in range:
5 + 5 = 9.983333
Input values are those NOT in the train data, and OUT of range:
9 + 1 = 9.983333
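Putting the change together, here is a minimal end-to-end sketch on the same data. Note that even the linear kernel leaves a small residual error (e.g. 3.1 instead of 3.0): SVR's epsilon-insensitive loss ignores errors inside the tube, and its default `epsilon=0.1` and `C=1.0` tolerate some slack. Raising `C` and shrinking `epsilon` (values below are illustrative, not tuned) tightens the fit:

```python
from sklearn import svm

# Training data: Exp[i] = sum(Inp[i])
Inp = [[1, 2], [3, 4], [5, 6], [7, 8], [9, 0]]
Exp = [3, 7, 11, 15, 9]

# Linear kernel: the hypothesis w1*x1 + w2*x2 + b can represent
# y = x1 + x2 exactly (w1 = w2 = 1, b = 0)
model = svm.SVR(kernel="linear")
model.fit(Inp, Exp)
print(model.predict([[5, 5]])[0])   # close to 10, as in the output above

# Larger C penalises training error more; smaller epsilon narrows the
# no-loss tube around the targets, so predictions move closer to x1 + x2
tight = svm.SVR(kernel="linear", C=100, epsilon=0.01)
tight.fit(Inp, Exp)
print(tight.predict([[9, 1]])[0])   # should be much closer to 10
```

Because the target is exactly linear in the inputs, the linear kernel's bias is zero, so the only tuning left is how tightly `C` and `epsilon` force the fit.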