How to remove all characters from an array, leaving only the numbers?
I'm using sklearn.MLPClassifier and I'm exporting my weights to a txt file:
import numpy as np

arr = list(np.array(model.coefs_, dtype=object))
file = open("weightss.txt", "w+")
file.write(str(arr))
file.close()
The problem is that each [] in model.coefs_ holds different weights, and they get exported together with "array" and other characters I don't want. How can I export only the numbers (floats) from this array?
The output I get:
array([[ 0.055124 , 0.04365641, -0.0271139 , ..., 0.02899075,
-0.14442606, -0.04212195],
[-0.01849607, -0.01092366, 0.01105672, ..., -0.00383433,
-0.23810516, 0.03760549],
[ 0.02357766, -0.04846305, 0.0028944 , ..., -0.00372297,
-0.23913875, 0.03809774],
...,
[ 0.04734891, 0.00075733, 0.05273402, ..., 0.02994463,
-0.00568819, -0.05830916],
[ 0.12168126, 0.06481046, 0.03357674, ..., -0.06854297,
0.00458801, -0.06863405],
[ 0.09052689, -0.03500097, -0.06934987, ..., -0.05713005,
-0.04042818, -0.07102473]])
array([[ 6.49345111e-01, -2.80867026e+00, -4.47025490e-01,
5.47926183e-02, -2.32258820e-01, 3.00891945e-01,
-3.28820315e+00, -1.11907300e+00, 2.13128839e-01,
-2.60551663e+00, 2.42408007e+00, -1.31142015e+00,
-7.13036636e-01, 2.98367056e+00, 1.23166718e+00,
1.84157657e+00, 2.50413248e-01, -6.74166192e-01,
-1.04169355e-01, -8.85277883e-01],
[ 3.82943554e-02, -1.3093057
and so on...
I want it to be just the numbers, without the "," and other characters, so I can import it back and load these weights again.
You can try something like this:
import os
import numpy as np

with open('weightss.txt', 'w') as fp:
    for a in model.coefs_:
        # write each weight matrix as plain numbers, one row per line
        np.savetxt(fp, a, fmt='%f')
        # blank line between the matrices of consecutive layers
        fp.write(os.linesep)
Demo:
rng = np.random.default_rng(2022)
coefs = [np.array(rng.random((4, 5))), np.array(rng.random((5, 1)))]
print(coefs)
# Output
[array([[0.24742606, 0.09299006, 0.61176337, 0.06066207, 0.66103343],
[0.75515778, 0.1108689 , 0.04305584, 0.41441747, 0.98862926],
[0.96919869, 0.25697153, 0.55876211, 0.24234798, 0.32202029],
[0.89135975, 0.94611366, 0.72253931, 0.92847437, 0.99608701]]),
array([[0.2494223 ],
[0.06229007],
[0.94479027],
[0.65028587],
[0.32167568]])]
Contents of weightss.txt:
0.247426 0.092990 0.611763 0.060662 0.661033
0.755158 0.110869 0.043056 0.414417 0.988629
0.969199 0.256972 0.558762 0.242348 0.322020
0.891360 0.946114 0.722539 0.928474 0.996087
0.249422
0.062290
0.944790
0.650286
0.321676
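If you want to import these weights back, here is a minimal sketch (not part of the original answer) that reads the blank-line-separated blocks written by the loop above and rebuilds one 2-D array per layer:

import numpy as np

# Rebuild a list of 2-D arrays from the text file written above.
# Assumes each matrix is a block of space-separated floats and that
# blocks are separated by a blank line.
coefs = []
current = []
with open('weightss.txt') as fp:
    for line in fp:
        if line.strip():
            current.append([float(x) for x in line.split()])
        elif current:
            coefs.append(np.array(current))
            current = []
if current:
    coefs.append(np.array(current))

# coefs should now mirror model.coefs_ layer by layer (up to the '%f'
# rounding), so the shapes can be checked against a fitted model before
# assigning them back.

Note that '%f' rounds the values to six decimal places; if you need the exact floats rather than a human-readable file, pickling model.coefs_ (or np.save/np.load with allow_pickle=True) round-trips them losslessly.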