Problems transferring data from postgres to json
I am trying to export a huge table to .json, but I kept getting errors about unexpected characters, so I ran this on all fields to avoid any conflicts:
regexp_replace(field_name,'[^a-zA-Z0-9 \-_\(\)]','','g')
but as a result everything seems to come out empty or as NaN values. I thought this would solve the problem. Is there a way to avoid the errors about unexpected characters or NaN by running some queries on the table? Or some process to correctly convert or transfer the data from postgres to .json?
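For the export itself, one option is to build the GeoJSON inside Postgres; plain JSON does not allow NaN or Infinity literals, so any such value in the output will break the file. A minimal sketch, assuming a PostGIS geometry column named geom and a table named my_table (both placeholder names):

-- Hedged sketch: build a FeatureCollection in SQL; my_table and geom are placeholder names
SELECT json_build_object(
  'type', 'FeatureCollection',
  'features', json_agg(
    json_build_object(
      'type', 'Feature',
      'geometry', ST_AsGeoJSON(t.geom)::json,
      'properties', to_jsonb(t) - 'geom'
    )
  )
)
FROM my_table t;

Alternatively, GDAL's ogr2ogr can usually do the conversion directly, reading from a PG: connection and writing with -f GeoJSON.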
These are the original errors, which no longer appeared after running the regexp_replace command above on all fields:
- Found unexpected character:
centroid.geojson:13811: Found unexpected character
In JSON object {"type":"Feature","geometry":{"type":"Point","coordinates":[-0.175797882,51.56044564]},"properties":{"field1":"atribute1","field2":"atribute2","field3":"atribute3","field4":"atribute4","field5":"","atribute5":"","field6":"","field7":"","field8":"","field9":"","date":"27-02-1987","field10":"atribute10","field11":"","atribute11":"","field13":"","field14":"atribute14","field16...
- Found misspelling of NaN: similar to the previous one, but when working with multipolygons,
path/to/file.json:398: Found misspelling of NaN
In
JSON object {"type":"Feature","geometry":{"type":"MultiPolygon","coordinates":[[[[-0.018498801,51.50229262],[-0.018494037,51.502309446],[-0.018509668,51.502311149],[-0.01851684,51.5023119],[-0.018519242,51.502303037],[-0.01864384,51.502317193],[-0.018640275,51.502329632],[-0.018563854,51.502613229],[-0.018558638,51.502630497],[-0.01842617,51.502615039],[-0.018433776,51.502589179],[-0.018286221,51.502572747],[-0.018048472,51.502546247],[-0.01764496,51.502501208],[-0.017683038,51.502367501],[-0.01768609,51...
- Found misspelling of Infinity: same as the previous one
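Before stripping characters from every field, it can help to see which rows actually contain them. A minimal sketch, reusing the character class from the question (my_table and field_name are placeholder names):

-- List rows whose field contains characters outside the expected set, using ctid as a row locator
SELECT ctid, field_name
FROM my_table
WHERE field_name ~ '[^a-zA-Z0-9 \-_\(\)]';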
After running the following query:
UPDATE table SET field = replace(field, '''', '');
And then, for NaN and NULL:
UPDATE table SET field= '' WHERE field= 'NaN';
After that, running the export posted in the question again no longer threw any of the errors mentioned above. All the data seems to be there and correct.
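The NULL case mentioned above can be handled the same way as the NaN one; a minimal sketch, again with my_table and field_name as placeholder names, one UPDATE per affected field:

-- Blank out NULL text fields, mirroring the NaN cleanup above
UPDATE my_table SET field_name = '' WHERE field_name IS NULL;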