I have tried to export a huge table as .json, but I got errors about unexpected characters, so I ran the following on all the fields to avoid any conflict:
regexp_replace(field_name,'[^a-zA-Z0-9 \-_\(\)]','','g')
But the result is that everything appears empty, with regexp_replace N as the value.
I thought that would resolve the issue. Is there any way to avoid the unexpected-character or NaN errors by running some query on the table? Or some process to correctly produce the .json, or to transfer the data from Postgres to .json?
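For what it's worth, the character class in that regexp_replace call does behave as intended, so the empty results likely come from something else (for example, forgetting to alias the expression back to the original column name in a SELECT). Here is a quick Python sketch of the same pattern; the function name is mine, purely for illustration:

```python
import re

# The same character class used in the regexp_replace call above:
# strip everything except letters, digits, spaces, hyphen,
# underscore, and parentheses.
PATTERN = re.compile(r'[^a-zA-Z0-9 \-_()]')

def clean(value):
    """Mimic regexp_replace(field, '[^a-zA-Z0-9 \\-_\\(\\)]', '', 'g')."""
    return PATTERN.sub('', value)

print(clean('O\'Connor "Ltd." (UK)'))   # -> OConnor Ltd (UK)
```

Note that the sketch only shows what the pattern keeps and removes; it says nothing about how the cleaned value is written back, which is where a missing alias or a mixed-up UPDATE would produce empty-looking output.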
These are the original errors, which no longer appeared after running the regexp_replace command above on all the fields:
centroid.geojson:13811: Found unexpected character in
JSON object {"type":"Feature","geometry":{"type":"Point","coordinates":[-0.175797882,51.56044564]},"properties":{"field1":"atribute1","field2":"atribute2","field3":"atribute3","field4":"atribute4","field5":"","atribute5":"","field6":"","field7":"","field8":"","field9":"","date":"27-02-1987","field10":"atribute10","field11":"","atribute11":"","field13":"","field14":"atribute14","field16...
path/to/file.json:398: Found misspelling of NaN In
JSON object {"type":"Feature","geometry":{"type":"MultiPolygon","coordinates":[[[[-0.018498801,51.50229262],[-0.018494037,51.502309446],[-0.018509668,51.502311149],[-0.01851684,51.5023119],[-0.018519242,51.502303037],[-0.01864384,51.502317193],[-0.018640275,51.502329632],[-0.018563854,51.502613229],[-0.018558638,51.502630497],[-0.01842617,51.502615039],[-0.018433776,51.502589179],[-0.018286221,51.502572747],[-0.018048472,51.502546247],[-0.01764496,51.502501208],[-0.017683038,51.502367501],[-0.01768609,51...
After running the query:
UPDATE table SET field = replace(field, '''', '');
and then, for the NaN and NULL values:
UPDATE table SET field = '' WHERE field = 'NaN';
and then ran the export from the question again, and it didn't throw any of the errors mentioned before. It seems that all the data are there and correct.
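The two cleanup steps above can be sketched in Python as well, which makes it easy to check the logic on a sample value before touching the table (the function and the sample row are mine, not from the database):

```python
def sanitize(value):
    """Mimic the two cleanup UPDATEs run against each text field."""
    value = value.replace("'", "")   # UPDATE table SET field = replace(field, '''', '');
    if value == "NaN":               # UPDATE table SET field = '' WHERE field = 'NaN';
        value = ""
    return value

# Hypothetical row, just to show the effect of both steps:
row = {"field1": "O'Brien", "field2": "NaN"}
cleaned = {k: sanitize(v) for k, v in row.items()}
print(cleaned)   # {'field1': 'OBrien', 'field2': ''}
```

The order matters in one edge case: stripping quotes first means a value like "'NaN'" also ends up empty, which matches running the replace() UPDATE before the NaN UPDATE in the database.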