
Python Load UTF-8 JSON

I have the following JSON (for simplicity's sake I'll only show one entry, but there are 100 entries in reality): { 'Active': false, 'Book': 'US Derivat. London, Mike Übersax/Mic

Solution 1:

In Python 2, the csv module does not support writing Unicode. You need to encode it manually here, as otherwise your Unicode values are encoded for you using ASCII (which is why you got the encoding exception).
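For illustration, a minimal Python 2 sketch of the failure mode (the filename and the dictionary are stand-ins based on the question):

import csv

data = {u'Book': u'US Derivat. London, Mike Übersax'}  # stand-in for one loaded JSON entry

with open('EFSDUMP.csv', 'wb') as csv_file:
    # Handing unencoded unicode objects to the csv module makes it fall back to
    # the default ASCII codec, which raises UnicodeEncodeError on the umlaut.
    csv.writer(csv_file).writerow(list(data.values()))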

This also means you need to write the UTF-8 BOM manually, and only if you really need it: UTF-8 can only be written one way, so a Byte Order Mark is not needed to read UTF-8 files. Microsoft likes to add it to files to make detecting file encodings easier for its tools, but the UTF-8 BOM may actually make it harder for other tools to work correctly, as they won't ignore the extra initial character.

Use:

import codecs
import csv

with open('EFSDUMP.csv', 'wb') as csv_file:
    csv_file.write(codecs.BOM_UTF8)  # optional: only if a consumer (e.g. Excel) needs the BOM
    content_writer = csv.writer(csv_file)
    # encode every value to UTF-8 bytes before handing it to the csv module
    content_writer.writerow([unicode(v).encode('utf8') for v in data.values()])

Note that this will write your values in arbitrary (dictionary) order. The unicode() call converts non-string types to unicode strings before encoding. If you need a stable column order, see the sketch below.
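One possible way to get a deterministic column order, reusing the data dictionary loaded from the JSON and assuming sorting by key name is acceptable, is to sort the keys and optionally emit a header row first:

import codecs
import csv

with open('EFSDUMP.csv', 'wb') as csv_file:
    csv_file.write(codecs.BOM_UTF8)
    content_writer = csv.writer(csv_file)
    keys = sorted(data)  # deterministic order; assumes sorting by key name is acceptable
    content_writer.writerow([k.encode('utf8') for k in keys])                 # header row
    content_writer.writerow([unicode(data[k]).encode('utf8') for k in keys])  # values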

To be explicit: you've loaded the JSON data just fine. It is the CSV writing that failed for you.
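If Python 3 is an option, none of this manual encoding is necessary: the csv module accepts text directly, and opening the output file with the 'utf-8-sig' codec writes the BOM for you. A minimal sketch, with the input filename assumed:

import csv
import json

with open('data.json', encoding='utf-8') as json_file:  # hypothetical input file
    data = json.load(json_file)

with open('EFSDUMP.csv', 'w', newline='', encoding='utf-8-sig') as csv_file:
    csv.writer(csv_file).writerow(list(data.values()))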

