Fastest way to parse JSON strings into numpy arrays
Solution 1
The simplest answer would just be:
numpy_2d_arrays = np.array(dict["rings"])
Since this avoids explicitly looping over your array in Python, you would probably see a modest speedup. If you have control over the creation of json_input,
it would be better to write it out as a serial (flat) array. A version is here.
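To make both ideas concrete, here is a minimal sketch: converting each ring with a single np.array call, and the "serial array" variant where the producer emits a flat list of coordinates that is reshaped after parsing. The flat JSON payload and the name rings_flat are my own illustration, not part of the question.

```python
import json
import numpy as np

json_input = ('{"rings" : [[[-8081441.0, 5685214.0], [-8081446.0, 5685216.0], '
              '[-8081442.0, 5685219.0], [-8081440.0, 5685211.0], '
              '[-8081441.0, 5685214.0]]]}')
data = json.loads(json_input)

# One np.array call per ring instead of looping over individual points
numpy_2d_arrays = [np.array(ring) for ring in data["rings"]]

# "Serial array" idea (assumes you control the producer): emit a flat
# coordinate list and reshape it into (N, 2) after parsing.
flat_json = '{"rings_flat": [-8081441.0, 5685214.0, -8081446.0, 5685216.0]}'
flat = json.loads(flat_json)
ring = np.array(flat["rings_flat"]).reshape(-1, 2)
```

The flat layout avoids building a deeply nested Python list before the conversion, which is where much of the time can go for large inputs.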
Solution 2
Since JSON syntax is very close to Python syntax, I suggest using ast.literal_eval
. It may be faster…
import ast
import numpy as np
json_input = """{"rings" : [[[-8081441.0, 5685214.0],
[-8081446.0, 5685216.0],
[-8081442.0, 5685219.0],
[-8081440.0, 5685211.0],
[-8081441.0, 5685214.0]]]}"""
rings = ast.literal_eval(json_input)
numpy_2d_arrays = [np.array(ring) for ring in rings["rings"]]
Give it a try. And tell us.
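Since the two answers disagree on which parser is faster, a quick benchmark on your own payload settles it. This is a sketch using the standard-library timeit module; the function names are mine:

```python
import ast
import json
import timeit
import numpy as np

json_input = ('{"rings" : [[[-8081441.0, 5685214.0], [-8081446.0, 5685216.0], '
              '[-8081442.0, 5685219.0], [-8081440.0, 5685211.0], '
              '[-8081441.0, 5685214.0]]]}')

def via_json():
    # Parse with the json module, then convert each ring
    return [np.array(r) for r in json.loads(json_input)["rings"]]

def via_ast():
    # Parse with ast.literal_eval, then convert each ring
    return [np.array(r) for r in ast.literal_eval(json_input)["rings"]]

t_json = timeit.timeit(via_json, number=1000)
t_ast = timeit.timeit(via_ast, number=1000)
# Which one wins varies with the data and Python version, so
# measure with a representative payload rather than this toy one.
```

Both paths must of course produce identical arrays, which is worth asserting before trusting the timings.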
Comments
-
Below the Radar about 4 years
I have huge json objects containing 2D lists of coordinates that I need to transform into numpy arrays for processing.
However, using
json.loads
followed by np.array()
is too slow. Is there a way to speed up the creation of numpy arrays from JSON?
import json
import numpy as np
json_input = '{"rings" : [[[-8081441.0, 5685214.0], [-8081446.0, 5685216.0], [-8081442.0, 5685219.0], [-8081440.0, 5685211.0], [-8081441.0, 5685214.0]]]}'
dict = json.loads(json_input)
numpy_2d_arrays = [np.array(ring) for ring in dict["rings"]]
I would take any solution whatsoever!
-
Below the Radar over 7 years
Thank you, but ast.literal_eval is slower than json.loads() with my data
-
ei-grad almost 4 years
using eval is unsafe
-
Gribouillis over 3 years
@ei-grad It depends on the context. Python has always been made for consenting adults!