Welcome to OGeek Q&A Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

php - Getting Data From a Huge JSON File and Transfer it to CSV or SQL

I have a JSON file containing information on 1 million+ books; it is approximately 3.1 GB.

I want to convert this data to another format, such as SQL or CSV, so it is more comfortable to work with.

Is there a simple way to do this?

I know PHP, but 3.1 GB is too large for my system to handle in one go.

Please suggest another language (with code, since I only understand PHP) or an application that can do the same.



1 Reply


Use halaxa/json-machine.

Iterating over the JSON works the same way as with json_decode, but it will not hit the memory limit no matter how big the file is. There is nothing to implement; just write your foreach.

Example:

$users = JsonMachine\JsonMachine::fromFile('500MB-users.json');

foreach ($users as $id => $user) {
    // process $user as usual
}
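Inside that foreach, each record can be written straight out as a CSV row with fputcsv, so only one record is ever in memory. A self-contained sketch, where $records stands in for the json-machine iterator and the field names (id, title, author) are hypothetical placeholders for the actual book schema:

```php
<?php
// Sketch: write each streamed record out as one CSV row.
// $records stands in for the json-machine iterator; the field names
// below are hypothetical, adjust them to the real book schema.
$records = [
    ['id' => 1, 'title' => 'Book A', 'author' => 'X'],
    ['id' => 2, 'title' => 'Book B', 'author' => 'Y'],
];

$out = fopen('php://temp', 'w+');   // in real use: fopen('books.csv', 'w')
$headerWritten = false;

foreach ($records as $record) {
    if (!$headerWritten) {
        fputcsv($out, array_keys($record));   // header row, written once
        $headerWritten = true;
    }
    fputcsv($out, array_values($record));     // one CSV row per record
}

rewind($out);
$csv = stream_get_contents($out);
echo $csv;
fclose($out);
```

Because the rows are flushed to the output stream as they arrive, the 3.1 GB input never has to fit in memory.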

See the GitHub README for more details.

An alternative is salsify/jsonstreamingparser.

You need to create your own Listener.
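A minimal listener assembles the parser's key/value callbacks into complete records. The class below is a self-contained sketch for flat objects only (nested objects would need depth tracking): in real use it would implement the library's listener interface, and the parser, not the manual calls at the bottom, would drive it. The interface name and callback names follow the library's README but should be checked against the installed version.

```php
<?php
// Sketch of a listener that assembles callbacks into records.
// In real use this class would implement the library's listener
// interface (JsonStreamingParser\Listener\ListenerInterface is an
// assumption here); the method names mirror its documented callbacks.
class CollectingListener
{
    public array $records = [];
    private array $current = [];
    private ?string $key = null;

    public function startDocument(): void {}
    public function endDocument(): void {}
    public function startArray(): void {}
    public function endArray(): void {}
    public function whitespace(string $whitespace): void {}

    public function startObject(): void
    {
        $this->current = [];                 // begin a new record
    }

    public function endObject(): void
    {
        // Record complete; instead of collecting it, you could write
        // it straight to CSV or SQL here to keep memory flat.
        $this->records[] = $this->current;
    }

    public function key(string $key): void
    {
        $this->key = $key;                   // remember key for next value
    }

    public function value($value): void
    {
        if ($this->key !== null) {
            $this->current[$this->key] = $value;
            $this->key = null;
        }
    }
}

// Simulate the callbacks the parser would emit for [{"id":1,"title":"A"}]:
$l = new CollectingListener();
$l->startDocument();
$l->startArray();
$l->startObject();
$l->key('id');
$l->value(1);
$l->key('title');
$l->value('A');
$l->endObject();
$l->endArray();
$l->endDocument();
```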

$testfile = '/path/to/file.json';
$listener = new MyListener();
$stream = fopen($testfile, 'r');
try {
    $parser = new JsonStreamingParser\Parser($stream, $listener);
    $parser->parse();
    fclose($stream);
} catch (Exception $e) {
    fclose($stream);
    throw $e;
}
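Whichever streaming parser feeds the records, the SQL side can be handled with PDO prepared statements inside a transaction, which keeps per-row overhead low. A minimal sketch using SQLite so it is self-contained; the table and column names are hypothetical:

```php
<?php
// Sketch: bulk-insert streamed records with a PDO prepared statement.
// SQLite in-memory keeps the example self-contained; the table and
// column names are hypothetical, adjust them to the real schema.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, author TEXT)');

// Stand-in for the records a streaming parser would yield one by one.
$records = [
    ['id' => 1, 'title' => 'Book A', 'author' => 'X'],
    ['id' => 2, 'title' => 'Book B', 'author' => 'Y'],
];

$stmt = $pdo->prepare('INSERT INTO books (id, title, author) VALUES (?, ?, ?)');
$pdo->beginTransaction();                    // batch inserts for speed
foreach ($records as $r) {
    $stmt->execute([$r['id'], $r['title'], $r['author']]);
}
$pdo->commit();

$count = (int) $pdo->query('SELECT COUNT(*) FROM books')->fetchColumn();
```

For millions of rows, committing in batches of a few thousand rather than one giant transaction is a common compromise between speed and memory.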

