I used halaxa/json-machine.
Iterating over the JSON is the same as with json_decode, but it will not hit the memory limit no matter how big your file is. There is nothing to implement; just use your foreach.
Example:
$users = \JsonMachine\JsonMachine::fromFile('500MB-users.json');

foreach ($users as $id => $user) {
    // process $user as usual
}
See the GitHub README for more details.
One alternative is to use salsify/jsonstreamingparser.
You need to create your own listener; a minimal sketch of one follows the usage example below.
$testfile = '/path/to/file.json';
$listener = new MyListener();
$stream = fopen($testfile, 'r');
try {
    $parser = new \JsonStreamingParser\Parser($stream, $listener);
    $parser->parse();
    fclose($stream);
} catch (Exception $e) {
    fclose($stream);
    throw $e;
}
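For reference, here is a minimal sketch of what MyListener could look like. It assumes the JsonStreamingParser\Listener\ListenerInterface contract from recent releases of the library (method signatures may differ in older versions), and it simply prints each key/value pair as it is streamed, so the whole file never has to fit in memory.

// Sketch of a listener, assuming the ListenerInterface from recent releases
// of salsify/jsonstreamingparser.
class MyListener implements \JsonStreamingParser\Listener\ListenerInterface
{
    /** @var string|null Most recent object key reported by the parser. */
    private $currentKey;

    public function startDocument(): void {}
    public function endDocument(): void {}
    public function startObject(): void {}
    public function endObject(): void {}
    public function startArray(): void {}
    public function endArray(): void {}
    public function whitespace(string $whitespace): void {}

    // Called for every object key encountered in the stream.
    public function key(string $key): void
    {
        $this->currentKey = $key;
    }

    // Called for every scalar value (string, number, bool, null).
    public function value($value): void
    {
        // Replace this with whatever per-item processing you need.
        echo $this->currentKey . ' => ' . var_export($value, true) . PHP_EOL;
    }
}

Because values are handed to the listener one at a time, memory usage stays flat regardless of file size; the trade-off is that you have to reassemble any structure you care about yourself.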