I am trying to parse a large XML file and load it into MySQL. I used SimpleXML to parse it, and it works perfectly, but it is far too slow for an XML file this large. Now I am trying to use XMLReader instead.
Here is a sample of the XML:
<?xml version="1.0" encoding="UTF-8"?>
<drug type="biotech" created="2005-06-13" updated="2015-02-23">
  <drugbank-id primary="true">DB00001</drugbank-id>
  <drugbank-id>BIOD00024</drugbank-id>
  <drugbank-id>BTD00024</drugbank-id>
  <name>Lepirudin</name>
  <description>Lepirudin is identical </description>
  <cas-number>120993-53-5</cas-number>
  <groups>
    <group>approved</group>
  </groups>
  <pathways>
    <pathway>
      <smpdb-id>SMP00278</smpdb-id>
      <name>Lepirudin Action Pathway</name>
      <drugs>
        <drug>
          <drugbank-id>DB00001</drugbank-id>
          <name>Lepirudin</name>
        </drug>
        <drug>
          <drugbank-id>DB01373</drugbank-id>
          <name>Calcium</name>
        </drug>
      </drugs>
      ...
</drug>
<drug type="biotech" created="2005-06-15" updated="2015-02-25">
  ...
</drug>
Here is my approach using SimpleXML:
<?php
$xml = simplexml_load_file('drugbank.xml');

$servername = "localhost"; // Example : localhost
$username = "root";
$password = "pass";
$dbname = "dbname";

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

$xmlObject_count = $xml->drug->count();
for ($i = 0; $i < $xmlObject_count; $i++) {
    $name = $xml->drug[$i]->name;
    $description = $xml->drug[$i]->description;
    $casnumber = $xml->drug[$i]->{'cas-number'};
    // ...
    $created = $xml->drug[$i]['created'];
    $updated = $xml->drug[$i]['updated'];
    $type = $xml->drug[$i]['type'];

    $sql = "INSERT INTO `drug` (name, description, cas_number, created, updated, type)
            VALUES ('$name', '$description', '$casnumber', '$created', '$updated', '$type')";
    if ($conn->query($sql) === TRUE) {
        $last_id = $conn->insert_id;
    } else {
        echo "outer else Error: " . $sql . "<br>" . $conn->error . "<br>";
    }
}
$conn->close();
It works okay and gives me 7,789 rows. But I want to use XMLReader to parse this instead, and the problem with XMLReader is that I get more than 35,000 rows. If you look at the XML you can see that inside the <drug /> nodes there are also nested <drugs><drug> child nodes. How can I overcome this?
Here is my procedure with XMLReader:
<?php
$servername = "localhost"; // Example : localhost
$username = "root";
$password = "pass";
$dbname = "dbname";

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

$reader = new XMLReader();
$reader->open('drugbank.xml');
while ($reader->read()) {
    if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'drug') {
        $doc = new DOMDocument('1.0', 'UTF-8');
        $xml = simplexml_import_dom($doc->importNode($reader->expand(), true));
        $name = $xml->name;
        $description = $xml->description;
        $casnumber = $xml->{'cas-number'};
        // ...
        $created = $xml['created'];
        $updated = $xml['updated'];
        $type = $xml['type'];

        $sql = "INSERT INTO `drug` (name, description, cas_number, created, updated, type)
                VALUES ('$name', '$description', '$casnumber', '$created', '$updated', '$type')";
        if ($conn->query($sql) === TRUE) {
            $last_id = $conn->insert_id;
        } else {
            echo "outer else Error: " . $sql . "<br>" . $conn->error . "<br>";
        }
    }
}
$conn->close();
With this example, I get more than 35,000 rows instead of the expected 7,789.
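One idea I had is to also check $reader->depth so that only the outer <drug> elements match. A rough sketch of what I mean, assuming the outer elements all sit at the same depth directly under a single root element; I am not sure whether this is the right approach:

while ($reader->read()) {
    // Outer <drug> elements are direct children of the root (depth 1);
    // the nested <drugs><drug> elements sit deeper in the tree and are ignored.
    if ($reader->nodeType == XMLReader::ELEMENT && $reader->name == 'drug' && $reader->depth == 1) {
        // ... same processing as above ...
    }
}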
Alright, I have a working example for you with substantial improvements in execution speed, memory usage, and database load:
<?php
define('INSERT_BATCH_SIZE', 500);
define('DRUG_XML_FILE', 'drugbank.xml');

$servername = "localhost"; // Example : localhost
$username = "root";
$password = "pass";
$dbname = "dbname";

function parseXml($mysql)
{
    $drugs = array();
    $xmlReader = new XMLReader();
    $xmlReader->open(DRUG_XML_FILE);

    // Move our pointer to the first <drug /> element.
    while ($xmlReader->read() && $xmlReader->name !== 'drug');

    $drugCount = 0;
    $totalDrugs = 0;

    // Iterate over the outer <drug /> elements.
    while ($xmlReader->name == 'drug') {
        // Convert the node into a SimpleXMLElement for ease of use.
        $item = new SimpleXMLElement($xmlReader->readOuterXML());

        // Escape each value before interpolating it into the SQL string;
        // fields such as the description can contain quotes.
        $name = $mysql->real_escape_string((string) $item->name);
        $description = $mysql->real_escape_string((string) $item->description);
        $casNumber = $mysql->real_escape_string((string) $item->{'cas-number'});
        $created = $mysql->real_escape_string((string) $item['created']);
        $updated = $mysql->real_escape_string((string) $item['updated']);
        $type = $mysql->real_escape_string((string) $item['type']);

        $drugs[] = "('$name', '$description', '$casNumber', '$created', '$updated', '$type')";
        $drugCount++;
        $totalDrugs++;

        // Once we've reached the desired batch size, insert the batch and reset the counter.
        if ($drugCount >= INSERT_BATCH_SIZE) {
            batchInsertDrugs($mysql, $drugs);
            $drugCount = 0;
        }

        // Jump to the next sibling <drug />, skipping its entire subtree;
        // this is what keeps the nested <drugs><drug> elements out of the count.
        $xmlReader->next('drug');
    }

    $xmlReader->close();

    // Insert the leftovers from the last batch.
    batchInsertDrugs($mysql, $drugs);

    echo "Inserted $totalDrugs total drugs.";
}

function batchInsertDrugs($mysql, &$drugs)
{
    // Nothing to do (e.g. the total was an exact multiple of the batch size).
    if (empty($drugs)) {
        return;
    }

    // Generate a batched INSERT statement.
    $statement = "INSERT INTO `drug` (name, description, cas_number, created, updated, type) VALUES";
    $statement = $statement . ' ' . implode(",\n", $drugs);
    // echo $statement, "\n"; // Uncomment to inspect the generated SQL.

    // Run the batch INSERT.
    if ($mysql->query($statement)) {
        echo "Inserted " . count($drugs) . " drugs.";
    } else {
        echo "INSERT Error: " . $statement . "<br>" . $mysql->error . "<br>";
    }

    // Clear the buffer.
    $drugs = array();
}

// Create MySQL connection.
$mysql = new mysqli($servername, $username, $password, $dbname);

if ($mysql->connect_error) {
    die("Connection failed: " . $mysql->connect_error);
}

parseXml($mysql);
I tested this example using the same dataset. Using SimpleXML the way you are parses the entire document into memory, which is slow and memory-intensive. This approach uses XMLReader, which is a fast pull parser. You could probably make it faster still with PHP's SAX XML parser, but that is a more complex pattern, and the example above will already be noticeably better than what you started with.
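For reference, here is a minimal sketch of what the SAX approach might look like, using PHP's ext/xml parser. The file name and the depth test are my assumptions (the depth test presumes the outer <drug> elements sit directly under a single root element, as in the full DrugBank file); the point is just that element handlers plus a depth counter let you identify the outer <drug> elements without ever building a tree:

$depth = 0;
$topLevelDrugs = 0;

$parser = xml_parser_create('UTF-8');
// Keep element names as-is; by default ext/xml uppercases them.
xml_parser_set_option($parser, XML_OPTION_CASE_FOLDING, false);

xml_set_element_handler(
    $parser,
    function ($parser, $name, $attrs) use (&$depth, &$topLevelDrugs) {
        // An outer <drug> is a direct child of the root, i.e. at depth 1;
        // the nested <drugs><drug> elements are deeper and are ignored.
        if ($name === 'drug' && $depth === 1) {
            $topLevelDrugs++;
            // This is where you would start collecting a row for the batch.
        }
        $depth++;
    },
    function ($parser, $name) use (&$depth) {
        $depth--;
    }
);

// Feed the file to the parser in small chunks; memory use stays flat.
$handle = fopen('drugbank.xml', 'r');
while (!feof($handle)) {
    xml_parse($parser, fread($handle, 8192), feof($handle));
}
fclose($handle);
xml_parser_free($parser);

echo "Found $topLevelDrugs top-level <drug> elements.\n";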
The other significant change in my example is that we're using MySQL batched INSERTs, so we only actually hit the database once every 500 (configurable) items we process. You can tweak this number for better performance. After a certain point the query will become too large for MySQL to process, but you can likely do a lot more than 500 at a time.
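The hard ceiling on the batch size is MySQL's max_allowed_packet setting, since the whole INSERT statement has to fit in a single packet. If you want to sanity-check that limit from PHP before picking a batch size (reusing the $mysql connection from above), something like this works:

$result = $mysql->query("SHOW VARIABLES LIKE 'max_allowed_packet'");
$row = $result->fetch_assoc();
// The value is reported in bytes; your batched statement must stay under it.
echo "max_allowed_packet: {$row['Value']} bytes\n";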
If you'd like me to explain any part of this further, or if you have any problems with it, just let me know in the comments! :)