MongoCursorException: E11000 Duplicate Key Error Index

So I'm working on a project where I'm taking a CSV file that contains a little more than 100 columns of data by 10,000 rows (it's a sample db file; the final file will be about 200,000,000 rows) and writing a PHP script to process the CSV file into structures that can be inserted as MongoDB collections.

I'm rocking along and all is working well for initial tests of the algorithm (header + first row of actual data), but when I turn on processing for the remaining 9,999 rows, all I get stored into Mongo is the first row of data.

I add an echo statement after the insert and I see 10,000 names scroll across my terminal. So the problem isn't that I'm not getting the data; it's that the data isn't being stored in Mongo. I try turning on safe writes on my $mongoCollection->insert() call and *bam*, error message:

PHP Fatal error:  Uncaught exception 'MongoCursorException' with message 'E11000 duplicate key error index: insite.testdata.$_id_  dup key: { : ObjectId('4d25fd9a7e03972618000000') }' in /htdocs/framework/parseCSV.php:449
Stack trace:
#0 /htdocs/framework/parseCSV.php(449): MongoCollection->insert(Object(mongorriffic), Array)
#1 {main}
thrown in /htdocs/framework/parseCSV.php on line 449
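
For what it's worth, turning on safe writes with the legacy PHP Mongo driver looks something like the sketch below (assuming $mongoCollection is an already-connected MongoCollection; the array('safe' => true) option makes the driver wait for the server's acknowledgement and throw write errors as exceptions instead of failing silently):

// default inserts are fire-and-forget, so the duplicate _id was swallowed
$mongoCollection->insert($objMongo);

// with safe writes the server acknowledges the insert, and the
// E11000 duplicate key error surfaces as a MongoCursorException
$mongoCollection->insert($objMongo, array('safe' => true));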

My code looks like this:

// $mongoCollection is assumed to be an already-connected MongoCollection
$row = 0;

if (($handle = fopen($argv[1], "r")) !== FALSE) {
    while (($data = fgetcsv($handle, 0, ",")) !== FALSE) {
        if (!$row) {
            // use the csv headers to instantiate the new mongo object
            $objMongo = new mongorriffic($data);
        } else {
            // parse the data into the existing structure
            $objMongo->storeData($data);
            $mongoCollection->insert($objMongo);
        }
        $row++;
    }
    fclose($handle);
}

I'm calling my internal method (storeData()), but I'm not creating, populating, or resetting an "_id" field value because MongoDB is supposed to automagically handle that for me. What's actually happening is that MongoDB creates a valid "_id" value for the first record, but since my method never touches that field, the value persists across iterations of the data, so every subsequent insert tries to reuse the first record's _id. I fix this by adding the following line of code after I invoke the insert method:

unset($objMongo->_id);
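
In context, the insert branch of the loop above ends up looking something like this (same assumptions as before; mongorriffic and storeData() are my own class and method):

} else {
    // parse the data into the existing structure
    $objMongo->storeData($data);
    $mongoCollection->insert($objMongo);
    // drop the _id the driver injected so the next row gets a fresh ObjectId
    unset($objMongo->_id);
}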

I re-run my script and *success*! 10,000 records are stored in less than 6 seconds using 4 collections. BTW, I am *loving* MongoDB.

Published at DZone with permission of Micheal Shallop, DZone MVB. See the original article here.
