Symfony2 / Doctrine2 throwing index error when flushing the entity manager



I had this same problem; in my case it was because I was doing an entity remove & flush inside a postRemove lifecycle event. From what I could grok of the UnitOfWork logic, you cannot call flush in that event: doing so re-runs any actions still pending from the original flush call, including the delete that triggered the event, which by then has already been executed. After figuring that out I was able to find the section of the Doctrine manual that confirmed my suspicion:

http://docs.doctrine-project.org/en/latest/reference/events.html#postupdate-postremove-postpersist
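To make the failure mode concrete, the problematic pattern looked roughly like this (a sketch only; the listener class name and the getChild() relation are illustrative assumptions, not my actual code):

```php
<?php
use Doctrine\ORM\Event\LifecycleEventArgs;

class BadRemovalListener
{
    // Anti-pattern: flushing inside a lifecycle event re-enters the
    // in-progress commit, so deletions whose identifiers have already
    // been unset get processed again and throw the index error.
    public function postRemove(LifecycleEventArgs $args)
    {
        $em = $args->getEntityManager();
        $em->remove($args->getEntity()->getChild()); // assumed relation
        $em->flush(); // <-- do NOT flush inside postRemove
    }
}
```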

You are correct to assume that flushing each step individually is a bad approach, but if you don't make use of lifecycle events then I'm not sure what could be causing your particular problem. I was able to debug the issue by error-logging the object id variable ($oid) in the executeDeletions() function of UnitOfWork.php. I noticed the same id was being deleted repeatedly; once it has been unset from $this->entityIdentifiers, any subsequent delete of that id fails.
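For anyone wanting to reproduce that debugging step, the temporary logging inside the vendor code was along these lines (a sketch; the exact placement inside executeDeletions() varies by Doctrine version, and the line should be removed once you're done):

```php
<?php
// Temporary debugging inside Doctrine\ORM\UnitOfWork::executeDeletions()
// (vendor code). $oid is the spl_object_hash of the entity being deleted.
foreach ($this->entityDeletions as $oid => $entity) {
    // The same oid appearing repeatedly across flushes signals the bug:
    // its identifier has already been unset from $this->entityIdentifiers.
    error_log('executeDeletions oid: ' . $oid);

    // ... original deletion logic unchanged ...
}
```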

My solution was to simply catalog each ID in the postRemove event, and then actually remove the entities in a postFlush event, which is not a lifecycle event and can therefore safely trigger subsequent persist, remove and flush operations:

http://docs.doctrine-project.org/en/latest/reference/events.html#lifecycle-events
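A minimal sketch of that approach, assuming a single listener registered for both postRemove and postFlush (the class name and the getDependents() accessor are illustrative assumptions):

```php
<?php
use Doctrine\ORM\Event\LifecycleEventArgs;
use Doctrine\ORM\Event\PostFlushEventArgs;

class DeferredRemovalListener
{
    /** @var object[] Entities queued for removal after the current flush. */
    private $pendingRemovals = [];

    // Lifecycle callback: only record what must go; do NOT flush here.
    public function postRemove(LifecycleEventArgs $args)
    {
        foreach ($args->getEntity()->getDependents() as $dependent) { // assumed accessor
            $this->pendingRemovals[] = $dependent;
        }
    }

    // postFlush is not a lifecycle event, so a nested flush is safe here.
    public function postFlush(PostFlushEventArgs $args)
    {
        if (empty($this->pendingRemovals)) {
            return;
        }
        $em = $args->getEntityManager();
        foreach ($this->pendingRemovals as $entity) {
            $em->remove($entity);
        }
        $this->pendingRemovals = []; // reset BEFORE flushing to avoid recursion
        $em->flush();
    }
}
```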

I'm sure you have since moved on, but in case anyone else runs into this problem...


This is not a "solution" but I hope it provides more troubleshooting information to others.

I hit this exact same error when following Doctrine's recommendation for batch processing mass inserts. FYI, this was in a controller (not in a lifecycle event, as the other answer describes).

Doctrine's Recommended Method

```php
$batchSize = 20;
for ($i = 1; $i <= 10000; ++$i) {
    $user = new CmsUser;
    $user->setStatus('user');
    $user->setUsername('user' . $i);
    $user->setName('Mr.Smith-' . $i);
    $em->persist($user);
    if (($i % $batchSize) === 0) {
        $em->flush();
        $em->clear(); // Detaches all objects from Doctrine!
    }
}
$em->flush(); // Persist objects that did not make up an entire batch
$em->clear();
```

When I used something similar to the recommended code above, it would fail with the same errors after about 2000 inserts.

Change Order of Persist

The order in which I persisted the 10,000 entities made no difference, and neither did persisting, flushing and clearing on every single iteration (not ideal, but I tried it).

Remove Clear

If I just commented out the $em->clear() within the $batchSize check and made it only clear after the loop finished, then it timed out:

Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/core/cms/vendor/doctrine/orm/lib/Doctrine/ORM/UnitOfWork.php on line 541

So I added set_time_limit(3600) to the script to prevent the timeout; the error no longer appeared, but it ran out of memory instead :P

This would suggest that the problem occurs when $em->clear() is executed within the loop. This is consistent with other questions. Unfortunately, without $em->clear(), you run out of memory quickly.
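One variant I considered (untested, and sketched under the assumption that detaching entities individually is enough to keep memory bounded) is to detach only the entities created in each batch, instead of clearing the entire unit of work with $em->clear():

```php
<?php
$batchSize = 20;
$buffer = [];
for ($i = 1; $i <= 10000; ++$i) {
    $user = new CmsUser;
    $user->setStatus('user');
    $user->setUsername('user' . $i);
    $em->persist($user);
    $buffer[] = $user;
    if (($i % $batchSize) === 0) {
        $em->flush();
        // Detach only what this batch created, leaving the rest of the
        // identity map (and any tracked deletions) untouched.
        foreach ($buffer as $managed) {
            $em->detach($managed);
        }
        $buffer = [];
    }
}
$em->flush(); // remaining partial batch
```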

Disable Event Listeners

The other answer mentions that Event Listeners may be causing this, so I disabled them like it suggested:

```php
foreach ($em->getEventManager()->getListeners() as $event => $listeners) {
    foreach ($listeners as $listener) {
        $em->getEventManager()->removeEventListener($event, $listener);
    }
}
```

But that didn't work either... although it still seems like listeners could somehow be the issue, and that this snippet simply doesn't disable them successfully.

Validate Schema

I also validated my schema:

php app/console doctrine:schema:validate

And no errors are reported.

```
[Mapping]  OK - The mapping files are correct.
[Database] OK - The database schema is in sync with the mapping files.
```