Why "Allowed memory size exhausted"?


Solution 1

Try using mysql_unbuffered_query() (http://www.php.net/manual/en/function.mysql-unbuffered-query.php) to prevent the whole table from being loaded into memory, while still avoiding pagination.
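For reference, a minimal sketch of that approach (the table and column names are placeholders, and an already-open connection is assumed):

```php
<?php
// mysql_unbuffered_query() streams rows from the server one at a
// time instead of buffering the whole result set in PHP's memory.
$result = mysql_unbuffered_query("SELECT id FROM large_table");

while ($row = mysql_fetch_assoc($result)) {
    echo $row['id'] . PHP_EOL;
}

// Frees the result; with unbuffered queries any remaining rows
// are discarded first.
mysql_free_result($result);
```

One caveat from the manual: with an unbuffered result you cannot send another query on the same connection until all rows have been fetched or the result has been freed.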

Solution 2

Limit your query to, for example, 1,000 results and execute it again (with the offset, of course) until you've gone through the whole table. Your current unset() makes no difference, since $row gets overwritten on each iteration of the while loop, so you can skip it.

$chunk_size = 1000;
$done = 0;

$keep_asking_for_data = true;
do {
    $result = mysql_query("SELECT * FROM `large_table` LIMIT {$done}, {$chunk_size}");
    $num_rows = mysql_num_rows($result);
    if ($num_rows) {
        $done += $num_rows;
        while ($row = mysql_fetch_assoc($result)) {
            echo "{$row['id']}\n";
        }
    } else {
        $keep_asking_for_data = false;
    }
    mysql_free_result($result);
} while ($keep_asking_for_data);

Just written off the top of my head, hope it works =D

Solution 3

If you're using MySQL, page your results so you don't exhaust your available memory. MySQL itself is taking up this memory with your database result set. Have a look at the following link, in particular the LIMIT offset, row_count syntax of SELECT:

http://dev.mysql.com/doc/refman/5.0/en/select.html
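For example, the two-argument form pages through the table in fixed-size slices (the table name is a placeholder):

```sql
-- First chunk: rows 0..999 (LIMIT offset, row_count)
SELECT * FROM large_table LIMIT 0, 1000;

-- Next chunk: rows 1000..1999
SELECT * FROM large_table LIMIT 1000, 1000;
```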

Solution 4

I had the same problem with a large database. I ran out of memory at about 400,000 records despite unsetting the $row variable, but the unbuffered query fixed it.

Just for reference for others (and me when I do it again!), some unbuffered query example code is:

$sql = "SELECT SEQ, RECTYPE, ROSTERGRP, EMPNM, EMPNUM, DT, RDUTYCAT, ADUTYCAT FROM " .
       $tblRosters . " ORDER BY EMPNUM, DT";
$result = mysql_unbuffered_query($sql, $dbConn);

while ($row = mysql_fetch_array($result)) {
    // your processing code here
}
// Unset, close db etc. if you are finished goes here.
Updated on July 29, 2022

Comments

  • Matthieu Napoli
    Matthieu Napoli almost 2 years

I am writing a batch script and get an "Allowed memory size of 134217728 bytes exhausted" error.

    I don't understand why the memory is filling up. I tried unsetting the $row variable, but that didn't change a thing. Here is my code:

    // ... (sql connection)
    $result = mysql_query("SELECT * FROM large_table");
    
    while ($row = mysql_fetch_array($result)) {
        echo $row['id'] . PHP_EOL;
        unset($row);
    }
    

    (simplified code)

    Why does the memory fill up, and how can I avoid it?

Note: this is a batch script. It is normal that I have to handle data like that (go through 1 million rows).

Update: the out-of-memory error happens around the 400,000th row, so it has got to be something in the loop? I'd like to avoid having to implement paging if possible.

  • Matthieu Napoli
    Matthieu Napoli over 12 years
That was an idea, but I confess I was too lazy to do it :p. Anyway, the error happens around the 400,000th row (see question update), so it may be something in the loop that I can fix?
  • Matthieu Napoli
    Matthieu Napoli over 12 years
I tried adding the unset to force the collection of the memory used by the variable, which is not guaranteed otherwise (PHP GC). Anyway, see the comment on the other answer and the question update.
  • Vilius Sutkus '89
    Vilius Sutkus '89 over 12 years
    Damn! "I'd like to avoid having to implement the paging if possible." This wasn't there when I made the code :D
  • halfer
    halfer over 12 years
    Your approach as it stands ought to buffer some results - removing buffering so paging becomes unnecessary is probably not the way to do it, imo. Anyway it should be easy now - @PalmTree has done it for you :)
  • Matthieu Napoli
    Matthieu Napoli over 12 years
That did it! Your intuition was right, and I like that it answers the root of the problem.
  • Matthieu Napoli
    Matthieu Napoli over 12 years
    btw I don't understand the downvotes without explanations... I upvoted to compensate
  • Matthieu Napoli
    Matthieu Napoli over 12 years
    Why do you think removing paging is not the good way to do it?
  • Matthieu Napoli
    Matthieu Napoli over 12 years
@PalmTree Yep, sorry about the update ;), +1 for the effort. Without KillerX's solution I would gladly have taken that code, now the work is done ;)
  • halfer
    halfer over 12 years
    No, I think paging should be added in; it's removing buffering that I felt was not the right approach. Removing buffering will (presumably) reduce its performance - this is fine if it is for good technical reasons, but being too lazy isn't one of them ;-)
  • halfer
    halfer over 12 years
    I'm not sure this is the right approach, but +1 since it does answer the original question :)
  • Vilius Sutkus '89
    Vilius Sutkus '89 over 12 years
    This seems to be the "proper" way, although I haven't heard of it before. Props
  • Vilius Sutkus '89
    Vilius Sutkus '89 over 12 years
    Would have posted a 'copy' of your answer, but I can only speak in code lines :D
  • KillerX
    KillerX over 12 years
    Upvote for a more "conventional" solution than what I posted.
  • KillerX
    KillerX over 12 years
To tell the truth, I do not have any experience with the function, so I do not know if there are any pitfalls that you might run into later, like the MySQL connection dying in the middle of processing. If someone reading this knows about any problems of that kind, please post, it would be interesting to know about it :)
  • Pete855217
    Pete855217 almost 11 years
    I think using the unbuffered query is a better solution - requires far less mucking around.