Handling large datasets with PHP/Drupal

Posted by jo on Stack Overflow
Published on 2010-03-19T12:27:45Z

Hi all,

I have a report page that deals with ~700k records from a database table. I can display this on a webpage using paging to break up the results. However, my export to PDF/CSV functions rely on processing the entire data set at once and I'm hitting my 256MB memory limit at around 250k rows.
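In principle, the export side could reuse the same idea as the paging: pull the table in fixed-size slices and finish with each slice before fetching the next, so memory stays bounded by the slice size rather than the table size. A minimal sketch of that loop; `fetchChunk()` is a hypothetical stand-in for a `LIMIT`/`OFFSET` query (e.g. `db_query_range()`), backed here by fake data purely for illustration:

```php
<?php
// Hypothetical stand-in for db_query_range() / "SELECT ... LIMIT $offset, $size".
// The 12-row "table" is fake data for illustration only.
function fetchChunk(int $offset, int $size): array {
    $total = 12;
    $rows = array();
    for ($i = $offset; $i < min($offset + $size, $total); $i++) {
        $rows[] = array('id' => $i);
    }
    return $rows;
}

$size = 5;
$offset = 0;
$processed = 0;

// Only one $size-row slice is alive at any point in the loop.
while ($rows = fetchChunk($offset, $size)) {
    foreach ($rows as $row) {
        $processed++; // write/handle the row here, then let it go out of scope
    }
    $offset += $size;
}
echo $processed; // prints 12
```

Whether this helps depends on the export format: it works naturally for CSV, where rows can be emitted independently, but a PDF renderer may still want the whole result at once.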

I don't feel comfortable increasing the memory limit, and I don't have the ability to use MySQL's SELECT ... INTO OUTFILE to serve a pre-generated CSV. However, I can't see any way of serving up large data sets in Drupal other than something like:

$form = array();
$table_headers = array();
$table_rows = array();

// Fetch every row and accumulate it in memory -- this is where the limit is hit.
$data = db_query("a query to get the whole dataset");
while ($row = db_fetch_object($data)) {
    $table_rows[] = $row->some_attribute;
}

$form['report'] = array('#value' => theme('table', $table_headers, $table_rows));
return $form;

Is there a way of getting around what is essentially appending to a giant array of arrays? At the moment, I can't see how to offer any meaningful report exports with Drupal because of this.
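For the CSV case specifically, one way to avoid the array-of-arrays entirely is to write each row straight to the output stream as it is fetched, so no row outlives its own loop iteration. A minimal sketch: `rowSource()` is a hypothetical stand-in for looping `db_fetch_object()` over a `db_query()` result, and the write goes to `php://memory` only so the output can be inspected; in Drupal you would open `php://output` and send the `Content-Type`/`Content-Disposition` headers first:

```php
<?php
// Hypothetical stand-in for db_fetch_object() looping over a db_query() result.
function rowSource(): Generator {
    for ($i = 1; $i <= 5; $i++) {
        yield array('id' => $i, 'name' => "item$i");
    }
}

// Write each row straight to the stream; nothing is accumulated.
function streamCsv(iterable $rows, $out): void {
    $first = true;
    foreach ($rows as $row) {
        if ($first) {
            fputcsv($out, array_keys($row)); // header row from the first record
            $first = false;
        }
        fputcsv($out, array_values($row));
    }
}

// php://memory used here only so the result can be echoed/inspected.
$out = fopen('php://memory', 'r+');
streamCsv(rowSource(), $out);
rewind($out);
$csv = stream_get_contents($out);
echo $csv;
```

The PDF case is harder, since most PDF libraries build the whole document in memory, but the same row-at-a-time source at least keeps the database side flat.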

Thanks
