Application Code Redesign to Reduce the Number of Database Hits from a Performance Perspective
Posted by Rachel on Stack Overflow, 2010-03-12
Scenario
I want to parse a large CSV file and insert the data into a database; the CSV file has approximately 100K rows of data.
Currently I am using fgetcsv to parse the file row by row and insert each row into the database, so I am hitting the database once for every line in the CSV file. That means roughly 100K database hits, which is not good from a performance point of view.
Current Code:
public function initiateInserts($file)
{
    // Open the large CSV file (at least 100K rows) for parsing.
    $this->fin = fopen($file, 'r') or die('Cannot open file');

    // Parse the CSV file line by line and insert each row into the schema.
    while (($data = fgetcsv($this->fin, 5000, ";")) !== FALSE)
    {
        $query = "INSERT INTO dt_table (id, code, connectid, connectcode)
                  VALUES (:id, :code, :connectid, :connectcode)";
        $stmt = $this->prepare($query);

        // Bind the parameters for the current line.
        $stmt->bindValue(':id', $data[0], PDO::PARAM_INT);
        $stmt->bindValue(':code', $data[1], PDO::PARAM_INT);
        $stmt->bindValue(':connectid', $data[2], PDO::PARAM_INT);
        $stmt->bindValue(':connectcode', $data[3], PDO::PARAM_INT);

        // Execute the statement: one database hit per CSV line.
        $stmt->execute();
        $this->checkForErrors($stmt);
    }
}
I am looking for a way to avoid hitting the database for every row of data: prepare the query, hit the database once (or at least far fewer times), and populate the database with all the inserts. Any suggestions?
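One possible approach, sketched below rather than taken from the original post, is to buffer the parsed rows and flush them in chunks as a single multi-row INSERT, so the database is hit once per chunk instead of once per line. The method names (initiateBatchInserts, flushChunk), the chunk size of 1000, and the fixed four columns are illustrative assumptions; like the code above, it assumes $this wraps a PDO connection that exposes prepare().

// Parse the CSV and insert rows in chunks, one multi-row INSERT per chunk.
public function initiateBatchInserts($file, $chunkSize = 1000)
{
    $fin = fopen($file, 'r') or die('Cannot open file');

    $rows = array();
    while (($data = fgetcsv($fin, 5000, ";")) !== FALSE)
    {
        $rows[] = $data;

        // Once a chunk is full, flush it as a single statement.
        if (count($rows) >= $chunkSize)
        {
            $this->flushChunk($rows);
            $rows = array();
        }
    }

    // Insert whatever is left over after the loop.
    if (count($rows) > 0)
    {
        $this->flushChunk($rows);
    }

    fclose($fin);
}

// Builds one INSERT ... VALUES (...), (...), ... statement for the chunk.
private function flushChunk(array $rows)
{
    // One "(?, ?, ?, ?)" placeholder group per row.
    $placeholders = implode(',', array_fill(0, count($rows), '(?, ?, ?, ?)'));
    $query = "INSERT INTO dt_table (id, code, connectid, connectcode) VALUES " . $placeholders;

    $stmt = $this->prepare($query);

    // Flatten the rows into a single ordered parameter list.
    $params = array();
    foreach ($rows as $row)
    {
        $params[] = $row[0];
        $params[] = $row[1];
        $params[] = $row[2];
        $params[] = $row[3];
    }

    $stmt->execute($params);
    $this->checkForErrors($stmt);
}

Wrapping the whole import in a transaction ($this->beginTransaction() before the loop and $this->commit() after it) is another way to cut per-statement overhead, and the chunk size may need tuning so each multi-row INSERT stays under the server's maximum packet size.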
Note: This is the exact code pattern I am using, except that the real CSV file has more fields than just id, code, connectid and connectcode; I trimmed it down to this sample so the logic is easier to explain.
Thanks!