Mar 21, 2011
I'm trying to speed up a script that we use to let bots crawl our site. This is the only page that the bots are allowed to access.
We have over 2 million records in our database and we want one column from the table `linv_inventory` shown in an HTML table paged in 10,000 row increments.
// compute the offset for the requested page
// $rowsPerPage = 10000
$offset = ($pageNum - 1) * $rowsPerPage;
$query = "SELECT `inventory_part_number` FROM `linv_inventory` ORDER BY `inventory_part_number` LIMIT $offset, $rowsPerPage";
$result = mysql_query($query) or die('Error, query failed');
Which of these would be faster? The timing results I have gotten have been inconclusive. In the first approach, we query the first 10,000 records, extract them into an array, then iterate over the array with a foreach.
Are there any other ways to make this faster? This script has been causing problems because of the number of bots that hit it simultaneously, bringing my database down to a crawl.
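One thing worth noting: with `LIMIT $offset, $rowsPerPage`, MySQL still has to walk past all `$offset` rows before returning anything, so later pages get progressively slower on a 2-million-row table. A common alternative is keyset ("seek") pagination: instead of an offset, remember the last part number shown on the previous page and start after it. This is a sketch, not your actual schema; `'LAST_PART_SEEN'` is a placeholder for the value your script would carry over from the prior page, and it assumes an index on `inventory_part_number`.

```sql
-- Keyset pagination sketch: seek past the last value shown,
-- rather than counting off $offset rows each request.
-- Assumes an index on inventory_part_number.
SELECT `inventory_part_number`
FROM `linv_inventory`
WHERE `inventory_part_number` > 'LAST_PART_SEEN'  -- placeholder: last value from the previous page
ORDER BY `inventory_part_number`
LIMIT 10000;
```

With an index on the sort column, this reads only the 10,000 rows it returns regardless of how deep the page is, which tends to matter most under the kind of concurrent bot traffic described above.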