Subject: Bad performance iterating through large # of rows.
The following code, run over a simple table with 10000 rows, is an order
of magnitude slower than the normal CDBI iterator: plain CDBI takes 10
seconds, while CDBI::Plugin::Iterator takes over a minute.
package Fooo;
use strict;
use warnings;
use base qw(Class::DBI);
use Class::DBI::Plugin::Iterator;

Foo->connection("DBI:mysql:database=test;host=localhost", '', '');
Foo->table("foo");
Foo->columns( All => "id" );

# Walk every row via the iterator; this loop is the slow part.
my $it = Foo->retrieve_all;
1 while $it->next;
Your MySQL connection id is 9 to server version: 5.0.27
mysql> describe foo;
+-------+---------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-------+---------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
+-------+---------+------+-----+---------+----------------+
1 row in set (0.08 sec)
mysql> select count(*) from foo;
+----------+
| count(*) |
+----------+
|    10000 |
+----------+
1 row in set (0.00 sec)
I have posted a CDBI patch which provides an OnDemand iterator, like
yours, with performance nearly equal to the existing PreFetch iterator.
What I don't have is the clever COUNT stuff. See 24959.
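For anyone unfamiliar with the two iterator styles being compared, the
following is a minimal pure-Perl sketch of the idea (hypothetical code,
not the actual CDBI patch or plugin internals): a PreFetch iterator
materializes every row up front and then walks an array, while an
OnDemand iterator holds only a cursor and fetches one row per call to
next(). The `prefetch_iterator` and `ondemand_iterator` names and the
fake in-memory "result set" are illustration only.

```perl
use strict;
use warnings;

# PreFetch style: all rows are fetched immediately; iteration is just
# walking the in-memory array.
sub prefetch_iterator {
    my @rows = @_;
    my $i = 0;
    return sub { $i < @rows ? $rows[$i++] : undef };
}

# OnDemand style: nothing is fetched up front; each call to the returned
# closure pulls one more row from the underlying source (here a coderef
# standing in for something like $sth->fetchrow).
sub ondemand_iterator {
    my ($fetch_row) = @_;
    return sub { $fetch_row->() };
}

my @data = (1 .. 5);

my $pre = prefetch_iterator(@data);

my $pos = 0;
my $od  = ondemand_iterator( sub { $pos < @data ? $data[$pos++] : undef } );

my ( $pre_count, $od_count ) = ( 0, 0 );
$pre_count++ while defined $pre->();
$od_count++  while defined $od->();

print "$pre_count $od_count\n";    # both visit all 5 rows
```

Both styles visit the same rows; the difference is when the fetching
happens, which is why an OnDemand iterator can match PreFetch speed once
the per-row overhead is kept small.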