
This queue is for tickets about the Inline-Java CPAN distribution.

Report information
The Basics
Id: 21741
Status: resolved
Priority: 0/
Queue: Inline-Java

People
Owner: Nobody in particular
Requestors: joe.workman [...] gs.com
Cc:
AdminCc:

Bug Information
Severity: Normal
Broken in: (no value)
Fixed in: (no value)



Subject: Unusual slowness when processing JDBC results with next & getString
I am connecting to a database via JDBC and everything is working great! However, it is taking an extremely unusual amount of time to process the results from my query.

Environment:
Solaris 5.8 Generic_108528-18 sun4u sparc SUNW,Ultra-80
perl v5.8.6
Inline::Java v0.51
JDBC v0.01

Runtime Benchmarks:
If you look at the code below, it takes about 21 seconds to process 1100 rows of data. However, if I comment out just the while loop, the runtime drops to around 7 seconds. That seems like a very long time (14 seconds) just to process 1100 rows of data.

Program:

#!/bin/perl
use strict;

use JDBC;

JDBC->load_driver('in.co.daffodil.db.rmi.RmiDaffodilDBDriver');
warn "driver loaded";

my $url  = 'jdbc:daffodilDB://localhost:3456/ovaa';
my $user = 'DAFFODIL';
my $pass = 'daff0d1l';

my $con;
eval {
    warn "getting connection";
    $con = JDBC->getConnection($url, $user, $pass);
    warn "got connection";
};
die $@->getMessage() if $@;

my $sql     = $con->createStatement();
my $query   = 'select nodename, filename, file_type from ovpa_to_node';
my $results = $sql->executeQuery($query);

# This loop is the reason for this program's sluggish behavior
while ($results->next) {
    my $node = $results->getString(1);
    my $file = $results->getString(2);
    my $type = $results->getString(3);
    #print "Query Result: $node, $file, $type";
}
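(Editorial sketch, not part of the original report: one way to confirm that the fetch loop itself accounts for the missing ~14 seconds is to time it with the core Time::HiRes module. This assumes the same $results handle as in the program above.)

use Time::HiRes qw(gettimeofday tv_interval);

my $t0   = [gettimeofday];
my $rows = 0;
while ($results->next) {
    my $node = $results->getString(1);
    my $file = $results->getString(2);
    my $type = $results->getString(3);
    $rows++;
}
my $elapsed = tv_interval($t0);
# 1100 rows x 4 proxied Java calls per row = 4400 round trips
printf "fetched %d rows in %.2f s (%.1f ms per row)\n",
    $rows, $elapsed, $rows ? 1000 * $elapsed / $rows : 0;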
Subject: Re: [rt.cpan.org #21741] Unusual slowness when processing JDBC results with next & getString
Date: Wed, 27 Sep 2006 20:03:24 -0400
To: bug-Inline-Java [...] rt.cpan.org
From: "Patrick LeBoutillier" <patrick.leboutillier [...] gmail.com>
Joe,
> # This loop is the reason for this program's sluggish behavior
> while ($results->next) {
>     my $node = $results->getString(1);
>     my $file = $results->getString(2);
>     my $type = $results->getString(3);
>     #print "Query Result: $node, $file, $type";
> }
In this case you are calling Java 4400 times, and there is a significant overhead for each call because of all the "magic" going on behind the scenes. Using JNI mode should be faster. See the docs for more info.

I optimized Inline::Java as best as I could around 0.50, so I don't think there's much quick and dirty improvement left to be done.

Patrick

--
=====================
Patrick LeBoutillier
Laval, Québec, Canada
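(Editorial sketch, for illustration only: Patrick's suggestion refers to Inline::Java's JNI configuration option. The ticket's script goes through the JDBC wrapper module, and how that wrapper passes options down to Inline::Java is not shown here; the sketch below calls Inline::Java directly using its documented STUDY, AUTOSTUDY, JNI and CLASSPATH options, with the driver, URL and query taken from the report. It is untested.)

use strict;
use Inline (
    Java      => 'STUDY',
    STUDY     => [ 'java.lang.Class', 'java.sql.DriverManager' ],
    AUTOSTUDY => 1,                # wrap returned Connection/Statement/ResultSet objects on the fly
    JNI       => 1,                # run the JVM in-process instead of the default client/server mode
    CLASSPATH => $ENV{CLASSPATH},  # must include the DaffodilDB driver jar
);

# Register the driver, then run the same query as in the report.
java::lang::Class->forName('in.co.daffodil.db.rmi.RmiDaffodilDBDriver');
my $con = java::sql::DriverManager->getConnection(
    'jdbc:daffodilDB://localhost:3456/ovaa', 'DAFFODIL', 'daff0d1l');

my $sql     = $con->createStatement();
my $results = $sql->executeQuery('select nodename, filename, file_type from ovpa_to_node');
while ($results->next) {
    my $node = $results->getString(1);
    my $file = $results->getString(2);
    my $type = $results->getString(3);
}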
Subject: RE: [rt.cpan.org #21741] Unusual slowness when processing JDBC results with next & getString
Date: Thu, 28 Sep 2006 11:25:18 +0100
To: <bug-Inline-Java [...] rt.cpan.org>
From: "Workman, Joe" <joe.workman [...] gs.com>
Thank you very much. The JNI did improve the performance slightly.

Cheers,
Joe
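(Editorial aside, not something proposed in the thread: the other lever Patrick's explanation points at is reducing the number of Perl-to-Java round trips. A hypothetical helper class defined through Inline::Java could pull a whole row per call, one crossing per row instead of four. The sketch assumes $results is a java.sql.ResultSet proxy living in the same JVM, and that returned Java string arrays can be dereferenced as Perl array references, as Inline::Java's array support suggests. Untested.)

use Inline Java => <<'END_JAVA';
import java.sql.ResultSet;
import java.sql.SQLException;

public class RowFetcher {
    public RowFetcher() { }

    // Return the next row as {nodename, filename, file_type},
    // or null (undef on the Perl side) when the result set is exhausted.
    public String[] nextRow(ResultSet rs) throws SQLException {
        if (!rs.next()) {
            return null;
        }
        return new String[] { rs.getString(1), rs.getString(2), rs.getString(3) };
    }
}
END_JAVA

my $fetcher = RowFetcher->new();
while (my $row = $fetcher->nextRow($results)) {
    my ($node, $file, $type) = @$row;   # one Java call per row instead of four
}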
Due to the long passage of time, I am closing this ticket. If the problem still exists, please reopen it (with up-to-date data) and I will be pleased to take another look!