Subject: | Crashes, incorrect checks and typos |
Summary:
There are a couple of mistakes in TVDB::API 0.33 which mean that
the update operations will always fail on updated episodes.
Additionally, there are typos in the warnings, and a mistake in the
operator used to assign to the database, which means that episodes can
be erased from the database by calling getEpisodeId.
Details:
There are a number of issues in the 0.33 version of TVDB::API. I have
addressed those that I can in the patch, attached. I shall describe
each and how it was observed, together with examples from the patch.
The first problem I saw was that the updates were not pulling in
episode changes at all - in particular, FlashForward 1x01 had been
fetched the day it was shown and contained no information other than
the episode title. As of today the TVDB contains a description for
this episode, but even after repeatedly running updates of the 'month'
type, no more data was obtained.
Debugging in the code showed that in the updates, the following line
was being triggered:
next unless defined $series->{$ep->{Series}};
Indeed, $series->{$ep->{Series}} is not defined. This is because
$ep->{Series} is an arrayref: ForceArray has been set on 'Series'.
This has had two effects: firstly, the check above skips every episode,
so updates are never applied; secondly, the database has filled with
large numbers of keys named 'ARRAY(0x########)', which are probably
making it slower and larger than it needs to be.
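As an aside, the stringification is easy to demonstrate in isolation;
this is a minimal sketch with made-up data, not code from the module:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# ForceArray on 'Series' means the parsed episode carries an arrayref,
# not a plain series id (73545 here is a made-up id):
my $ep = { Series => [ 73545 ] };

# Using the arrayref as a hash key stringifies it to "ARRAY(0x...)":
my %series;
$series{ $ep->{Series} } = 1;

my ($key) = keys %series;
print "stored key: $key\n";
```

Every parse produces a fresh arrayref at a new address, so each such
lookup-as-assignment mints a brand-new junk key rather than reusing an
old one.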
A fix for this was to change the behaviour so that we read the single
seriesid from the arrayref, thus:
my $sid = $ep->{Series};
$sid = $sid->[0] if (ref $sid eq 'ARRAY');
# Don't update if we don't already have this series
next unless defined $series->{$sid};
A similar change is necessary for the banners code that follows it,
to ensure that updates are applied to images which have already been
downloaded. In my own cache db, I added some code to clean up the
redundant 'ARRAY(0x########)' keys, but this has not been included
in this patch.
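For reference, the clean-up amounted to something like the following
sketch (the sample cache contents are hypothetical, and this is not
part of the attached patch):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical cache contents: one real series id plus one stray key
# left behind by the arrayref stringification bug.
my $series = {
    73545               => { SeriesName => 'FlashForward' },
    'ARRAY(0x12345678)' => 1,
};

# Delete any key that is a stringified arrayref rather than a series id
for my $key (keys %$series) {
    delete $series->{$key} if $key =~ /^ARRAY\(0x[0-9a-f]+\)$/;
}

print join(", ", sort keys %$series), "\n";   # only the real id remains
```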
Having fixed this, fetches of episodes were failing with errors
(dying out of Perl) complaining that a numeric value could not be
used as a hashref. Debugging showed that the expression:
$episodes->{$eid}->{lastupdated}
was the problem; $episodes->{$eid} was not a hashref, but instead it
was a number. Hunting for how this was assigned pointed to
getEpisodeId, thus:
$cache->{Episode}->{$eid} = $new->{Episode}-{$eid};
which shouldn't be a subtraction; it should be a ->.
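The effect of the typo is easy to reproduce: with '-' instead of '->',
both operands are references in numeric context, so they degrade to
their addresses and the subtraction yields a plain number, which is
what then gets stored in the cache. A minimal sketch with made-up data
(warnings for the odd-element anonymous hash are silenced):

```perl
#!/usr/bin/perl
use strict;
use warnings;
no warnings 'misc';   # silence "Odd number of elements in anonymous hash"

my $eid = 42;
my $new = { Episode => { $eid => { lastupdated => 1 } } };

# The typo'd expression: '-' subtracts the refs' addresses numerically
my $value = $new->{Episode} - {$eid};

print "ref: '", ref($value), "', value: $value\n";   # not a hashref
```

Any later code that does $value->{lastupdated} on the stored number
then dies with the "can't use a number as a hashref" error described
above.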
As a consequence of this breakage, previous use of the getEpisodeId
function might have inserted rubbish into the database. To clear this
up and thus prevent the library from crashing due to the use of a
number as a hashref, a small protection is added to the update routines:
if (defined $episodes->{$eid} && !ref $episodes->{$eid})
{
    $episodes->{$eid} = { lastupdated => 0 };
}
which will force the update of that episode id, replacing the old value.
I found it difficult to understand the 'unless' clause which followed
this, and the updates were not happening properly for the episodes, so
I restructured the code to use:
next unless defined $episodes->{$eid};
# Update if there is a more recent version
if ($ep->{time} > $episodes->{$eid}->{lastupdated}) {
...
which is closer to what the series code does, and looks more obvious
(plus it caused all the episodes to update as I expected).
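Putting the guard and the restructured comparison together, the update
path behaves like the following sketch; the sample data and the loop
scaffolding are mine, not the module's, but the two conditionals are
the ones described above:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Sample cache: episode 101 was corrupted to a number by the old
# getEpisodeId bug; episode 102 is a normal record.
my $episodes = {
    101 => 123456789,
    102 => { lastupdated => 200 },
};

my @updates = (
    { id => 101, time => 300 },   # newer than the (repaired) record
    { id => 102, time => 100 },   # older than the cached record
);

for my $ep (@updates) {
    my $eid = $ep->{id};

    # Repair any numeric value left behind by the broken getEpisodeId,
    # forcing a refetch of that episode.
    if (defined $episodes->{$eid} && !ref $episodes->{$eid}) {
        $episodes->{$eid} = { lastupdated => 0 };
    }

    # Don't update if we don't already have this episode
    next unless defined $episodes->{$eid};

    # Update if there is a more recent version
    if ($ep->{time} > $episodes->{$eid}->{lastupdated}) {
        # ... the real module would refetch episode $eid here ...
        $episodes->{$eid}{lastupdated} = $ep->{time};
    }
}

print "101: $episodes->{101}{lastupdated}, ",
      "102: $episodes->{102}{lastupdated}\n";
```

With this data, episode 101 is repaired and updated while episode 102
is left alone because its cached copy is already newer.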
Finally, there were a lot of typos of TVDB as TBDB, which I fixed.
Those may confuse the diffs slightly, but they were easy enough to
include.
I have attached both the diffs and the patched version of the API.pm
file.
Admin:
Perl version in use here: 5.8.8.
Distribution: Debian 'etch'.
Kernel: 2.6.30.1
Flagged as Important because the update functionality is failing, and
the issue with numeric values being used as hashrefs can cause the
library to crash.
Although I've observed the problem in 0.33, I believe these issues have
been present in quite a few earlier versions.
Subject: | diffs-033.txt |
Message body is not shown because it is too large.
Subject: | API.pm |
Message body is not shown because it is too large.