
This queue is for tickets about the SQL-Translator CPAN distribution.

Report information
The Basics
Id: 121901
Status: patched
Priority: 0/
Queue: SQL-Translator

People
Owner: Nobody in particular
Requestors: ppisar [...] redhat.com
Cc:
AdminCc:

Bug Information
Severity: (no value)
Broken in: 0.11021
Fixed in: (no value)



Subject: t/23json.t test fails with JSON-PP-2.93
After upgrading JSON-PP from 2.27400 to 2.93, the t/23json.t test fails. There is a difference in "order" values. It expects an unquoted number (3), but the new JSON-PP quotes the value ("3"). I do not paste the log here because it's large.
From: ppisar [...] redhat.com
On Fri, 26 May 2017 03:07:37, ppisar wrote:
> After upgrading JSON-PP from 2.27400 to 2.93, the t/23json.t test fails.
> There is a difference in "order" values. It expects an unquoted number
> (3), but the new JSON-PP quotes the value ("3"). I do not paste the
> log here because it's large.
Vice versa. It expects quotes, but it gets unquoted:

# |  39| "age" : {                     | "age" : {                     |
# |  40|    "data_type" : "integer",   |    "data_type" : "integer",   |
# |  41|    "default_value" : null,    |    "default_value" : null,    |
# |  42|    "is_nullable" : 1,         |    "is_nullable" : 1,         |
# |  43|    "is_primary_key" : 0,      |    "is_primary_key" : 0,      |
# |  44|    "is_unique" : 0,           |    "is_unique" : 0,           |
# |  45|    "name" : "age",            |    "name" : "age",            |
# *  46|    "order" : 3,               |    "order" : "3",             *
# |  47|    "size" : [                 |    "size" : [                 |
# |  48|       "0"                     |       "0"                     |
# |  49|    ]                          |    ]                          |
# |  50| },                            | },                            |
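A minimal sketch (assuming a current JSON::PP is installed) of why the same logical value can come out quoted or unquoted: the encoder decides based on whether the Perl scalar carries a numeric or a string flag.

use strict;
use warnings;
use JSON::PP;

my $encoder = JSON::PP->new;

# A scalar created as a number carries a numeric flag and is encoded bare;
# one created as a string is encoded with quotes.
my $numeric_order = 3;
my $stringy_order = "3";

print $encoder->encode({ order => $numeric_order }), "\n";   # {"order":3}
print $encoder->encode({ order => $stringy_order }), "\n";   # {"order":"3"}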
From: ppisar [...] redhat.com
On Fri, 26 May 2017 03:07:37, ppisar wrote:
> After upgrading JSON-PP from 2.27400 to 2.93, the t/23json.t test fails.
This can be reduced to:

$ perl -I/tmp/JSON-PP/lib -Ilib -MSQL::Translator -e 'print SQL::Translator->new(parser => q{SQLite}, producer => q{JSON}, data => q{create table person ( person_id INTEGER);})->translate, qq{\n}'
{"translator":{"add_drop_table":0,"producer_args":{},"no_comments":0,"filename":null,"parser_type":"SQL::Translator::Parser::SQLite","trace":0,"parser_args":{},"version":"0.11021","producer_type":"SQL::Translator::Producer::JSON","show_warnings":0},"schema":{"triggers":{},"views":{},"tables":{"person":{"fields":{"person_id":{"order":1,"default_value":null,"is_unique":0,"is_primary_key":0,"name":"person_id","data_type":"INTEGER","size":["0"],"is_nullable":1}},"options":[],"constraints":[],"name":"person","indices":[],"order":1}},"procedures":{}}}

Only the "order" value is affected. It's a number and thus it should not be quoted. It's not a string.

I think this is actually a fix in JSON::PP and SQL-Translator should correct the test.

This is triggered by this JSON-PP commit:

commit 87bd6a49bacc3a2634cbb1dd0ce9cc75675bb524 (HEAD, refs/bisect/bad)
Author: Graham Knop <haarg@haarg.org>
Date:   Mon Feb 1 18:36:21 2016 -0500

    abuse bitwise ops to check for num vs string to avoid B.pm

    B.pm is a rather large module to load, and numbers and strings can be
    detected by abusing the behavior of bitwise operators. Using a bitwise
    operator is faster as well.

    A string & "" results in ""
    A number & "" results in 0 (with a warning)

    Some variables won't be fully caught by this check alone. A string
    like "0.0" that has been used in a numeric context will pass the
    check as if it was a number. We detect this by making sure the string
    is the same as stringifying the numeric form. "nan" and "inf" that
    have been used as numbers will also pass that check, so we detect
    them by making sure multiplying by 0 gives 0.
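Sketched roughly in Perl, the check that commit message describes behaves like the following (a simplification, not JSON::PP's actual code; the real implementation has further guards, e.g. for "nan" and "inf"):

use strict;
use warnings;
no warnings 'numeric';   # numifying "" in the bitwise op warns otherwise

# Simplified version of the num-vs-string detection described above.
sub looks_like_number_slot {
    my ($value) = @_;

    # A plain string & "" gives "" (string bitwise op);
    # a number & "" gives 0 (numeric bitwise op).
    return 0 if ($value & "") eq "";

    # Strings such as "0.0" that were used in numeric context slip past the
    # bitwise check, so also require that stringifying the numeric form
    # reproduces the original value.
    return 0 unless $value eq ($value + 0);

    return 1;
}

my %examples = (
    'number 3'     => 3,
    'string "3"'   => "3",
    'number 2.5'   => 2.5,
    'string "abc"' => "abc",
);

for my $label (sort keys %examples) {
    printf "%-12s -> %s\n", $label,
        looks_like_number_slot($examples{$label}) ? 'number' : 'string';
}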
From: ppisar [...] redhat.com
On Fri, 26 May 2017 04:05:16, ppisar wrote:
> I think this is actually a fix in JSON::PP and SQL-Translator
> should correct the test.
The fix is attached. It's a patch against the latest git master HEAD.
Subject: 0001-Adapt-tests-to-JSON-PP-2.92.patch
From bf5c5c0689394cddccc9a23d897ab58fb3f5b32a Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Petr=20P=C3=ADsa=C5=99?= <ppisar@redhat.com>
Date: Fri, 26 May 2017 10:43:05 +0200
Subject: [PATCH] Adapt tests to JSON-PP-2.92
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

JSON-PP-2.91_01 fixed numeric value detection and that caused a
t/23json.t test failure. There was a change in "order" value quoting.
Previously, the value was encoded as a string; now it's encoded as a
number.

CPAN RT#121901

Signed-off-by: Petr Písař <ppisar@redhat.com>
---
 t/23json.t | 37 +++++++++++++++++++++----------------
 1 file changed, 21 insertions(+), 16 deletions(-)

diff --git a/t/23json.t b/t/23json.t
index 0f96e47..1f4f278 100644
--- a/t/23json.t
+++ b/t/23json.t
@@ -15,8 +15,7 @@ BEGIN {
 }
 
 my $sqlt_version = $SQL::Translator::VERSION;
-use JSON;
-my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
+my $json_string = <<JSON;
 {
    "schema" : {
       "procedures" : {},
@@ -62,7 +61,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 0,
                   "is_unique" : 0,
                   "name" : "age",
-                  "order" : "3",
+                  "order" : 3,
                   "size" : [
                      "0"
                   ]
@@ -74,7 +73,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 0,
                   "is_unique" : 0,
                   "name" : "description",
-                  "order" : "6",
+                  "order" : 6,
                   "size" : [
                      "0"
                   ]
@@ -86,7 +85,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 0,
                   "is_unique" : 0,
                   "name" : "iq",
-                  "order" : "5",
+                  "order" : 5,
                   "size" : [
                      "0"
                   ]
@@ -98,7 +97,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 0,
                   "is_unique" : 1,
                   "name" : "name",
-                  "order" : "2",
+                  "order" : 2,
                   "size" : [
                      "20"
                   ]
@@ -115,7 +114,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 1,
                   "is_unique" : 0,
                   "name" : "person_id",
-                  "order" : "1",
+                  "order" : 1,
                   "size" : [
                      "0"
                   ]
@@ -127,7 +126,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 0,
                   "is_unique" : 0,
                   "name" : "weight",
-                  "order" : "4",
+                  "order" : 4,
                   "size" : [
                      "11",
                      "2"
@@ -137,7 +136,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
             "indices" : [],
             "name" : "person",
             "options" : [],
-            "order" : "1"
+            "order" : 1
          },
          "pet" : {
            "constraints" : [
@@ -196,7 +195,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 0,
                   "is_unique" : 0,
                   "name" : "age",
-                  "order" : "4",
+                  "order" : 4,
                   "size" : [
                      "0"
                   ]
@@ -208,7 +207,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 0,
                   "is_unique" : 0,
                   "name" : "name",
-                  "order" : "3",
+                  "order" : 3,
                   "size" : [
                      "30"
                   ]
@@ -220,7 +219,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 1,
                   "is_unique" : 0,
                   "name" : "person_id",
-                  "order" : "2",
+                  "order" : 2,
                   "size" : [
                      "0"
                   ]
@@ -232,7 +231,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
                   "is_primary_key" : 1,
                   "is_unique" : 0,
                   "name" : "pet_id",
-                  "order" : "1",
+                  "order" : 1,
                   "size" : [
                      "0"
                   ]
@@ -241,7 +240,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
             "indices" : [],
             "name" : "pet",
             "options" : [],
-            "order" : "2"
+            "order" : 2
          }
       },
       "triggers" : {
@@ -259,7 +258,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
            "fields" : null,
            "name" : "pet_trig",
            "on_table" : "pet",
-           "order" : "1",
+           "order" : 1,
            "perform_action_when" : "after",
            "scope": "row"
         }
@@ -268,7 +267,7 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
         "person_pet" : {
            "fields" : [],
            "name" : "person_pet",
-           "order" : "1",
+           "order" : 1,
            "sql" : "select pr.person_id, pr.name as person_name, pt.name as pet_name\\n from person pr, pet pt\\n where person.person_id=pet.pet_id\\n"
         }
      }
@@ -290,6 +289,12 @@ my $json = to_json(from_json(<<JSON), { canonical => 1, pretty => 1 });
    }
 }
 JSON
+use JSON;
+if (JSON->is_pp && JSON->VERSION < 2.91_01) {
+    # JSON-PP-2.91_01 fixed numeric value detection, CPAN RT#121901
+    $json_string =~ s/("order" : )(\d+)(,?)/$1"$2"$3/g;
+}
+my $json = to_json(from_json($json_string), { canonical => 1, pretty => 1 });
 
 my $file = "$Bin/data/sqlite/create.sql";
 open my $fh, '<', $file or die "Can't read '$file': $!\n";
-- 
2.9.4
Hi Petr,

Thanks for the report, and sorry for the slow reply.

Instead of tweaking the JSON, I've changed the test (and the YAML one) to decode both the expected and generated strings and compare the resulting data structures with 'is_deeply()'. This should be more robust against future changes in serialisers, while still testing that we generate the right shape of data.

This is in git as https://github.com/dbsrgits/sql-translator/commit/49f0f316cadb67de79e4b361e38d0139cf3b5fc4 and will be in the next release.

Regards,
Ilmari
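The approach described above boils down to something like this minimal sketch (placeholder JSON strings, not the actual committed test at the commit linked above):

use strict;
use warnings;
use Test::More tests => 1;
use JSON;

# Placeholder inputs: in the real test these are the heredoc fixture and
# the string produced by SQL::Translator's JSON producer.
my $expected_json  = '{"name":"person_id","order":3}';
my $generated_json = '{"name":"person_id","order":"3"}';

# Comparing decoded data structures instead of serialised text makes the
# test indifferent to details such as number quoting or key ordering.
is_deeply(
    from_json($generated_json),
    from_json($expected_json),
    'generated schema matches the expected structure',
);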