Discussion:
[perl #52198] [BUG]: Test failures on Win32: t/op/arithmetics.t t/op/sprintf.t t/pmc/complex.t t/pmc/float.t
James Keenan
2008-03-28 18:06:40 UTC
# New Ticket Created by James Keenan
# Please include the string: [perl #52198]
# in the subject line of all future correspondence about this issue.
# <URL: http://rt.perl.org/rt3/Ticket/Display.html?id=52198 >


This is one of a series of tickets reporting issues encountered at a
Parrot/Rakudo buildfest held at Toronto Perlmongers on March 27, 2008.

An experienced Perl-Win32 developer was able to configure Parrot
successfully with 'perl Configure.pl' and build it with 'nmake'. All
Parrot tests passed except those in the 4 named files. See
attachment for output of 'prove -v'.

Environment:
Windows XP
MSVC 6.0
Chromatic
2008-03-28 18:41:01 UTC
Post by James Keenan
An experienced Perl-Win32 developer was able to configure Parrot
successfully with 'perl Configure.pl' and build it with 'nmake'. All
Parrot tests passed except those in the 4 named files. See
attachment for output of 'prove -v'.
Any chance of getting the diagnostic output from the failing tests?

-- c
James Keenan via RT
2008-03-28 18:55:30 UTC
Post by Chromatic
Any chance of getting the diagnostic output from the failing tests?
Am confused. What diagnostic output beyond 'prove -v' are you referring to?

Once I complete all the postings from this session, I'll be posting
links to the various RTs on the Toronto.pm list. It will be up to the
original participants to pursue matters from there. I'm hoping that
they will provide follow-up and thereby begin to be involved in the
Parrot project, but that's their choice.
James Keenan via RT
2008-03-28 19:14:20 UTC
Ah, I see that all I was sent was the output of 'prove' -- not 'prove -v'.

I'll have to see if the original tester can re-run the tests.
Chromatic
2008-03-28 19:06:51 UTC
Post by James Keenan via RT
Am confused. What diagnostic output beyond 'prove -v' are you referring to?
For example...

t/op/arithmetics....1..26
ok 1 - take the negative of a native integer
ok 2 - take the absolute of a native integer
ok 3 - add native integer to native integer
ok 4 - subtract native integer from native integer
ok 5 - multiply native integer with native integer
ok 6 - divide native integer by native integer
not ok 7 - turn a native number into its negative

... diagnostic output missing right here...

ok 8 - take the absolute of a native number

The test expected the output:

-0.000000
0.000000
-123.456789
123.456789
-0.000000
0.000000
-123.456789
123.456789

Obviously it received something else. What did it receive?

-- c
Ron Blaschke
2008-03-29 15:41:02 UTC
Post by Chromatic
Post by James Keenan via RT
Am confused. What diagnostic output beyond 'prove -v' are you referring to?
For example...
t/op/arithmetics....1..26
ok 1 - take the negative of a native integer
ok 2 - take the absolute of a native integer
ok 3 - add native integer to native integer
ok 4 - subtract native integer from native integer
ok 5 - multiply native integer with native integer
ok 6 - divide native integer by native integer
not ok 7 - turn a native number into its negative
... diagnostic output missing right here...
ok 8 - take the absolute of a native number
-0.000000
0.000000
-123.456789
123.456789
-0.000000
0.000000
-123.456789
123.456789
Obviously it received something else. What did it receive?
As far as I know, Visual C++ is a bit shaky on the arithmetic side with
regard to +/- 0.0. For example, the constant C<-0.0> is turned into
C<0.0>. One has to resort to hacks like C<-1.0 * 0.0>, if I recall
correctly.

t/op/arithmetics.t
not ok 7 - turn a native number into its negative
# Failed test 'turn a native number into its negative'
# at t\op\arithmetics.t line 210.
# got: '0.000000
# 0.000000
# -123.456789
# 123.456789
# 0.000000
# 0.000000
# -123.456789
# 123.456789
# '
# expected: '-0.000000
# 0.000000
# -123.456789
# 123.456789
# -0.000000
# 0.000000
# -123.456789
# 123.456789
# '

t/op/sprintf.t
not ok 157 - [%.0g] C99 standard mandates minus sign but C89 does not
skip: MSWin32 VMS hpux:10.20 openbsd netbsd:1.5 irix actual: >0<

t/pmc/complex.t
not ok 48 - sinh of complex numbers
# Failed test 'sinh of complex numbers'
# at t\pmc\complex.t line 1647.
# got: '
# sinh(0-2i)
# got 0.000000-0.909297i
# not -0.000000-0.909297i
#
# sinh(0+2i)
# got 0.000000+0.909297i
# not -0.000000+0.909297i
# done
# '
# expected: 'done
# '

t/pmc/float.t
not ok 23 - neg 0
# Failed test 'neg 0'
# at t\pmc\float.t line 547.
# '0'
# doesn't match '/^-0/
# '

Ron
James Keenan via RT
2008-06-18 12:16:34 UTC
Post by Ron Blaschke
t/op/sprintf.t
not ok 157 - [%.0g] C99 standard mandates minus sign but C89 does not
skip: MSWin32 VMS hpux:10.20 openbsd netbsd:1.5 irix actual: >0<
This particular TODO-ed test has begun to "unexpectedly" PASS for me on
linux-i386 and darwin-ppc.

Has it begun to pass on MSWin32 and the other systems?

kid51
James Keenan via RT
2008-07-30 23:01:23 UTC
Would it be possible to get an update from Win32 users on these tests?

Specifically, output of 'prove -v' for these tests:

t/op/arithmetics.t

t/pmc/complex.t

t/pmc/float.t


(I'm going to assume that the sprintf test is still problematic.)

Thank you very much.
kid51
James Keenan via RT
2008-07-30 23:09:46 UTC
Interestingly enough, we are also getting failures on these 4 test files
on the OpenBSD Smolder tester:

http://smolder.plusthree.com/app/public_projects/report_details/3135

But, AFAICT, the Smolder server doesn't identify the particular tests
within the file which are failing.

kid51
Will Coleda
2008-07-30 23:25:42 UTC
On Wed, Jul 30, 2008 at 7:09 PM, James Keenan via RT
Post by James Keenan via RT
Interestingly enough, we are also getting failures on these 4 test files
http://smolder.plusthree.com/app/public_projects/report_details/3135
But, AFAICT, the Smolder server doesn't identify the particular tests
within the file which are failing.
kid51
Click "Goto first failure". Click "Failed". Click the box of the same
color as the failed box. You should see the TAP output:

not ok 5 - Float NaN
# Failed test 'Float NaN'
# at t/compilers/imcc/syn/veracity.t line 113.
# got: ''
# expected: 'NaN is true
# '
# Looks like you failed 1 test of 5.

Regards.
--
Will "Coke" Coleda
James Keenan via RT
2008-07-30 23:58:56 UTC
Post by Will Coleda
Click "Goto first failure". Click "Failed". Click the box of the same
Thanks, Coke. That enabled me to verify that for these 3 test files:

t/op/arithmetics.t

t/pmc/complex.t

t/pmc/float.t


... the failures we're getting on the OpenBSD Smolder server are in the
same individual tests that Ron B. reported on Win32.

kid51
Ron Blaschke
2008-12-21 19:19:40 UTC
Post by Ron Blaschke
t/op/arithmetics.t
t/pmc/complex.t
t/pmc/float.t
For the record, according to our Smolder reports for MSWin32, these 3
files have tests that are either still TODO-ed out or are failing
outright on some builds.
There are a number of tests skipped that pass for me on Windows using
VC9. I'll have to check with MinGW, but many may no longer be relevant.
I'm working on that in /branches/vc9. Unless I'm mistaken, many issues
came up because of VC7.1's strange floating point handling.

Ron

Ron Blaschke
2008-12-21 19:13:25 UTC
Post by Will Coleda
On Wed, Jul 30, 2008 at 7:09 PM, James Keenan via RT
Post by James Keenan via RT
Interestingly enough, we are also getting failures on these 4 test files
http://smolder.plusthree.com/app/public_projects/report_details/3135
But, AFAICT, the Smolder server doesn't identify the particular tests
within the file which are failing.
kid51
Click "Goto first failure". Click "Failed". Click the box of the same
not ok 5 - Float NaN
# Failed test 'Float NaN'
# at t/compilers/imcc/syn/veracity.t line 113.
# got: ''
# expected: 'NaN is true
# '
# Looks like you failed 1 test of 5.
BTW, this happens because atof does not handle the 'NaN' string. I've
fixed this in /branches/vc9.
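For reference, the gap is that C89 atof()/strtod() do not recognize the string "NaN" (C99 strtod() does, but older vendor runtimes return 0.0 for it). A portable fallback looks roughly like the following sketch; str_to_num is a hypothetical name for illustration, not the actual fix in /branches/vc9:

```c
#include <ctype.h>
#include <math.h>
#include <stdlib.h>
#include <strings.h>   /* strncasecmp; MSVC spells it _strnicmp */

/* Fallback for runtimes whose strtod predates C99 and returns 0.0
   for "NaN" instead of a quiet NaN. */
static double str_to_num(const char *s)
{
    const char *p = s;
    while (isspace((unsigned char)*p))
        p++;
    if (*p == '+' || *p == '-')
        p++;
    if (strncasecmp(p, "nan", 3) == 0)
        return NAN;               /* quiet NaN from C99 <math.h> */
    return strtod(s, NULL);
}
```

With this in place, isnan(str_to_num("NaN")) holds even where the native atof() would have yielded 0.0.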

Ron