Path: utzoo!utgpu!jarvis.csri.toronto.edu!mailrus!csd4.csd.uwm.edu!cs.utexas.edu!uunet!mcvax!ukc!stl!idec!prlhp1!yuleat
From: yuleat@prlhp1.prl.philips.co.uk (yuleat)
Newsgroups: comp.lang.c
Subject: Nude Vicar in Sex Romp!
Message-ID: <960@prlhp1.prl.philips.co.uk>
Date: 15 Aug 89 13:28:56 GMT
Reply-To: yuleat@prlhp1.UUCP ()
Organization: Philips Research Laboratories, Redhill, UK
Lines: 44

Now that I've got your attention, why not consider the following
piece of code (which has nothing to do with nude vicars!):

#include <stdio.h>

main()
 {
    float x= 0.1234;

    fn1(x);
 }

fn1(x)
float x;
 {
    float y= x;

    fn2(&x, &y);
 }

fn2(x, y)
float *x, *y;
 {
    printf("In fn2 values are x= %f & y= %f\n", *x, *y);
 }


The interesting thing is that on both of the compilers I've tried
this on (HP & Apollo), the two values printed by the printf() are
different.  Specifically, x is gibberish, whilst y is correct
(0.1234).  I believe that the reason for this is that x in main()
is converted to a double as it is passed to fn1(), so the pointer
that is passed to fn2() is really a pointer to a double, and hence
when it is de-referenced as a pointer to a float it gives the
"wrong" answer.
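
If that diagnosis is right, the effect is easy to reproduce
directly.  Here's a minimal sketch in ANSI-style C (the exact
garbage you see will depend on your machine's byte order and
floating-point format) which reads the first sizeof(float) bytes
of a double back as a float, which is effectively what fn2() ends
up doing with x:

#include <stdio.h>
#include <string.h>

int main(void)
 {
    double d = 0.1234;
    float f;

    /* Reinterpret the first sizeof(float) bytes of the double
       as a float -- effectively what fn2() does when it is
       handed a pointer that really points at a double. */
    memcpy(&f, &d, sizeof f);

    printf("double = %f, reinterpreted as float = %f\n", d, f);
    return 0;
 }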

What I would like to know is whether this is what the compiler
should do (I've looked in K&R but couldn't find anything that
addresses this case specifically).
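
For comparison, here's a sketch of the same program rewritten with
ANSI-style prototypes (assuming an ANSI-conforming compiler): with
a prototype in scope the argument is passed as a genuine float, no
promotion to double takes place, and both values should print
correctly:

#include <stdio.h>

void fn2(float *x, float *y)
 {
    printf("In fn2 values are x= %f & y= %f\n", *x, *y);
 }

void fn1(float x)               /* x stays a true float here */
 {
    float y = x;

    fn2(&x, &y);
 }

int main(void)
 {
    float x = 0.1234f;

    fn1(x);
    return 0;
 }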

----------------------------------------------------------------------
Andy Yule                                       Philips Research Labs,
(yuleat@prl.philips.co.uk)                           Redhill, England.
======================================================================