Path: utzoo!attcan!uunet!lll-winken!lll-tis!helios.ee.lbl.gov!pasteur!ucbvax!bloom-beacon!athena.mit.edu!scs
From: scs@athena.mit.edu (Steve Summit)
Newsgroups: comp.lang.c
Subject: Re: Curious about function prototypes...
Summary: what's wrong with old-style function definitions?
Message-ID: <5808@bloom-beacon.MIT.EDU>
Date: 17 Jun 88 02:58:26 GMT
References: <654@orion.cf.uci.edu> <8073@brl-smoke.ARPA>
Sender: daemon@bloom-beacon.MIT.EDU
Reply-To: scs@adam.pika.mit.edu (Steve Summit)
Lines: 27

In article <8073@brl-smoke.ARPA> gwyn@brl.arpa (Doug Gwyn (VLD/VMB)) writes:
>However, it is not a good idea to mix new-style (prototype) and old-style
>function syntax.  The proposed ANSI C provides rules for resolving the
>collision of these two ways of doing business, but that's intended for
>grandfathering in existing code, not to be used for new code.  New code
>should consistently use prototype style for both declarations and
>definitions.

How important is this?  I'm starting to use prototypes for external
declarations, because they're easy to #ifdef out and they give me some
(not all :-( ) of the advantages of lint on systems without it, but I'm
going to use the old style in function definitions (i.e. the thing that
sits at the top of the actual body of the function) for quite a while,
to ensure portability to non-ANSIfied compilers (such as my '11 at
home, which is not likely to get an ANSI-style compiler, ever).

How much, and why, are old-style definitions disparaged in new-style
code?  Microsoft's compiler warns about them (with the maximum warning
level set); I wish it wouldn't, because then the maximum warning level
(/W3) would be almost as good as lint.

Is it wrong for a compiler to construct some kind of Miranda prototype
when it sees an old-style function definition, which it could use for
lint-style argument-mismatch warnings (*not* for ANSI-style implicit
argument coercions)?

                                        Steve Summit
                                        scs@adam.pika.mit.edu
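
P.S. For concreteness, here is roughly the scheme I mean.  This is
just a sketch; the PROTO macro is my own convention, not anything
blessed by the draft standard:

    /* In a header file: a declaration that reads as a prototype
     * under an ANSI compiler and as an old-style declaration
     * everywhere else. */
    #ifdef __STDC__
    #define PROTO(args) args
    #else
    #define PROTO(args) ()
    #endif

    extern double sumsq PROTO((double x, double y));

    /* In the .c file: an old-style definition, accepted by old
     * and ANSI compilers alike.  I stick to parameter types that
     * are unchanged by the default argument promotions (double,
     * int, pointers), so the definition stays compatible with
     * the prototype above. */
    double
    sumsq(x, y)
    double x, y;
    {
        return x * x + y * y;
    }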
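(The doubled parentheses are there so that the whole argument list,
comma and all, reaches the macro as a single argument.)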