Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10 5/3/83; site umcp-cs.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!mhuxl!houxm!houxz!vax135!floyd!cmcl2!seismo!rlgvax!cvl!umcp-cs!chris
From: chris@umcp-cs.UUCP
Newsgroups: net.lang.c
Subject: Re: C compiler for pdp-11 under RSX
Message-ID: <7447@umcp-cs.UUCP>
Date: Sun, 10-Jun-84 22:33:03 EDT
Article-I.D.: umcp-cs.7447
Posted: Sun Jun 10 22:33:03 1984
Date-Received: Mon, 11-Jun-84 23:50:34 EDT
References: <1880@sdccsu3.UUCP>, <192@dicomed.UUCP> <2849@brl-vgr.ARPA>, <873@orca.UUCP>, <3946@utzoo.UUCP> <2525@allegra.UUCP>
Organization: Univ. of Maryland, Computer Science Dept.
Lines: 39

I agree with this:

	From: alan@allegra.UUCP (Full Name Has Been Stripped By Bugs)
				(but it was Alan S. Driscoll)

	To export a symbol, you say something like

		int foo;

	To import it, you say

		extern int foo;

	There really isn't any problem.
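
For concreteness, here is a minimal two-file sketch of that scheme
(the file names are made up for illustration):

    /* def.c -- "exports" foo by giving it a defining declaration */
    int foo;

    /* use.c -- "imports" foo and uses it */
    extern int foo;

    int
    main()
    {
            foo = 42;
            return foo;
    }

Compile and link the two together (e.g. "cc def.c use.c") and the
loader resolves both references to the same foo.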

My question:  is the following legal?

    extern int foo;
    .
    .
    .
    int foo;

(Within the same file.)  How about if the order is reversed?
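
For anyone who wants to feed the question to their own compiler, a
self-contained test covering both orders might look like this (foo
and bar are just placeholder names):

    /* declare first, then define */
    extern int foo;
    int foo;

    /* define first, then declare (the reversed order) */
    int bar;
    extern int bar;

    int
    main()
    {
            foo = 1;
            bar = 2;
            return foo + bar;
    }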

I know of no machine's loader format, no matter how bizarre, that
could not handle this (after all, you can always have the compiler
set up to effectively say ``if I see a non-extern declaration
anywhere in the source, allocate space for the variable; if not,
"import" it'').  But there is apparently at least one machine out
there whose C compiler rejects this.  Can I simply declare that
machine's compiler to be incorrect, or do I have to resort to
the ``#define extern'' haque?

[A ``hack'' is a hack, but a ``haque'' is an Elegant Hack. :-)]
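
For those who have not run into it, the hack usually looks something
like the sketch below: all the globals are declared ``extern'' in one
header, and exactly one source file turns those declarations into
definitions (this assumes none of them needs an initializer):

    /* globals.h -- included by every file that uses the globals */
    extern int foo;
    extern int bar;

    /* main.c -- the one file that actually allocates the space */
    #define extern
    #include "globals.h"
    #undef extern

Every other file just includes globals.h as-is.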
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci (301) 454-7690
UUCP:	{seismo,allegra,brl-bmd}!umcp-cs!chris
CSNet:	chris@umcp-cs		ARPA:	chris@maryland