Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site harvard.ARPA
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!godot!harvard!breuel
From: breuel@harvard.ARPA (Thomas M. Breuel)
Newsgroups: net.lang.pascal,net.lang.c
Subject: PASCAL as a systems programming language
Message-ID: <252@harvard.ARPA>
Date: Sun, 6-Jan-85 15:51:11 EST
Article-I.D.: harvard.252
Posted: Sun Jan  6 15:51:11 1985
Date-Received: Tue, 8-Jan-85 02:41:00 EST
Distribution: net
Organization: Harvard University
Lines: 28
Xref: watmath net.lang.pascal:170 net.lang.c:3737

The fact that PASCAL has clear semantics and consistent syntax makes it
very usable for 'Programming 101'.  The fact that PASCAL also has a rich
set of data structures and rudimentary mechanisms for modular program
design allows it to be used for systems programming as well.  The
language definition by Jensen & Wirth leaves extensions for integration
into a host environment unspecified (although an example is given).
All implementations that I have encountered (except for Berkeley/UN*X
'pc') provide such extensions, and with them PASCAL is an excellent
tool for systems programming.  Calling it a 'toy' language is
completely unjustified.

'C', on the other hand, is a usable, although a bit outdated, workhorse
for architectures very similar to the PDP, VAX, or 68000.
Unfortunately, 'C' can be adapted only with great difficulty to other
architectures, like the Cyber 173, the DEC-20, &c. (don't misunderstand
me: 'C' compilers for these machines exist; it is just that practically
no program that Joe Random Hacker produces on a VAX will run on them
unaltered; the first postscript below sketches a typical reason why).
Other problems of the language include that its syntax appears
convoluted or counter-intuitive to many people (e.g.
"return (fun)(*(fun)rtoi(cdr(def)))();", which the second postscript
below takes apart), and that 'C' is hard to optimise or to adapt to
parallel execution.

I don't want to start another PASCAL vs. 'C' debate here.  Which
language you prefer depends mostly on your taste (both of them are
Turing equivalent :-).

	Thomas.
	breuel@harvard.
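
PS: Since the portability claim usually draws a "show me" reply, here is
a minimal sketch of the kind of code I have in mind.  It is entirely
hypothetical (not taken from any real program), and every commented line
is an assumption that merely happens to hold on a VAX.

    #include <stdio.h>

    int main()
    {
        long word;
        char *p;
        int  addr;

        word = 0x41424344;      /* assumes 32-bit longs built out of 8-bit bytes */
        p    = (char *) &word;  /* assumes individually addressable bytes        */
        addr = (int) p;         /* assumes a pointer fits in an int              */

        printf("%c %d\n", p[3], addr);
        return 0;
    }

On a little-endian VAX with 8-bit bytes this prints an 'A' followed by
some address; on a 60-bit Cyber or a 36-bit DEC-20 every one of the
commented assumptions fails, even though a compiler will happily accept
the program on all of these machines.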
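
PPS: For anyone squinting at that cast expression, here is one way to
read it.  This is my own reconstruction, not code from any real program:
the names fun, rtoi, cdr and def are taken from the example above, but
the typedef and the stub definitions below are invented purely so that
the fragment parses and runs.

    #include <stdio.h>

    typedef int (*fun)();               /* assumption: 'fun' names a pointer-to-function type */

    int forty_two() { return 42; }      /* some function to be called indirectly */

    long cdr(long def) { return def; }  /* stand-in for the real cdr()                  */
    long rtoi(long x)  { return x; }    /* stand-in: yields an address stored as a long */

    int apply(long def)
    {
        /* Reading the expression inside out:
         *   rtoi(cdr(def))   an integer that really holds a function's address
         *   (fun)...         reinterpret that integer as a pointer to a function
         *   (*...)()         dereference the pointer and call the function
         * The outermost (fun) cast in the original presumably forces the result
         * back into a function-pointer type before returning it; it is left out
         * here so that apply() can simply return the int the call produces.
         */
        return (*(fun)rtoi(cdr(def)))();
    }

    int main()
    {
        /* Assumes a function pointer fits in a long, as on the machines above. */
        printf("%d\n", apply((long) forty_two));    /* prints 42 */
        return 0;
    }

The whole trick rests on shuffling function addresses through integers
and back, which, incidentally, is also exactly the kind of thing that
does not survive the move to a Cyber or a DEC-20.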