From: preece@ccvaxa.UUCP
Newsgroups: net.lang.c
Subject: Re: C programming style
Message-ID: <2600005@ccvaxa>
Date: Thu, 11-Jul-85 10:49:00 EDT
References: <11434@brl-tgr.ARPA>

> > Typing has nothing to do with it; "i++" is clearer than "i = i + 1"
> > to me, especially (as someone else has pointed out) when 'i' is a
> > longish expression.
>
> Maybe "i++" is clearer to you, but do you only write programs for
> yourself? To me "i++" is the kind of statement typical of an APL-type
> language, not a language that is supposed to be structured and easy to
> understand. "i++" could mean something else in another language, but
> almost all high-level languages (even APL) use some form of "i = i + 1"
> to increment a variable. If I wanted to distinguish between
> incrementing and adding, I would define a procedure such as
> "increment(i)", which can be understood immediately.
----------
I try not to write "i++" as a statement, though of course I use it in
for statements. But I NEVER write "i = i + 1"; I always use the form
"i += 1", which conveys the information that this is an increment
rather than a generic assignment. If the language provides a way for
you to tell the compiler (and the reader) more about what you're doing,
use it. I would also claim that a form specifically indicating that you
are incrementing a value is clearer to the reader, especially the
relatively inexperienced reader. The form "i = i + 1" is a textbook
example of a concept that beginners find difficult.
----------
> Programs should be as language-independent as possible. It shouldn't
> matter whether the program is in C, Pascal, Ada, or even the dreaded
> APL; what matters is that it can be understood by *anyone* who needs
> to look at it. If you limit that *anyone* to "experienced C
> programmers", you're limiting the possibilities of that program.
----------
I submit that the phrase "least common denominator", applied to
anything but its mathematical meaning, is pejorative. Society respects
those who use their tools WELL. That usually means idiomatically. No,
I don't believe in performing professional tasks with undue attention
to the needs of unprofessional successors, except where that is a
specific goal (writing, for instance, code intended to be modified by
end users). It is VITAL to make code clear and understandable, but it
is not necessary to assume that the person reading it doesn't know the
language.

Some people seem to think that idioms exist to obscure meaning.
Nothing could be further from the truth. Idioms provide insight into
meaning, to those who know the language. Idioms arise because they are
natural ways to express something: a person expresses something in a
way that seems natural, and others, seeing that expression and finding
it pleasing, adopt it. Idioms help the reader figure out what is going
on.

-- 
scott preece
gould/csd - urbana
ihnp4!uiucdcs!ccvaxa!preece
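
P.S. To make the "longish expression" point concrete, here is a small
made-up example (the names "table", "hash", and "ref_count" are
invented for illustration, not taken from any real program):

    #include <stdio.h>

    #define TABLE_SIZE 101

    struct entry {
        int ref_count;
    };

    struct entry table[TABLE_SIZE];

    /* Toy hash function, just to give us a longish lvalue below. */
    int hash(const char *name)
    {
        int h = 0;
        while (*name)
            h = (h * 31 + *name++) % TABLE_SIZE;
        return h;
    }

    int main(void)
    {
        const char *name = "foo";

        /* Generic assignment: the whole lvalue is spelled out twice,
         * and the reader must compare the two sides character by
         * character to see that this is a simple increment.
         */
        table[hash(name)].ref_count = table[hash(name)].ref_count + 1;

        /* Idiomatic forms: the lvalue appears once, and the operator
         * itself says "increment".
         */
        table[hash(name)].ref_count += 1;
        table[hash(name)].ref_count++;

        printf("%d\n", table[hash(name)].ref_count);  /* prints 3 */
        return 0;
    }

As a bonus, "+=" and "++" evaluate the lvalue only once, which matters
when computing it is expensive or has side effects; but the argument
above is about the reader, not the compiler.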