Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.3 4.3bsd-beta 6/6/85; site topaz.ARPA
Path: utzoo!watmath!clyde!cbosgd!cbdkc1!desoto!cord!pierce!topaz!vijay
From: vijay@topaz.ARPA (P. Vijay)
Newsgroups: net.lang.c
Subject: Re: more about programming style
Message-ID: <2689@topaz.ARPA>
Date: Fri, 12-Jul-85 12:34:10 EDT
Article-I.D.: topaz.2689
Posted: Fri Jul 12 12:34:10 1985
Date-Received: Sat, 13-Jul-85 16:43:45 EDT
References: <11457@brl-tgr.ARPA>
Organization: Rutgers Univ., New Brunswick, N.J.
Lines: 25

> [DHowell.ES@Xerox writes about not using idioms...]
>.....
> variable is being incremented when it shouldn't be. However I don't
> know that the variable is being incremented because I see this cryptic
> "stuff++" which I pretty much ignore because the code is full of things
> like that which gets lost in the shuffle. I'm lost, the programmer
> doesn't know what's wrong, and we're stuck.
>
> However if the program said "stuff = stuff + 1" or even
> "increment(stuff)", I could say "Aha! I think I know why it's not

	It all comes down to personal taste. You quote "stuff = stuff + 1"
as a very readable statement. But is it not just another of those
"idioms" that you seem to have a distaste for? While I do think the use
of obscure idioms tends to make program text difficult to understand,
certain common idioms are in fact quite useful in getting the point
across, both to the compiler and to the human reader.

	If you are in the business of fixing code in language X, I am
afraid that sooner or later you are going to have to learn the idiomatic
usages of X (at least the common ones), not only so that you can
understand others' code, but also so that others may understand yours.

							--Vijay--