Path: utzoo!mnetor!uunet!husc6!cmcl2!nrl-cmf!ames!pasteur!ucbvax!hplabs!hp-pcd!uoregon!markv
From: markv@uoregon.uoregon.edu (Mark VandeWettering)
Newsgroups: comp.lang.misc
Subject: Re: mathematics [was Re : Language illiteracy]
Message-ID: <1958@uoregon.uoregon.edu>
Date: 9 May 88 06:31:28 GMT
References: <786@trwcsed.trwrb.UUCP> <8088@ames.arpa> <765@l.cc.purdue.edu> <11526@ut-sally.UUCP> <5400@megaron.arizona.edu>
Reply-To: markv@drizzle.UUCP (Mark VandeWettering)
Distribution: na
Organization: University of Oregon, Computer Science, Eugene OR
Lines: 71

In article <5400@megaron.arizona.edu> debray@arizona.edu (Saumya Debray) writes:
>In article <1940@uoregon.uoregon.edu>, Mark VandeWettering writes:
>> 	Mathematics suffers from exactly the same problems as
>> 	programming languages: ideas get muddled in notation.

>This is silly!  Mathematical formalisms provide you with tools to define
>and reason about your ideas in a precise and unambiguous manner.  

	Perhaps I was overly terse in my answer (which ought to be
	considered a virtue, but what the heck).  

	I agree, mathematical formalisms (notation) provide you with
	tools to concisely express ideas unambiguously.  Programming
	languages serve precisely the same purpose in the world of
	computer programming.  A program is a description of a task,
	written (hopefully) unambiguously.

	Now, the question is:  Is mathematics a good notation for
	describing problems that are typical in computer science?  

	My answer is: no.  

>If someone can't use these tools effectively, the problem is with him, not
>with mathematics.  

	My point is:  the majority of tasks cannot be expressed within a
	strict mathematical framework.  Try to describe the actions of a
	modern operating system in terms of ANY formalism: a feat which
	I am sure most will agree is beyond doing.  And even then the
	problem remains:  now that I have described this operating
	system, can I actually convert this description into runnable
	object code for some machine?

>Just because I can write unintelligible code in Lisp
>or Prolog doesn't make them poor languages; just because I can flatten
>my thumb with a hammer doesn't make the hammer a bad tool.

	But, a hammer is used for hammering, and not ballet dancing. 

>In article <11526@ut-sally.UUCP>, nather@ut-sally.UUCP (Ed Nather) writes:
>> It's much worse than that.  The basic notation -- and therefore the thought
>> processes it fosters -- describes a system of "eternal truth", usually
>> shown by the equals sign ( = ).  It not only says stuff on each side is
>> equivalent; it implies it always has been, and always will be.  Whatever
>> process change is needed must be artificially imposed from outside.
>
>That depends on the kind of system you're working with.  First order
>predicate logic won't let you reason (directly) about change, but try the
>various temporal, modal and dynamic logics that are around.

	Again, if it can be translated into some sort of executable code
	for a machine, then it probably can be used as a programming
	language.  That doesn't guarantee that it is good at expressing
	tasks in a given problem domain.
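	To make the "translated into executable code" point concrete:
	a small fragment of predicate logic (Horn clauses) maps
	directly onto a runnable program.  The following is a toy
	sketch only; the family facts and the grandparent rule are
	invented for illustration, written here in Python rather than
	an actual logic language.

```python
# Toy sketch: a Horn-clause fragment of predicate logic, run as code.
# Facts are ground atoms; the rule is evaluated by joining over them.
facts = {("parent", "ann", "bob"), ("parent", "bob", "cal")}

def grandparent(facts):
    # rule: grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    return {("grandparent", x, z)
            for (p1, x, y1) in facts if p1 == "parent"
            for (p2, y2, z) in facts if p2 == "parent" and y1 == y2}

print(grandparent(facts))  # {('grandparent', 'ann', 'cal')}
```

	Whether such a notation is *good* at expressing, say, an
	interrupt handler is exactly the question at issue.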

	The level of depth of postings in this area (Saumya Debray's
	excepted) has been very low, suggesting that many posters lack
	significant background in the areas of compilation and
	programming languages.

	I am not arguing against formalism, or formal methods, at all.
	I am arguing that traditional mathematical notations (such as
	predicate logic) are probably inappropriate forms in which to
	express tasks to a computer.

>Saumya Debray		CS Department, University of Arizona, Tucson
>
>     internet:   debray@arizona.edu
>     uucp:       {allegra, cmcl2, ihnp4} !arizona!debray

mark vandewettering