Path: utzoo!attcan!uunet!lll-winken!lll-lcc!ames!mailrus!tut.cis.ohio-state.edu!bloom-beacon!athena.mit.edu!peter
From: peter@athena.mit.edu (Peter J Desnoyers)
Newsgroups: comp.lang.c
Subject: Re: Self-modifying code
Message-ID: <6232@bloom-beacon.MIT.EDU>
Date: 15 Jul 88 21:03:44 GMT
References: <225800044@uxe.cso.uiuc.edu> <1100@nusdhub.UUCP> <12382@ut-sally.UUCP>
Sender: daemon@bloom-beacon.MIT.EDU
Reply-To: peter@athena.mit.edu (Peter J Desnoyers)
Organization: Massachusetts Institute of Technology
Lines: 24

In article <12382@ut-sally.UUCP> nather@ut-sally.UUCP (Ed Nather) writes:
[talking about properly written self-modifying code]
>
>It would be even better if it could be done in a HLL like, say, C --
>with dangerous and confusing possibilities sharply restricted by the
>language itself so the resulting code can be readily understood.

The example that started this whole conversation was partial
application. That IS "normally" incorporated into some languages,
though usually obscure, esoteric ones - or in Lisp, where it is
partly supported in the form of closures. The example faked partial
application in C, which, like recursion in FORTRAN, is doable (on some
architectures) but ugly.

The necessity of using self-modifying code to implement partial
application does not make it bad programming practice, any more than
the necessity of using machine-language gotos to implement 'if' makes
if-then statements bad programming practice. However, if you can't
isolate the grungy part in the compiler (preferably) or a system call,
then any advantages of this programming paradigm may be lost in the
complexity (and danger) of using it.

				Peter Desnoyers
				peter@athena.mit.edu
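
[A sketch of the idea, not from the original thread: the "grungy" self-
modifying approach builds a new function at run time with one argument
burned in. The same effect can be faked portably by carrying the bound
argument in a struct next to a function pointer - the names bind1/call1
and the add example below are invented for illustration.]

#include <stdio.h>

/* a two-argument function with its first argument already bound */
typedef struct {
	int (*fn)(int, int);	/* the underlying function */
	int bound;		/* argument fixed at bind time */
} partial1;

static int add(int a, int b) { return a + b; }

/* "partially apply" fn to arg - no code is modified or generated */
static partial1 bind1(int (*fn)(int, int), int arg)
{
	partial1 p;
	p.fn = fn;
	p.bound = arg;
	return p;
}

/* supply the remaining argument */
static int call1(partial1 p, int arg)
{
	return p.fn(p.bound, arg);
}

int main(void)
{
	partial1 add5 = bind1(add, 5);	/* behaves like add(5, _) */
	printf("%d\n", call1(add5, 3));	/* prints 8 */
	return 0;
}

[The price of portability is that the result is a struct, not a bare
function pointer you can hand to code expecting int (*)(int) - which
is exactly the gap the self-modifying trick papers over.]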